Well, proposed proofs can be wrong. Maybe that's your point? Also, proving that an algorithm meets a specification tells you nothing about whether the specification was meaningful, nor does it tell you anything about whether a particular translation of algorithm into code is correct. So I'm not advocating "proofs only, no testing."
[snip] Indeed. One of my favorite old-time war stories is one I call "how I learned the difference between MS-DOS and a real operating...
I would still claim that proofs can significantly increase one's overall confidence in the correctness of a program, especially in cases where the algorithm is at all tricky. I don't know whether Euclid's algorithm counts as tricky; I've spent spare minutes in the past couple of days Googling for various presentations of it, and it's starting to seem closer to intuitive than it did.
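For concreteness, here's a minimal sketch (mine, not from the original discussion) of Euclid's algorithm in Python, with the key fact behind its correctness argument noted as a comment:

```python
def gcd(a, b):
    """Euclid's algorithm for nonnegative integers a, b (not both zero)."""
    # Correctness hinges on the identity gcd(a, b) == gcd(b, a mod b),
    # so each iteration preserves the gcd of the original inputs
    # while strictly decreasing b -- hence the loop terminates
    # with gcd(a, 0) == a.
    while b != 0:
        a, b = b, a % b
    return a
```

Whether that counts as "tricky" is exactly the question: the code is three lines, but the invariant-plus-termination argument is where the real content lives.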
I would also claim that proofs are particularly helpful in situations where it is difficult to perform repeatable tests -- e.g., if the code involves multiple threads or processes.
I had a moderately compelling brush with this in a course on concurrent algorithms I took in grad school: The initial homeworks consisted of having us implement algorithms that the instructor had proved correct in class. That all went smoothly. For the last homework, we had to come up with the algorithm, prove it correct, and then write code to implement it. I was completely sure that my algorithm was correct, but despite my best efforts, I couldn't make the proof work. Finally I decided to just write the code anyway, thinking "well, I'm sure it works, even if I can't prove it." Do I need to fill in the rest of the story .... Right. The program was pretty much a total failure.
Okay, the plural of anecdote is not data, but I found it convincing.
The other way in which I think the "proofs of correctness" stuff helps is in encouraging a way of thinking about programming that is .... I'm not sure how to explain this, but in the course of being taught about proofs of correctness I found that I was starting to think about programming in a different way, which I would describe as "static versus dynamic." By "dynamic" here, I mean a focus on tracing various execution paths through the code. By "static" I mean thinking about "okay, at this point in the code, what do I know to be true?" My experience has been that I write better code using this approach I'm calling "static". Think assertions, loop invariants, that sort of thing.
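To illustrate the "static" style (my own toy example, not anything from the course), here's a trivial summing loop in Python where the invariant is written down as an executable assertion rather than left implicit:

```python
def sum_list(xs):
    """Sum a list, checking the loop invariant at each step."""
    total = 0
    for i, x in enumerate(xs):
        # Static view: regardless of how we got here, at this point
        # total equals the sum of the first i elements.
        assert total == sum(xs[:i])
        total += x
    # Postcondition: total is the sum of the whole list.
    assert total == sum(xs)
    return total
```

The assertions are redundant for code this simple, of course; the point is the habit of asking "what is true here?" instead of mentally single-stepping through particular runs.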
I could probably go on about this at more length, but not unless someone else is interested in continuing the discussion.
Versions of DEC BASIC that I've seen (BASIC 11, Vax BASIC, et al) had a different underlying philosophy than most microcomputer BASICs from the late...
-- B. L. Massingill ObDisclaimer: I don't speak for my employers; they return the favor.