One of my favorite quotes: “There are two ways of constructing a software design: One way is to make it so simple that there are obviously no deficiencies, and the other way is to make it so complicated that there are no obvious deficiencies.”
I think about this a lot because it’s true of any complex system or argument, not just software.
Reminds me of this Pascal quote: "I would have written a shorter letter, but I did not have the time."
https://www.npr.org/sections/13.7/2014/02/03/270680304/this-...
This is indeed a great quote (one of many gems from Sir Tony) but I think the context that follows it is also an essential insight:
> The first method is far more difficult. It demands the same skill, devotion, insight, and even inspiration as the discovery of the simple physical laws which underlie the complex phenomena of nature. It also requires a willingness to accept objectives which are limited by physical, logical, and technological constraints, and to accept a compromise when conflicting objectives cannot be met. No committee will ever do this until it is too late.
(All from his Turing Award lecture, "The Emperor's Old Clothes": https://www.labouseur.com/projects/codeReckon/papers/The-Emp...)
"No committee will ever do this until it is too late."
The software I like best was not written by "teams"
I prefer small programs written by individuals that generally violate memes like "software is never finished" and "all software has bugs"
(End user perspective, not a developer)
aged very well
Rest in peace; he hasn't seen the industry change.
"A consequence of this principle is that every occurrence of every subscript of every subscripted variable was on every occasion checked at run time against both the upper and the lower declared bounds of the array. Many years later we asked our customers whether they wished us to provide an option to switch off these checks in the interests of efficiency on production runs. Unanimously, they urged us not to; they already knew how frequently subscript errors occur on production runs where failure to detect them could be disastrous. I note with fear and horror that even in 1980 language designers and users have not learned this lesson. In any respectable branch of engineering, failure to observe such elementary precautions would have long been against the law."
-- C.A.R. Hoare, "The Emperor's Old Clothes", the 1980 ACM Turing Award lecture
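The subscript checking Hoare describes is exactly what memory-safe languages do today. A minimal sketch in Rust, whose slices are bounds-checked (the function name `checked_get` is mine, for illustration):

```rust
// Runtime subscript checking in the spirit of the lecture's point:
// every index is checked against the declared bounds of the array.
fn checked_get(xs: &[i32], i: usize) -> Option<i32> {
    // `get` performs the bounds check and reports failure as a value
    // instead of reading past the end of the array.
    xs.get(i).copied()
}

fn main() {
    let data = [10, 20, 30];
    assert_eq!(checked_get(&data, 1), Some(20));
    // An out-of-bounds subscript is detected, not silently read.
    assert_eq!(checked_get(&data, 7), None);
    // Plain indexing (`data[7]`) would also be checked, aborting the
    // run with a panic rather than corrupting memory.
}
```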
As Prof. Dijkstra was preparing for the end of his life, organizing his documents and correspondence with other researchers became an important task.
Cancer had snuck up on Dijkstra and there was not much time.
One senior professor who was helping out with this asked him what was to be done with the correspondence. The professor, quite renowned himself, relates that Dijkstra told him, from his hospital bed, to keep the letters with "Tony" and throw out the rest.
The professor adds, with dry wit, that his own correspondence with Dijkstra was in the discard pile too.
John Backus had some correspondence with Dijkstra that's worth a read: https://medium.com/@acidflask/this-guys-arrogance-takes-your...
There's that immortal Alan Kay line "arrogance in computer science is measured in nano Dijkstras".
That's a famous quote, and age might have mellowed him. But he was not like that at all in person with his students. He did insist that one be precise with one's words.
The origin of the quote may have more to do with cultural differences between the Dutch and Americans.
Alan Kay himself said this quote is taken out-of-context and so people need to stop repeating it - https://news.ycombinator.com/item?id=11799963
Fun story - at Oxford they like to name buildings after important people. Dr Hoare was nominated to have a house named after him. This presented the university with a dilemma of having a literal `Hoare house` (pronounced whore).
I can't remember what Oxford did to resolve this, but I think they settled on `C.A.R. Hoare Residence`.
There's the Tony Hoare Room [1] in the Robert Hooke Building. We held our Reinforcement Learning reading group there.
[1] https://www.cs.ox.ac.uk/people/jennifer.watson/tonyhoare.htm...
Cowards.
I was awarded the C.A.R. Hoare prize at university, which is marginally better than the hoare prize, I suppose.
Shame the university takes itself so seriously. The illustrative example of overloading would have been pertinent to his subject of expertise.
I mean, I like puns but they're a flash in the pan. Jokes get old after a while and you don't want to embed them in something fairly permanent like a building name.
This particular word for the oldest profession goes back to Old English. I am fairly sure it would outlive the building.
"Surely you've all heard of the Hoare house on campus?" seems like a pretty timeless way to a) keep people from dozing off during that bit of lecture b) cause a whole bunch of people to remember who this guy was and what he did.
He was the professor in the Programming Research Group (known universally as the PRG) at Oxford when I was doing my DPhil, and he interviewed me for it. I spent quite a bit of time with him and, of course, spent a lot of time doing stuff with CSP, including my entire DPhil.
Sad to think that the TonyHoare process has reached STOP.
RIP.
Tony's An Axiomatic Basis for Computer Programming[1] is the first academic paper I read as an undergrad that I was actually able to understand. I think it unlocked something in me, because before that I never believed that I would be able to read and understand scientific papers.
That was 35ish years ago. I just pulled up the paper now and I can't read the notation anymore... This might be something that I try applying an AI to. Get it to walk me through a paper paragraph-by-paragraph until I get back up to speed.
[1]:https://dl.acm.org/doi/10.1145/363235.363259
I can recommend NotebookLM [1] for reading through scientific papers. You can then ask it questions and even get a podcast generated.
1. https://notebooklm.google/
Follow-up on the above with these two classics:
Retrospective: An Axiomatic Basis For Computer Programming. This was written 30 years after An Axiomatic Basis for Computer Programming to take stock of what was proven right and what was proven wrong - https://cacm.acm.org/opinion/retrospective-an-axiomatic-basi...
How Did Software Get So Reliable Without Proof? More detailed paper on the above theme (pdf) - https://6826.csail.mit.edu/2020/papers/noproof.pdf
From his Oxford bio: "To assist in efficient look-up of words in a dictionary, he discovered the well-known sorting algorithm Quicksort."
I always liked this presentation. I think it's equally fine to say "invented" something, but I think this fits into his ethos (from what I understand of him.) There are natural phenomena, and it just takes noticing.
I remember attending a tech event at MSR Cambridge, and a speaker made some disparaging comment about older developers not being able to keep up in this modern world of programming.
An older gentleman stood up and politely mentioned they knew a thing or two.
That was Tony Hoare.
I've had the good fortune to attend two of his lectures in person. Each time, he effortlessly derived provably correct code from the conditions of the problem and made it seem all too easy. 10 minutes after leaving the lecture, my thought was "Wait, how did he do it again?".
RIP Sir Tony.
One of the greats. Invented quicksort and concurrent sequential processes. I always looked up to him because he also seemed very humble.
He also invented many other things, such as enumeration types, optional types, and constructors. He popularized the "unions" introduced by McCarthy, which were later implemented in ALGOL 68, from which a crippled form of them was added to the C language.
Several keywords used in many programming languages come from Hoare, who either coined them himself or took them from another source; either way, all later programming language designers took them from Hoare. One example is "case", though here only the keyword comes from Hoare, because a better form of the "case" statement had been proposed many years earlier by McCarthy, under the name "select".
Another example is "class", which Simula 67, and then all object-oriented languages, took from Hoare. In this case, however, the keyword was not used first by Hoare: he took "class", together with "record", from COBOL.
Another keyword popularized by Hoare is "new" (which Hoare took from Wirth, but which everybody else took from Hoare), later used by many languages, including C++. In Hoare's proposal, the counterpart of "new" was "destroy", hence the name "destructor", first used in C++.
The paper "Record Handling", published by C.A.R. Hoare in November 1965, was a major influence on many programming languages. It drove significant changes in the IBM PL/I programming language, including the introduction of pointers. It was also the source of many features of SIMULA 67 and ALGOL 68, from which they spread into many later programming languages.
The programming language Occam was designed mainly as an implementation of the ideas described in Hoare's "Communicating Sequential Processes" paper, published in August 1978. OpenMP also inherits many of those concepts, and some of them are also in CUDA.
And, of course, the Go programming language.
I would not say he invented Go, although Go is probably the only relevant implementation of CSP nowadays.
I was adding Go to the list at the very end of the comment:
>OpenMP also inherits many of those concepts, and some of them are also in CUDA.
Talking about Quicksort, Jon Bentley's deep dive into Quicksort is quite illuminating. https://m.youtube.com/watch?v=QvgYAQzg1z8
oh man, google tech talks. what a throwback.
there was a time, 10-15 years ago, when they were super cool. at some point they """diluted""" the technical content and the caliber of guests, and they faded into irrelevance.
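For readers who want the shape of the algorithm being discussed, here is a minimal quicksort sketch in Rust. Note it uses the simple Lomuto partition for clarity, not Hoare's original two-pointer partition scheme:

```rust
// Quicksort: pick a pivot, partition around it, recurse on the two halves.
fn quicksort(xs: &mut [i32]) {
    if xs.len() <= 1 {
        return;
    }
    // Lomuto partition with the last element as pivot (a simpler variant
    // than Hoare's original in-place two-pointer partition).
    let pivot = xs[xs.len() - 1];
    let mut store = 0;
    for i in 0..xs.len() - 1 {
        if xs[i] < pivot {
            xs.swap(i, store);
            store += 1;
        }
    }
    xs.swap(store, xs.len() - 1);
    // The pivot now sits at `store`; everything left of it is smaller.
    let (lo, hi) = xs.split_at_mut(store);
    quicksort(lo);
    quicksort(&mut hi[1..]);
}

fn main() {
    let mut v = vec![5, 3, 8, 1, 9, 2, 2];
    quicksort(&mut v);
    assert_eq!(v, vec![1, 2, 2, 3, 5, 8, 9]);
}
```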
Yes, but don't forget his formal work also (Hoare logic).
To me, this is his most important contribution; everybody else built on top of it.
Hoare Logic - https://en.wikipedia.org/wiki/Hoare_logic
Rediscovering it through the Dafny programming language. Brings back memories of a 1994 University course.
I think it is even more relevant today.
Hoare logic + Dijkstra's weakest preconditions + Meyer's Design by Contract is what should be used to get LLMs to generate proofs along with code, in a correctness-by-construction approach to implementation.
References:
Correctness-by-Construction (CbC) - https://www.tu-braunschweig.de/en/isf/research/cbc
A Course on Correctness by Construction - https://wp.software.imdea.org/cbc/
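As a toy illustration of the idea, here is a sketch in Rust where the postcondition of a Hoare triple is written as executable assertions, in the Design by Contract style. The `isqrt` function is my own illustrative choice, not taken from the references above:

```rust
// Contract sketch for the Hoare triple
//   { true }  r := isqrt(n)  { r*r <= n < (r+1)*(r+1) }
// with the postcondition checked at run time.
fn isqrt(n: u64) -> u64 {
    // Naive linear search, kept simple so the contract is the focus.
    let mut r: u64 = 0;
    while (r + 1).checked_mul(r + 1).map_or(false, |sq| sq <= n) {
        r += 1;
    }
    // Postcondition: r is the integer square root of n.
    assert!(r * r <= n);
    assert!((r + 1).checked_mul(r + 1).map_or(true, |sq| sq > n));
    r
}

fn main() {
    assert_eq!(isqrt(0), 0);
    assert_eq!(isqrt(15), 3);
    assert_eq!(isqrt(16), 4);
}
```

Tools like Dafny, mentioned above, check such contracts statically rather than at run time.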
And regretful inventor of the null reference!
His “billion dollar mistake”:
https://www.infoq.com/presentations/Null-References-The-Bill...
The mistake was not null references per se. The mistake was having all references be implicitly nullable.
He states around minute 25 that the solution to the problem is to represent null explicitly in the type system, so that nullable pointers are declared as such. But it can be complex to ensure that non-nullable references are always initialized to a non-null value, which is why he chose the easy solution of just letting every reference be nullable.
The null reference was invented by Hoare as a means to implement optional types in a way that works regardless of their binary representation.
Optional types were a very valuable invention and the fact that null values have been handled incorrectly in many programming languages or environments is not Hoare's fault.
Having "Optional types" only makes sense if your type system is powerful enough.
There are two ways this might happen; both will solve the Billion Dollar Problem, but I think one is the clear winner. The first way is explicit optionality, often retrofitted to languages. For example, in C# the difference between the types Goose and Goose? is that (in a suitable C# project enabling this rule) the first one is always a Goose, while the second might be null instead.
The second way is sum types: if you have them, you can just add "or it's null" to the type.
I think sum types are better because they pass my "three purposes" rule: I can think of not one (Option<T> replaces optionality) or two (Result<T,E> for error handling) but at least three (ControlFlow<B, C> reifies control flow) distinct problems that I no longer need separate solutions for once I have this feature.
If your type system is too weak, you suffer the Billion Dollar Problem with Hoare's idea, and perhaps if this "feature" had never been invented we'd all have migrated to languages with better type systems decades ago.
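A small Rust sketch of the "three purposes" point, using illustrative functions of my own (`parse_age`, `safe_div`, `first_adult`): the same sum-type machinery expresses optionality, error handling, and reified control flow, with no implicit null anywhere:

```rust
use std::ops::ControlFlow;

// Optionality: absence is explicit in the type; the caller must handle None.
fn parse_age(s: &str) -> Option<u8> {
    s.trim().parse().ok()
}

// Error handling: failure carries a description instead of a null or a crash.
fn safe_div(a: i32, b: i32) -> Result<i32, String> {
    if b == 0 {
        Err("division by zero".into())
    } else {
        Ok(a / b)
    }
}

// Reified control flow: early exit expressed as a value.
fn first_adult(ages: &[u8]) -> ControlFlow<u8> {
    for &a in ages {
        if a >= 18 {
            return ControlFlow::Break(a);
        }
    }
    ControlFlow::Continue(())
}

fn main() {
    assert_eq!(parse_age("42"), Some(42));
    assert_eq!(parse_age("forty-two"), None);
    assert_eq!(safe_div(10, 2), Ok(5));
    assert!(safe_div(1, 0).is_err());
    assert_eq!(first_adult(&[12, 19, 30]), ControlFlow::Break(19));
}
```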
I'm pretty sure that this is not true. I talked to Bud Lawson (the inventor of the pointer) and he claimed that they had implemented special behaviour for null pointers earlier. When I talked to Tony later about it, he said he had never heard of Bud Lawson. So probably both invented them independently, but Bud came first.
If we start playing the "who was first" game, then consider the Soviet machine Kiev (Kyiv), for which an "address language" with a "prime operation" was created in 1955-59.
The prime operation and address mapping.
The prime operation defines a certain single‑argument function. Its symbol (a prime mark) is written above and to the left of the argument: 'a = b where a is the argument and b is the result of the operation. This is read as: "prime a equals b" (or "b is the contents of a"). The argument a is called an address, and the function value b is called the contents of the address. The prime function ' defines a mapping from the set of addresses A to the set of contents B, which we will call an address mapping.
Page 36, chapter III https://torba.infoua.net/files/kateryna-yushchenko/Vychislit...
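In modern terms, the prime operation described above is simply a lookup in a mapping from addresses to contents; a toy model in Rust (the names are mine, for illustration):

```rust
use std::collections::HashMap;

// 'a = b : b is the contents of address a (None if a is unmapped).
// The map itself plays the role of the "address mapping" from A to B.
fn prime(memory: &HashMap<u32, i64>, a: u32) -> Option<i64> {
    memory.get(&a).copied()
}

fn main() {
    let mut memory = HashMap::new();
    memory.insert(100, 7); // address 100 holds the value 7
    assert_eq!(prime(&memory, 100), Some(7));
    assert_eq!(prime(&memory, 200), None);
}
```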
They were never concurrent, they were communicating. https://en.wikipedia.org/wiki/Communicating_sequential_proce...
That is indeed the correct title, but the processes were concurrent.
However, they were not just concurrent, but also communicating.
He came to give a lecture at UT Austin, where I did my undergrad. I had a chance to ask him a question: "what's the story behind inventing QuickSort?". He said something simple, like "first I thought of MergeSort, and then I thought of QuickSort" - as if it were just natural thought. He came across as a kind and humble person. Glad to have met one of the greats of the field!
Happy to meet you. I was there and I remember that question being asked. I think it was 2010.
If I remember correctly he had two immediate ideas, his first was bubble sort, the second turned out to be quicksort.
He was already very frail by then. Yet clarity of mind was undiminished. What came across in that talk, in addition to his technical material, was his humor and warmth.
He discusses this and his sixpence wager here: https://youtu.be/pJgKYn0lcno
(Source: TFA)
I remember this vividly! I believe he said that he thought of _Bubble Sort_ first, but that it was too slow, so he came up with QuickSort next
This is devastating news.
When I started university he gave a talk to all the new CompScis which as you can imagine was incredibly inspirational for an aspiring Software Engineer.
Grateful to have had that experience.
RIP
With respect, I say that one can only feel gobsmacked at how much complexity has grown.
In the 60s inventing one single algorithm with 10 lines of code was a thing.
If you did that today nobody would bat an eye.
Today people write game engines, compilers, languages, whole OS and nobody bats an eye cause there are thousands of those.
Quick sort isn't even a thing for leet code interviews anymore because it's not hard enough.
Random anecdote, and Mr. Hoare (yep, not a Dr.) has always been one of my computing heroes.
Mr. Hoare did a talk back during my undergrad, and for some reason, despite being totally checked out of school, I attended; it is one of my formative experiences. AFAICR it was about proving program correctness.
After it finished, during the Q&A segment, one student asked him his opinion of Brooks's famous essay No Silver Bullet, and Mr. Hoare's answer was... total confusion. Apparently he had not heard of the concept at all! It could be a lost-in-translation thing, but I don't think so, since I remember understanding the phrase "silver bullet", which did not make any sense to me. And now Mr. Hoare and Dr. Brooks are two of my all-time computing heroes.
"Sir", not "Mr." if you're going to be pedantic about titles ;)
Edit: Oh and he has multiple honorary doctorates (at least 6!), so would be just as much "Dr." too!
It is not usual to call people with an honorary doctorate "Doctor" except in the context of the awarding institution. Most likely the awarding institutions will have actually specified that the recipient should not give anybody the false impression and I can't imagine Tony is the type to do otherwise.
His title at Oxford was 'Professor', and he was addressed as 'Tony'.
He made incoming DPhil (PhD) students a cup of tea individually in his office at the Computing Laboratory. It was a small group, but still I appreciated this personal touch.
Tony Hoare on how he came up with Quicksort:
he read the algol 60 report (Naur, McCarthy, Perlis, …)
and that described "recursion"
=> aaah!
https://www.youtube.com/watch?v=pJgKYn0lcno
Tony advised me to make money with the software model checker I have been writing, in contrast to the typical practice of making these tools open source and free to use. I would have loved to learn more from him. He was a great teacher, but also a great and sharp listener. I still remember the detour we made on the way to a bar in London, talking long and deep about refinement relations. RIP.
I wrote both my master's thesis and my PhD on Hoare's Communicating Sequential Processes. I really enjoyed its simplicity and extensibility, and was always amazed that it inspired and influenced language constructs in Go, Erlang, occam, and the like.
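For readers unfamiliar with CSP, the core idea it lent to Go, Erlang, and occam is sequential processes that share no state and interact only by passing messages. A minimal sketch in Rust using a channel (`pipeline_sum` is an illustrative name of mine):

```rust
use std::sync::mpsc;
use std::thread;

// Two sequential processes that share nothing: a producer sends the
// numbers 1..=upto over a channel, and the consumer sums what it receives.
fn pipeline_sum(upto: i32) -> i32 {
    let (tx, rx) = mpsc::channel();
    let producer = thread::spawn(move || {
        for n in 1..=upto {
            tx.send(n).expect("receiver alive");
        }
        // Dropping tx closes the channel, ending the consumer's loop.
    });
    let sum = rx.iter().sum();
    producer.join().unwrap();
    sum
}

fn main() {
    assert_eq!(pipeline_sum(3), 6); // 1 + 2 + 3
}
```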
RIP.
His presentation on his billion dollar mistake is something I still regularly share as a fervent believer that using null is an anti-pattern in _most_ cases. https://www.infoq.com/presentations/Null-References-The-Bill...
That said, his contributions greatly outweigh this 'mistake'.
Anti-patterns are great; they act as escape hatches or pressure-release valves. Every piece of mechanical equipment has some analogue, for good reason.
Without things like null pointers, goto, globals, and unsafe modes in modern safe(r) languages, you can back yourself into a corner by over-designing everything, often leading to complex, unmaintainable code.
With judicious use of these anti-patterns you get mostly good/clean design, with one or two well-documented exceptions.
I watched this video a few months ago.
Virtual HLF 2020 – Scientific Dialogue: Sir C. Antony R. Hoare/Leslie Lamport
https://www.youtube.com/watch?v=wQbFkAkThGk
One of the greatest figures in the history of computing, and an example of humility as a human.
Thank you for your work on ALGOL; you were multiple decades ahead of your time.
Rest in peace.
Absolutely the GOAT of concurrency. May his ring never die.
Assert early, assert often!
Rest in peace
1) ACM published this book in 2021: Theories of Programming: The Life and Works of Tony Hoare - https://dl.acm.org/doi/book/10.1145/3477355
See the "preface" for details of the book - https://dl.acm.org/doi/10.1145/3477355.3477356
Review of the above book - https://www.researchgate.net/publication/365933441_Review_on...
Somebody needs to contact ACM and have them make the above book freely available now; there can be no better epitaph.
2) Tony Hoare's lecture in honour of Edsger Dijkstra (2010); What can we learn from Edsger W. Dijkstra? - https://www.cs.utexas.edu/~EWD/DijkstraMemorialLectures/Tony...
Somebody needs to now write a similar one for Hoare.
Truly one of the absolute greats in the history of Computer Science.
RIP Legend
Rest in peace.
RIP Sir Tony Hoare
Turing Award Legend.
Needs a black bar!
Seconded.
this is black bar grade great. give us black bar
How many jobs were won or lost based on a candidate's ability to implement his algorithms?
As a junior dev, I loved to ask interview candidates to implement merge sort or quick sort on whiteboards.
As a non-junior dev I realize how stupid that was.
I think the first enlightenment is that software engineers should be able to abstract these algorithms away into reliable libraries.
The second enlightenment is that if you don't understand what the libraries are doing, you will probably ship things that assemble the libraries in unreasonably slow/expensive ways, lacking the intuition for how "hard" the overall operation should be.
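A small Rust sketch of that second point: both functions below "use the library", but one assembles it into a quadratic scan while the other is roughly linear (the function names are mine, for illustration):

```rust
use std::collections::HashSet;

// Counts how many elements of `a` also appear in `b`.
// `contains` on a slice is a linear scan, so calling it in a loop is O(n*m).
fn common_naive(a: &[i32], b: &[i32]) -> usize {
    a.iter().filter(|&&x| b.contains(&x)).count()
}

// Same result, but building a hash set first makes the whole thing ~O(n + m).
fn common_fast(a: &[i32], b: &[i32]) -> usize {
    let set: HashSet<i32> = b.iter().copied().collect();
    a.iter().filter(|&&x| set.contains(&x)).count()
}

fn main() {
    let (a, b) = (vec![1, 2, 3, 4], vec![3, 4, 5]);
    assert_eq!(common_naive(&a, &b), 2);
    assert_eq!(common_fast(&a, &b), 2);
}
```

Both are "correct", which is exactly why the intuition for how hard the operation should be still matters.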