Some Comments from a Numerical Analyst
This is an episode of Thoughts on Functional Programming, a podcast by Eric Normand.
In this episode, I read and comment on an excerpt from the 1970 Turing Award Lecture by James Wilkinson.
"I am convinced that mathematical computation has a great part to play in the future, and that its contribution will fully live up to the expectations of the great pioneers of the computer revolution."
Hi, my name is Eric Normand. You just heard a quote from Jim Wilkinson's 1970 Turing Award lecture. His full name is James Hardy Wilkinson. He was born in 1919 in England. His biography on the Turing Award page is worth reading; they often have a nice biography.
This one happens to be really good because I think they recognized that a lot of people who would be interested in listening to the Turing Award would not be that familiar with the numerical analysis, the linear algebra, the matrix stuff that he worked on.
It has a good amount of information. I'm going to read a lot from it today. It really shows the importance of this numerical analysis that we take for granted today because I think they did such a good job, just like you can use a computer without worrying about the hardware failing on you, the hardware engineers have done such a good job. You just don't think about all the work they've done.
Likewise, I think with floating-point numbers and mathematical applications, we don't think about the error that is introduced. Let's read from the bio before we get into his Turing Award lecture. James, known as Jim Hardy Wilkinson was a British mathematician who became the leading expert in a new and important field that emerged after World War II.
It goes under two names, matrix computations, and numerical linear algebra. He worked with Alan Turing at the National Physical Laboratory. Now I'm going to read, the commentator in this biography is taking a lot of liberties to explain the importance of his work.
"People are awed at the prodigious speeds at which computers execute primitive arithmetic operations, such as addition and multiplication. Yet this speed is achieved at a price: almost every answer is wrong. It is not due to difficulties in approximation; it is due to an iron law: all numbers used in a computer shall have a fixed number of digits."
Normally in math, before you had numerical computing machines, you would solve problems exactly. You could say, "Oh, square root of two," and the square root of two, you would just leave it symbolically as the square root of two even though it's an irrational number. You'd never be able to write all the digits.
Now that forms an approximation when you have to find a fixed number of digits. He's saying it's not the approximation, it's that when you multiply two numbers...We'll get to that. This is what the error comes from.
"The numbers we are talking about are usually floating point numbers that have two parts, a fraction and an exponent. It is the fractional parts with which we are concerned here. Consider them as restricted to 16 digits. The product of two 16-digit fractions needs 32 digits to represent it, and the computer always throws away the last 16 of them. The relative error is minuscule.
"For calculations involving physical quantities, such as pressure or velocity, the given precision is usually adequate, but in an application such as astronomy, it may not be." Consider that you have to get the angle perfect to adjust your telescope to aim at a particular star. You do all these calculations, and if you're even the tiniest angle off, you will miss the star. That's what they're talking about.
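To make this concrete, here is a minimal sketch in Python. Doubles carry 53-bit binary fractions rather than 16 decimal digits, but the mechanism is the same one the bio describes: the exact product of two stored fractions needs twice the digits, the extras are rounded away, and the relative error left behind is minuscule.

```python
from fractions import Fraction

# a and b are already rounded to 53-bit fractions when stored.
a = 1.0 / 3.0
b = 1.0 / 7.0
product = a * b  # the hardware rounds the 106-bit exact product back to 53 bits

# Compare against the exact product of the two stored inputs.
exact = Fraction(a) * Fraction(b)
rel_error = abs(Fraction(product) - exact) / exact
print(float(rel_error))  # minuscule: no worse than about 1.1e-16 (half an ulp)
```

The bound printed here is the binary analogue of "throwing away the last 16 digits": round-to-nearest loses at most half a unit in the last place of the 53-bit fraction.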
"Wilkinson pioneered successful error analyses of all the matrix algorithms of his day." More on that later. He will get back to that. "A host of diverse scientific and engineering problems boil down to solving a few standard matrix problems. Direct methods for solving A * X = B had been refined for humans with hand-held computing devices since the 1930s.
"It did not seem too much of a challenge to automate these methods. The only cloud on the horizon was the threat of all those round off errors. Might they undermine the hopes inspired by the new devices? Mathematicians of the greatest power, for example John von Neumann and Alan Turing, thought hard about these problems." He talks about...I'll read it.
"Remember that no human being had ever solved 50 equations and 50 unknowns by a direct method." We'd never had to do that before. Now that a computer has the power, the patience, [laughs] the time, the speed to solve such large systems, we don't know what's going to happen.
"Wilkinson had one great advantage over von Neumann and Turing. He had been obliged to solve 18 equations and 18 unknowns with a hand cranked mechanical calculator during World War II. He had seen how amazingly accurate the direct method was. That arduous exercise helped Wilkinson think in a different way from these other experts in accounting for those troublesome round off errors.
"Stop obsessing over how large the error in the final output might be, and ask instead, how little could the data be changed so that the output is exactly correct. This is the fruitful question." This might take a little bit of explaining, but I'd have to read a little bit more for context. Then we can talk about it.
"Wilkinson won the Turing prize for showing that the fears of the experts were unfounded, for understanding precisely the role of round off error in matrix computations and for showing a way to make it all look rather easy. His tool was the so-called backward error analysis. Instead of concentrating on the size of the error, find the problem which the computed solution solved exactly."
He is looking for how far from the real problem are we? Imagine you had a matrix full of approximations. You already have an error bound on these numbers. We're saying it's 10.25763, but plus or minus one percent.
You do this calculation, and you come up with a number that has more error, because you've now done all these rounding errors, because you didn't have enough digits to store all the digits you needed. There's some error in that. You take that answer with the error. You say, "Well, what matrix multiplication would have given me that answer."
A different question, [laughs] "This different matrix that resulted in this answer, how different is that matrix from the one I started with?" If it's within the approximation error bounds, it's probably fine. He's going to formalize that like this. "For the A * X = B problem, let the computer output be Z." You're solving for X, and Z is the answer the computer actually gave you.
"Look for a small matrix E and a small vector e, such that (A + E) * Z = B + e." Instead of X, now you're using Z, the actual answer you got. Instead of A, you're using A + E, A plus a difference, and instead of B, B + e, B plus a difference.
"If the bounds on E and e are tiny, should we not be satisfied? After all, the entries in A and B may well be uncertain. If the errors are still large, then the problem itself, that is, the pair (A, B), must be close to being ill posed." This was written by Beresford Neill Parlett. I suggest you read it; it's interesting how he breaks it down. There's a lot more I didn't read.
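The backward question can be sketched in a few lines of Python. This is an illustrative 2-by-2 system I made up, not anything from the lecture; it takes the simplest choice E = 0 and asks how little B would have to change for the computed Z to be exactly right.

```python
# Illustrative 2x2 system (made up, not from the lecture).
A = [[4.0, 1.0], [1.0, 3.0]]
B = [1.0, 2.0]

# Solve A * X = B by Cramer's rule; Z is the computed output.
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
Z = [(B[0] * A[1][1] - B[1] * A[0][1]) / det,
     (A[0][0] * B[1] - A[1][0] * B[0]) / det]

# Backward error with E = 0: pick e = A*Z - B, so that A*Z = B + e exactly.
e = [A[i][0] * Z[0] + A[i][1] * Z[1] - B[i] for i in range(2)]
print(e)  # entries at roundoff level: the computed Z exactly solves a nearby problem
```

The point of the exercise is the size of e: if it is down at roundoff level, smaller than the uncertainty already in the data, the computed answer is as good as the problem deserves.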
I read about 10 percent of it, maybe less, out loud here as an excerpt. If you haven't heard one of my episodes where I read these Turing Award lectures, I'll give you a little intro. I'm reading lectures from the Turing Award, I'm probably going to do them in order. I often read some stuff from the bio, that will give context later.
If you go to the ACM page, for each Turing Award laureate, they have a little bio, talks about biographical details, where they were born, where they got their education, and then sometimes a little bit more about their history, where they got their chops, and what problem they were solving.
Then I read an excerpt from the Turing lecture itself. This one is from 1970; it seems the lectures were not recorded and released publicly on video back then. Eventually, you can start watching the lectures on YouTube. These are all edited and published in the "Journal of the ACM," usually the next year. This one was given in 1970 but published in 1971.
This one was very troubling for me. It's different from the other lectures: it's much less technical, and it's also a field that I'm not that familiar with. It's very humorous, and the style is very flowing. The biography talks about him being a good conversational-style orator. I'm used to finding a sentence or two that sums up the point that they're trying to make.
This one was hard to do that, because I kept finding these points that were spread out over sentences, and then he never hits it on the head. It's a very good read. Another cool thing about it is, he actually worked with Alan Turing for a number of years, worked closely with him. There's a lot of stories in here about Alan Turing, which I found cool.
I also tried to analyze these for historical context, just to get an idea of what the field was like. Of course, you get an idea of what the field was like in 1970 because this was given in 1970. Anything he says in the present tense, you can get some context from. He also talks about what it was like, back in the '30s, and '40s, which I find pretty cool as well.
He was born in 1919, very early in the history of computing. These early Turing Award lectures were by people who were born in the teens or the '20s. That's pretty cool that these were captured. It's very important to understand the history of our field. It's not that much history. It's such a young field.
Let me start reading. The name of it is "Some Comments from a Numerical Analyst," by J.H. Wilkinson. "From 1946 to 1948, I had the privilege of working with the great man himself, that's Alan Turing, at the National Physical Laboratory. I use the term great man advisedly, because he was indeed a remarkable genius."
That's nice, from 1946 to 1948, a few years there. He calls him a genius. Now, I also have to say that at this time in 1970, the work that Turing did at Bletchley Park was still classified. People knew he had done a lot of stuff in the war but he couldn't publish about it, and he couldn't talk about it. His accomplishments with the Enigma machine and so on were still unknown, even by his close associates.
"I was of military age when the war broke out. And being by nature a patriotic man, I felt that I could serve my country more effectively, and incidentally a lot more comfortably, working in the government scientific service than serving as an incompetent infantryman.
"The British government took a surprisingly enlightened attitude on the subject. And almost from the start of the war, those with scientific qualifications, were encouraged to take this course."
Some historical context: I've heard from many sources that the powers that be at the time, the governments, were very scientific, meaning that they recognized that the war would be won largely by whoever had better technology. This was coming soon after World War I, where technology also played a decisive factor, but there the lesson was learned slowly.
By World War II, they recognized that you've got to get started as early as possible. If someone with a degree in science doesn't want to hold a gun, we'll just put them in a lab and have them run numbers and invent things.
"I therefore spent the war in the Armament Research Department working mainly on such fascinating topics as external ballistics, fragmentation of bombs and shells, and the thermodynamics of explosives. My task was to solve problems of a mathematical nature arising in these fields using computational methods if necessary."
He's solving these kind of physics problems that are very mathematical, a lot of calculation, maybe they have to use some computation. "Gradually, I became interested in the numerical solution of physical problems. In 1946, I joined the newly formed mathematics division at the National Physical Laboratory. It was there that I first met Alan Turing."
This is after the end of the war. He was not able to be released immediately, so he joined this mathematics division at the National Physical Laboratory. "It was there that I first met Alan Turing. I was to spend half my time in the computing section," that is, in another section without Alan Turing, and the other half with Alan Turing.
"For several months Alan and I worked together in a remarkably small room at the top of an old house, which had been taken over temporarily by NPL to house mathematics division." He's working closely with Turing in this small room.
"My task was to assist Turing in the logical design of the computer ACE, A-C-E, which was to be built at NPL," that's National Physical Laboratory, "and to consider the problems of programming some of the more basic algorithms of numerical analysis."
I just imagine this: what if I was in a room with Alan Turing, given this assignment of designing a computer with him? Of course, there hadn't been many computers designed at that point, so a lot of this is unknown territory. You're making up a lot, learning all you can from the other computers, and slowly making something come together.
The opportunity sounds amazing, but I would have a lot of anxiety about it. "My work in the computing section was intended to broaden my knowledge of that subject." He was working in both Turing's lab and in this computing section that was helping him understand the field.
"Working with Turing was tremendously stimulating, perhaps at times to the point of exhaustion. It was impossible to work half-time for a man like Turing, and almost from the start, the periods spent with the computing section were rather brief." I can imagine.
"Turing had a strong predilection for working things out from first principles, usually, in the first instance, without consulting any previous work on the subject. No doubt it was this habit which gave his work that characteristically original flavor."
It's interesting. I did not know that. It makes sense when I read it. It's fascinating to hear this firsthand, someone observing Turing firsthand and telling it to us, because by this time he was, sadly, already dead. He didn't leave much of this legacy, this kind of how-do-you-work legacy, behind. It's nice to see it documented somewhere.
As you know, his life was rather tragically ended too early. "Turing carried this to extreme lengths. I must confess that at first I found it rather irritating." This is referring to how he would work from first principles. "He would send me a piece of work. When I had completed it, he would not deign to look at my solution, but would embark on the problem himself.
"Only after having a preliminary trial on his own was he prepared to read my work. I soon came to see the advantage of his approach. In the first place, he was not as quick at grasping other people's ideas as he was at formulating his own. What is more important, he would frequently come up with some original approach that had escaped me and might have eluded him had he read my account immediately."
It's fun to hear how great minds work. It's actually pretty cool to see something like this very practical. It makes me think I should try that next time I ask someone to solve a problem for me. Then, when they're done, I'll solve it myself without looking at their solution. Then, once I'm done, I can look at their solution, and we can compare.
"When he finally got round to reading my own work, he was generally very appreciative. He was particularly fond of little programming tricks. Some would say that he was too fond of them to be a good programmer." That's interesting. He liked little clever things.
Another Turing Award lecture that talked about that is Maurice Wilkes's. He talked about Turing being too enamored with these little clever tricks. Here's another thing. You should read this yourself. Let me say that. I can't possibly read it all, even though I'm going to read quite a bit of it, [laughs] looking at my highlights, because it is discursive.
You should read it, get the flavor for it. You'll probably get more humor out if you read it yourself. It's very nice humor.
"Had it not been for Turing, I would probably have become just a pure mathematician." Of course, he means pure mathematician pejoratively. "Turing's reputation is now so well established that it scarcely stands in need of a boost from me."
This is one of those little clues that if you're reading something from a historical perspective, like what can I learn about the history, we can see that in 1970, Turing's reputation was already well established. I appreciate that. I didn't know that. We talk about him a lot today. When I was in university, he was mentioned quite a lot.
You never know how much in the past they appreciated his work, especially since the stuff didn't come out until later in the '70s about his work there in Bletchley Park.
"I feel bound to say that his published work fails to give an adequate impression of his remarkable versatility as a mathematician. His knowledge ranged widely over the whole field of pure and applied mathematics and seemed not merely something he had learned from books, but to form an integral part of the man himself.
"In spite of this, he had only 20 published papers, to his credit, written over a period of some 20 years. Remarkable as some of these papers are, this work represents a mere fraction of what he might have done if things had turned out a little differently."
He didn't have that much output. He didn't publish that much. He had a big effect with the papers he did publish, and probably on the people that he worked with, but that's a shame. When I first read it, this phrase, "If things had turned out a little differently," I read that as if he hadn't been convicted of homosexuality, chemically castrated, and died tragically from poisoning at an early age.
That's what I read that as. Maybe in 1970, they were still afraid to talk about that, but that's not what he means. He's talking about his whole life. That's just an ambiguous phrase about three things that he's about to talk about. If you felt that too, wait a second before judging this.
"In the first place, the six years starting from 1939, which he spent at the Foreign Office. He was 27 in 1939, so that in different circumstances, this period might well have been the most productive of his life." This is during the war. This is the classified work that he did, that he could not publish on and even talk about. It wasn't declassified until the mid-1970s.
A few years ago, they published some of his memos that he had circulated at the office, which were still classified, hadn't been unearthed, or whatever. This stuff is still coming out. And it's a shame: because it was all classified and no one knew what had happened there, they let the offices at Bletchley Park decay.
In the '70s, the buildings were beyond repair by the time people said, "Well, these buildings are important. We should keep them." This is the thing that he's talking about: in different circumstances, if things had turned out differently, this period would have been the most productive of his life.
"He seemed not to have regretted the years he spent there, and indeed, we formed the impression that this was one of the happiest times of his life. Turing simply loved problems and puzzles of all kinds, and the problems he encountered there must have given him a good deal of fun.
"Certainly, it was there that he gained his knowledge of electronics, and this was probably the decisive factor in his deciding to go to NPL to design an electronic computer rather than returning to Cambridge."
People were just gleaning information: "It must have been where he got his knowledge of electronics and the idea of making a computer," but they didn't know what he was doing there. "A second factor limiting his output was a marked disinclination to put pen to paper." He just didn't like to write. 20 papers, 20 years, about a paper per year.
"While I was preparing this talk, an early mathematics division report was unearthed. It was written by Turing in 1946. Its main purpose was to convince the committee of the feasibility and importance of building an electronic computer." This is probably in 1970: they found this report from 1946 about building a computer. This is from the time when Wilkinson was working with him.
"It is full of characteristic touches of humor and rereading it for the first time for perhaps 24 years, I was struck once again by his remarkable originality and versatility. As early as 1946, Turing had considered the possibility of working with both interval and significant digit arithmetic, and the report recalled forgotten conversations, not to mention heated arguments, which we had on this topic."
I think that's so cool that he got to relive this experience a little bit when he found this paper. He was saying that Turing was ahead, that he was very early in this work.
"Turing's international reputation rests mainly on his work on computable numbers, but I like to recall that he was a considerable numerical analyst, and a good part of his time from 1946 onwards was spent working in this field. While at NPL, he wrote a remarkable paper on the error analysis of matrix computation."
Now we're in a section, he's ending this Turing section. It's called, "The Present State of Numerical Analysis."
"Numerical analysis is unique among the various topics which comprise the rather ill-defined discipline of computer science. I would be very sorry to see numerical analysis sever all its connections with computer science. Numerical analysis is clearly different from the other topics in having had a long and distinguished history. Only the name is new, it appears not to have been used before the '50s."
This is interesting, because even in the '70s, just by the way he's talking about it, it seems like there wasn't an appreciation of numerical analysis, of analyzing the errors of computations and how to deal with them. It doesn't seem like people were that interested in it at the time even though it was foundational. He does want it to be part of computer science. He doesn't think that it belongs, say, in math.
"Some like to trace its history back to the Babylonians. Certainly many of the giants of the mathematical world, including both the great Newton and Gauss themselves, devoted a substantial part of their research to computational problems.
"Many of the leaders of the computer revolution thought in terms of developing a tool, which was specifically intended for the solution of problems arising in physics and engineering. This was certainly true of the two men of genius, von Neumann and Turing.
"Turing regarded such applications as the main justification for embarking on what was even then a comparatively expensive undertaking. A high percentage of the leading lights of the newly formed computer societies were primarily numerical analysts, and the editorial boards of the new journals were drawn largely from their ranks."
Computing had changed quite a bit from the '40s up to 1970. Apparently, numerical analysis was one of the main topics back then. I have to imagine that they did such a good job that we just take it for granted. They've slowly improved it, and it's solved well enough.
I find that interesting. Maybe I'm jumping the gun a little bit, matrix multiplication is really important now in neural networks. It could be that these things are getting more attention now. I'm not up on the research there.
"The use of electronic computers brought with it a new crop of problems, all perhaps loosely associated with programming. Quite soon, a whole field of new endeavors grew up around the computer." New endeavors meaning endeavors besides these physics and engineering problems that you're trying to solve with a computer.
He's talking about programming, programming languages, parsing. "Many people who set out originally to solve some problem in mathematical physics found themselves temporarily deflected by the problems of computerology." That's what he calls everything besides numerical analysis.
"...deflected by the problems of computerology. We are still waiting with bated breath for the epoch-making contributions they will surely make when they return to the fold, clothed in their superior wisdom."
He's obviously being facetious here, but he's saying that a lot of people were deflected. They were distracted by the new problems that arose from the computer, programming and things like that. They weren't solving their physics problems anymore. They're probably not coming back.
"In contrast to numerical analysis, the problems of computerology are entirely new." He did say that you could trace things back to the Babylonians in numerical analysis.
He's saying it's a stretch to do that, but you could if you tried. They had systematic computation back then. Then, once the computer was invented, there's this whole new field that's not just, how do we apply this to big number crunching problems?
This restless activity and excitement over every new thing also reminded me of a quote by Alan Kay, who says that programming is a pop culture. We don't know our history. We're not building a more refined culture. It's pop culture. It's all about what's being talked about these days.
"Computerology has a vital part to play in ensuring that computers are fully exploited. I'm sure that it is good for numerical analysts to be associated with a group of people who are so much alive and full of enthusiasm. I'm equally sure that there is merit in computer science embracing a subject like numerical analysis, which has a solid background of past achievement.
"Inevitably though, numerical analysis has begun to look a little square in the computer science setting. Numerical analysts are beginning to show signs of losing faith in themselves. Their sense of isolation is accentuated by the present trend toward abstraction in mathematics departments, which makes for an uneasy relationship."
He's hedging a bit. He's saying, excitement is good, enthusiasm is good, but compared to that stuff, numerical analysis looks slow and unexciting. They feel even more isolated because math departments are not talking about the practical computation side anymore. They're talking about abstract math now. It must be tough. There is a silver lining.
He does talk about some good stuff in a minute so just hold on. "There's a second question which is asked with increasing frequency. What's new in numerical analysis? This is invariably delivered in such a manner, as to leave no doubt that the questioner's answer is nothing. Of course, everything in computerology is new. That is at once its attraction and its weakness.
"Only recently, I learned that computers are revolutionizing astrology. Horoscopes by computer, it's certainly never been done before. I understand that it is very remunerative. Seriously though, it was not to be expected that numerical analysis would be turned upside down in the course of a decade or two."
People notice that numerical analysis is slower than the other parts of computer science. Again, my own personal experience is with the language Clojure. Clojure takes a very slow, plodding, methodical approach to change. People don't hear about it, because there's nothing new to announce. The language is not new, nothing has changed.
We haven't deprecated half of the language and made a new half to modernize it. It changes very slowly. In the last five years, not much has happened. The people who use it think, "Well, that's a good thing, my code from even 10 years ago can still run." Meanwhile, everyone thinks it's dead because it's not getting on "Hacker News" or something.
You see this contrast between, let's say, the difficulty of learning classical music, versus the ease of picking up a few chords on a guitar and playing some rock music. Classical music is hard, not because it's trying to be hard [laughs] but because it's deep. There's a lot to learn. It requires a lot of skill. It just takes a lot more effort to get into it. It's not surprising that it's not as exciting.
"Over the last 300 years, some of the finest intellects in the mathematical world have been brought to bear on the problems we are trying to solve. It is not surprising that our rate of progress cannot quite match the heady pace which is so characteristic of computerology."
Despite all the new stuff coming up all the time, when you look back over 20 years, the major developments are the things you remember; the other stuff, you've forgotten about. This is him looking back to 1950 from 1970.
"We are fortunate here in the little book written by V. N. Faddeeva, an admirably concise and accurate account of the situation as it was in 1950. A substantial part of the book is devoted to the solution of the eigenvalue problem. Scarcely any of the methods discussed there are in use today.
"In fact, as far as non-Hermitian matrices are concerned, even the methods which were advocated at the 1957 Wayne Matrix Conference have been almost completely superseded. Using a modern version of the QR algorithm, one can expect to produce an accurate eigensystem of a dense matrix of order 100 in a time which is of the order of a minute.
"At the 1957 Wayne Conference, we did not appear to be within hailing distance of such an achievement." He's talking about two different points in time, 1950, that the book showed the state of the art at the time, and in 1970, you wouldn't do that. The state of the art has progressed so much that you wouldn't use any of it anymore.
Then another point in time, 13 years prior: even the methods advocated at the 1957 conference were almost completely superseded. He's saying that we've made a lot of progress. The field is completely different from what it was 13 years ago.
"Comparable advances have been made in the development of iterative methods for solving sparse linear systems of the type arising from partial differential equations." That's cool. You look back 20 years, and you have made a lot of progress. Maybe it's not as exciting. Maybe it's the nature of matrix multiplication. It doesn't seem that fun.
"When I joined NPL in 1946, the mood of pessimism about the stability of elimination methods for solving linear systems was at its height, and was a major talking point. Bounds had been produced which purported to show that the error in the solution would be proportional to four to the nth power.
"This suggested that it would be impractical to solve systems of quite modest order. I think it was true to say that at that time, in 1946, it was the more distinguished mathematicians who were most pessimistic, the less gifted being perhaps unable to appreciate the full severity of the difficulties."
He's going back and talking about his contribution and how revolutionary it was compared to what the greatest minds were doing at that time. People thought that this was an impossible problem, one of those halting-problem kinds of things, like, "We'll never know if a program is going to stop."
If you looked at this and saw that the error is proportional to 4^n, then the more unknowns you have in your problem, the worse it gets. Every unknown you add multiplies the bound by four. That's huge growth.
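Just to see how brutal a 4^n bound is, here's a couple of lines of arithmetic (my own illustration, not from the lecture):

```python
# Each extra unknown multiplies the pessimists' error bound by 4.
for n in (10, 20, 50):
    print(n, 4 ** n)
# By n = 50 the bound is past 10**30, which is why solving systems
# of even modest order looked hopeless on paper.
```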
"While I was at the armament research department, I had an encounter with matrix computations which puzzled me a good deal. After a succession of failures, I had been presented with a system of 12 linear equations to solve. I was delighted at last at being given a problem which I knew all about, and had departed with my task confident that I would return with the solution in a very short time.
"However, when I returned to my room, confidence rapidly evaporated. The set of 144 coefficients suddenly looked very much larger than they had seemed when I was given them. I finally decided to use Gaussian elimination with what would now be called partial pivoting."
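Here is a minimal sketch of Gaussian elimination with partial pivoting in Python, on a made-up 3-by-3 system rather than Wilkinson's 12-by-12 (his actual coefficients aren't given). The last step mirrors his habit of substituting the solution back into the original equations to see how it checks.

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting, then back substitution."""
    n = len(A)
    A = [row[:] for row in A]  # work on copies, keep the originals for the check
    b = b[:]
    for k in range(n):
        # Partial pivoting: bring the largest remaining |entry| in column k
        # up to the diagonal; this is what keeps the elimination stable.
        p = max(range(k, n), key=lambda i: abs(A[i][k]))
        A[k], A[p] = A[p], A[k]
        b[k], b[p] = b[p], b[k]
        for i in range(k + 1, n):
            m = A[i][k] / A[k][k]
            for j in range(k, n):
                A[i][j] -= m * A[k][j]
            b[i] -= m * b[k]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(A[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (b[i] - s) / A[i][i]
    return x

# Made-up 3x3 system whose exact solution is x = [1, 1, 2].
A = [[2.0, 1.0, 1.0], [4.0, -6.0, 0.0], [-2.0, 7.0, 2.0]]
b = [5.0, -2.0, 9.0]
x = solve(A, b)
print(x)

# Wilkinson's check: substitute back into the original equations.
residual = [sum(A[i][j] * x[j] for j in range(3)) - b[i] for i in range(3)]
print(residual)  # tiny, down at roundoff level
```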
He's given this problem. It seemed easy, but when he sat down to start, he thought, wow, this is going to be a big deal. "Anxiety about rounding errors in elimination methods had not yet reared its head. I used 10 decimal computation more as a safety precaution than because I was expecting any severe instability problems." He used 10 decimals, 10 significant digits.
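Partial pivoting, the technique he names, means that before eliminating below each diagonal entry you swap in the row whose entry in the current column is largest, which keeps every multiplier at most 1 in magnitude. Here is a minimal sketch in Python with NumPy (my own illustration, not Wilkinson's procedure verbatim):

```python
import numpy as np

def solve_partial_pivoting(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    A = A.astype(float)  # work on copies so the caller's arrays survive
    b = b.astype(float)
    n = len(b)
    for k in range(n):
        # Partial pivoting: bring the largest remaining entry in
        # column k up to the diagonal before eliminating.
        p = k + int(np.argmax(np.abs(A[k:, k])))
        if p != k:
            A[[k, p]] = A[[p, k]]
            b[[k, p]] = b[[p, k]]
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]  # |m| <= 1 thanks to the pivoting
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]
    # Back substitution on the resulting upper-triangular system.
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x
```

The row swap is the "partial pivoting" part; without it, a zero or tiny pivot on the diagonal would wreck the elimination.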
"I slowly lost figures until the final reduced equation was of the form." Then he gives the equation: 0.0000376235 times x-sub-12 equals 0.0000... He reads off these specific digits, but he's just showing it as an example.
"At this stage, I can remember thinking to myself that the computed X sub 12, derived from this relation could scarcely have more than six correct figures, even supposing that there had been no build up in rounding errors. I contemplated computing the answers to six figures only.
"However, as those of you who have worked with a desk computer will know, one tends to make fewer blunders if one adheres to a steady pattern of work. Accordingly, I computed all variables to 10 figures, though fully aware of the absurdity of doing so."
Here he was thinking there's going to be a bunch of error, and even with all this error, there can't be more than six significant digits. You can't see these numbers unless you're reading along: they're long decimal numbers, four leading zeros and then six non-zero digits, and likewise on the other side of the equals sign. You're going to solve for x.
How can you have more than six significant digits if you only started with six digits on both sides? He's going to do it to 10 anyway, just to keep a steady pattern of work. "Being by that time a well-trained computer, I substituted my solution in the original equations to see how they checked." (A "computer" then was often a person doing the computing.) He plugged the answers back in to check.
"To my astonishment, the left-hand side agreed with the given right-hand side to 10 figures. That is to the full extent of the right-hand side. That, I said to myself, was a coincidence. 11 more coincidences followed," because remember there are 12 equations so 12 unknowns. He's doing this 11 more times.
11 more coincidences followed, though perhaps not quite in rapid succession because they take a long time to calculate. Funny.
"I was completely baffled by this. I felt sure that none of the variables could have more than six correct figures, and yet the agreement was as good as it would have been if I had been given the exact answer and had then rounded it to 10 figures."
Somehow, he had this experience of doing these large computations with matrices and found that, even with all the rounding errors, plugging the answers back in reproduced the right-hand sides to 10 digits. I don't really understand the significance of this.
Apparently, this was the experience that led him to see, differently from Turing and von Neumann, that something very important was going on: even though the computed solution could have at best six correct figures, the check still came out correct to 10 digits.
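The modern explanation is backward error analysis, which Wilkinson himself went on to develop: elimination with pivoting computes the essentially exact solution of a slightly perturbed system, so the residuals are tiny even when the solution itself has few correct figures. The effect is easy to reproduce today; here's a sketch using a Hilbert matrix as a stand-in ill-conditioned system (Wilkinson's actual coefficients aren't given, so this is my own illustration):

```python
import numpy as np

n = 12
# The Hilbert matrix is a classic ill-conditioned example, standing in
# for Wilkinson's 12 equations (his coefficients aren't in the lecture).
A = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])
x_true = np.ones(n)
b = A @ x_true

x = np.linalg.solve(A, b)  # LU factorization with partial pivoting

forward_error = np.max(np.abs(x - x_true))  # error in the solution itself
residual = np.max(np.abs(A @ x - b))        # how well the equations are satisfied

print(f"forward error: {forward_error:.1e}")  # large: few correct figures
print(f"residual:      {residual:.1e}")       # tiny: near machine precision
```

Just as at Wilkinson's desk, plugging the computed solution back in reproduces the right-hand side almost to machine precision, even though the solution itself is far less accurate.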
"My taskmaster had to admit he was impressed when I claimed that I had the exact solution corresponding to a right-hand side, which differed only in the 10th figure from the given one. As you can imagine, this experience was very much in my mind when I arrived at NPL and encountered the preoccupation with the instability of elimination methods.
"I still believe that my computed answers had at best six correct figures. It was puzzling that in my only encounter with linear systems, it was the surprising accuracy of the solutions which required an explanation." It's puzzling to him as well. I am not familiar with numerical analysis.
I'm glad he's puzzled too, or he was puzzled at the time. Makes me feel a little bit better about myself, my own mathematical understanding.
"Sometime after my arrival, a system of 18 equations arrived in mathematics division. After talking around it for some time, we finally decided to abandon theorizing and to solve it. A system of 18 is surprisingly formidable, even when one has had previous experience with 12.
"We accordingly decided on a joint effort. The operation was manned by Fox, Goodwin, Turing and me. We decided on Gaussian elimination with complete pivoting." They're going to solve this giant problem together.
"History repeated itself remarkably closely. Again, the system was mildly ill-conditioned. The last equation had a coefficient of order 10^-4. The residuals were again of order 10^-10. That is of the size corresponding to the exact solution rounded to 10 decimals.
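Complete pivoting, which the four of them used, searches the entire remaining submatrix for the largest entry and swaps both a row and a column, so the variable ordering has to be tracked and undone at the end. A sketch along the same lines (again my own illustration, not their hand procedure):

```python
import numpy as np

def solve_complete_pivoting(A, b):
    """Solve A x = b by Gaussian elimination with complete pivoting:
    at each step pick the largest entry in the whole remaining
    submatrix, swapping both a row and a column."""
    A = A.astype(float)
    b = b.astype(float)
    n = len(b)
    perm = np.arange(n)  # tracks column swaps, i.e. variable reordering
    for k in range(n):
        # Largest entry anywhere in the trailing (n-k) x (n-k) block.
        sub = np.abs(A[k:, k:])
        r, c = np.unravel_index(int(np.argmax(sub)), sub.shape)
        r += k
        c += k
        A[[k, r]] = A[[r, k]]        # row swap
        b[[k, r]] = b[[r, k]]
        A[:, [k, c]] = A[:, [c, k]]  # column swap reorders the unknowns
        perm[[k, c]] = perm[[c, k]]
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]
    # Back substitution, then undo the column swaps.
    y = np.zeros(n)
    for i in range(n - 1, -1, -1):
        y[i] = (b[i] - A[i, i + 1:] @ y[i + 1:]) / A[i, i]
    x = np.zeros(n)
    x[perm] = y
    return x
```

Complete pivoting gives stronger theoretical guarantees than partial pivoting, but it costs an O(n^2) search at every step, which is why partial pivoting became the everyday default.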
"I suppose this must be regarded as a defeat for Turing since he, at that time, was a keener adherent than any of the rest of us to the pessimistic school. However, I'm sure that this experience made quite an impression on him and set him thinking afresh on the problem of rounding errors in elimination processes.
"About a year later, he produced his famous paper, rounding off errors in matrix processes, which together with the paper of John von Neumann and H. Goldstein did a great deal to dispel the gloom. The second round undoubtedly went to Turing.
"We can fairly claim today to have a reasonably complete understanding of matrix stability problems, not only in the solution of linear systems but also in the far more difficult eigenvalue problem."
I'm glad somebody has a complete understanding because I don't. I don't really get what happened there. I get the mechanics of it, but I don't understand how that can lead you to a breakthrough of understanding. Except that maybe the problem wasn't as bad as they thought. Had they not done enough matrix multiplications to see it wasn't as bad as they thought?
The theory said the error would be of order 4^n when, in fact, it still worked out to 10 decimal places. I don't know. "There are other respects in which we have not been particularly successful even in this field." He's talking about the failures in the matrix field now.
"Most important of these is a partial failure in communication. The use of algorithms and a general understanding of the stability problem has lagged much further behind developments than it should have. The basic problems of matrix computation have the advantage of simple formulations.
"I feel that the preparation of well-tested and well-documented algorithms should have advanced side by side with their development and analysis."
They didn't release enough code. They didn't have libraries of matrix operations that were well-tested and well-documented. Maybe at that time you couldn't release a library as a GitHub repo, but you could release it as a book and have people translate it into the programming language they used at their computing site. They didn't do it enough, and they might have had a bigger impact if they had.
"There are two reasons why this has not happened. One, it is a much more arduous task than was appreciated to prepare the documentation thoroughly. Two, insufficient priority has been attached to doing it." It's more difficult, and we didn't care about it as much.
"I think it is of vital importance that all the work that has been expended on the development of satisfactory algorithms should be made fully available to the people who need to use it. I would go further than this and claim that it is a social duty to see that this is achieved."
Interesting. I know that a lot of this work has now been released, and is probably ongoing, in projects like NumPy, which bring these numerical methods into popular usage through libraries in popular programming languages. I think this might be happening, or might have happened already.
"A second disquieting feature about work in the matrix field is that it has tended to be isolated from that in very closely related areas, in particular linear programming and statistical computations. Workers in linear algebra and linear programming seemed until recently to comprise almost completely disjoint sets, and this is surely undesirable.
"The standard computations required in practical statistics provide the most direct opportunities for applying the basic matrix algorithms. Yet, there's surprisingly little collaboration."
I can see how that would be a problem. I'm not sure about the current state right now in 2021, whether they're collaborating now. "I think it is primarily the duty of people working in the matrix field to make certain that their work is used in related areas. This calls for an aggressive policy."
It's actually common in these Turing lectures that they go beyond opinion into imperatives for the field. That's great. What is the field overlooking? That question is often answered in these Turing Award lectures.
"A third disappointing feature is the failure of numerical analysts to influence computer hardware and software in the way that they should. In the early days of the computer revolution, computer designers, and numerical analysts worked closely together and indeed, were often the same people.
"Now there is a regrettable tendency for numerical analysts to opt out of any responsibility for the design of the arithmetic facilities, and a failure to influence the more basic features of software."
That's interesting. I'm not sure of what happens at places like Intel, when they're working on the CPUs. I imagine that this is the same problem. On GPUs, I'm sure they're talking to the numerical analysts, because that's what they do, especially the ones that are designed for neural networks. Those do matrix multiplications all day long.
This might be a vindication of that again. I'm not sure, maybe I'm wrong. It seems that they would want to talk to these numerical analysts. "One of the main virtues of an electronic computer from the point of view of the numerical analyst is its ability to do arithmetic fast. Need the arithmetic be so bad?" That's his little last burst.
Final comments. These are the comments I read at the beginning. "I am convinced that mathematical computation has a great part to play in the future, and that its contribution will fully live up to the expectations of the great pioneers of the computer revolution."
That is the end. It's all I wanted to read from it. Read it yourself. In the notes on my site, on the page for this episode, there's a link to something I found: a bibliography page of all of his published works, plus a lot of recorded video, so you can get a feel for the man and his style of talking.
Maybe if you prefer watching video or listening, there are interviews and things like that on there to give you a better idea. I haven't watched all of it, but there could be some more Turing stories in there, which I'd be happy to listen to myself. Thank you for listening. Now I'm going to see who the next Turing Award winner is. Excuse me while I look at this.
That was 1970. Next up is John McCarthy, yay. That'll be fun. John McCarthy is the inventor of Lisp. I'll have a lot to say about that one. I have to apologize, because this one is not my field, so I don't have that much to say about it. It was very nice learning a little bit about it through both the biography and the lecture itself.
I didn't have much analysis to offer, but I found it interesting, and I hope you found it interesting too. Some episodes are going to be like that. Sometimes I'm more interested in the topic and have thought about it more, so I have more to say. Sometimes the speakers are not so clear. [laughs] When they're not clear and it's a topic I'm interested in, there's a lot to say, because I can explain what they mean. With Wilkinson being so clear, and this not being something I was familiar with, there wasn't much to say. Anyway, I hope you subscribe so you can catch the next ones. I'll sign off.
My name is Eric Normand. Thank you for listening, and as always, rock on.