The #3 most important idea in Computer Science
Well, this one is probably the most controversial. But I think the #3 most important idea has something to do with the problem of exponential growth of the number of possible states we can represent as we add bits to our system.
Eric Normand: What is the third most important idea in Computer Science? My name is Eric Normand and these are my thoughts on Functional Programming. Now this, again, is not directly Functional Programming related, but it's something that I think we use a lot in Functional Programming, and so I'm going to talk about it.
What is the third most important idea? To me, it is the trie, spelled T-R-I-E, which comes from the word "retrieval." Let me frame the problem here. Our machines have millions and millions of bytes, billions of bits. Each bit doubles the number of possible states that our software, that the computer, can be in. Doubles.
This is so many doublings, it's ludicrous to imagine keeping all of that in some kind of coherent state without some help. We need serious help. Every bit that we add doubles the number of representable states.
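To put rough numbers on that doubling, here is a quick sketch (mine, not from the episode) of how fast the state count grows as bits are added:

```python
# Each added bit doubles the number of representable states,
# so n bits give 2**n possible states.
for bits in (1, 8, 64, 1_000):
    states = 2 ** bits
    # Print the bit count and the number of digits in the state count,
    # since the full number quickly becomes unreadable.
    print(f"{bits} bits -> 2**{bits}, about {len(str(states))} digits")
```

Even a single kilobit of memory already has a state count with hundreds of digits, which is the scale of the problem being described.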
One thing that we do with this exponential growth is manage it by saying, "Well, this little piece right here, these 64 bits, are going to represent a number." We chop up our bits into separate pieces, each with its own meaning. But that doesn't change the game, it just helps us organize things better and talk about the transitions between states a little better.
We still have this immensely complex system that we're trying to manage. Even if you chop it up, you still have all these little pieces that you've chopped it into that you have to manage. We need some way of counteracting that exponential increase in complexity, that doubling and doubling and doubling.
I believe that we have something to counteract the exponential increase, and it's the trie. It's an organized tree structure. The trie, T-R-I-E, that I mentioned earlier is one kind of tree, T-R-E-E, an organized structure whose branching gives us logarithmic growth.
Meaning the more stuff you add, the more efficient the system becomes per item. There is some overhead for storing stuff in a tree, but the more stuff you put in, the more that overhead is amortized logarithmically. You need less and less overhead for each new thing added to the tree.
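To make the idea concrete, here is a minimal sketch of a string trie (my illustration, assuming character-level keys). Each node branches on one character, so lookup cost is proportional to the key length, not to the number of keys stored, and keys with shared prefixes share nodes:

```python
class TrieNode:
    def __init__(self):
        self.children = {}      # maps one character to a child TrieNode
        self.terminal = False   # True if a stored key ends at this node

class Trie:
    def __init__(self):
        self.root = TrieNode()

    def insert(self, key):
        # Walk down one level per character, creating nodes as needed.
        node = self.root
        for ch in key:
            node = node.children.setdefault(ch, TrieNode())
        node.terminal = True

    def contains(self, key):
        # Lookup visits one node per character of the key,
        # regardless of how many other keys the trie holds.
        node = self.root
        for ch in key:
            if ch not in node.children:
                return False
            node = node.children[ch]
        return node.terminal
```

For example, inserting "tree" and "trie" stores both words while sharing the nodes for the common prefix "tr", which is where the amortized overhead comes from.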
That lets us organize this mountain of data, of possible states, into something coherent with very good properties. It gives us logarithmic access to any node, to any piece of information in the tree. It lets us build indexes. In Clojure, we use tries to represent entities as well.
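That logarithmic access is striking when the branching factor is wide. As an illustration (mine, not from the episode), Clojure's persistent vectors use 32-way branching, so the depth of the tree grows with the base-32 logarithm of the element count:

```python
import math

# With 32-way branching, as in Clojure's persistent vectors,
# the tree depth needed for n elements is about log base 32 of n.
for n in (32, 1_000_000, 1_000_000_000):
    depth = math.ceil(math.log(n, 32))
    print(f"{n:>13} elements -> depth {depth}")
```

A billion elements need a tree only about six levels deep, which is why this kind of access is often described as "effectively constant time."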
This is my vote for the third most important idea in Computer Science. I'd love to hear what you think, because I don't know if I've shared this with many people, and I'm not quite sure what other people's opinions are on it, or whether you've even thought about what the most important ideas in Computer Science are.
To me, Computer Science is one of those interesting things that is informing us about how we can think about the world, how it's helping us build new understanding of the world. I'm talking about the world outside of computers.
Obviously, it helps us learn about computers, but this idea that information can be organized into structures that give us logarithmic access to them, and they grow logarithmically, is a thing that we might not have thought of before we had so much information and a study of information architectures.