I've forgotten the details and facts that I had to memorize when I was learning about President Lincoln or William Shakespeare. I used to think I had a good memory for facts, but as I get older, more and more of them seem to be leaking out of my head. That doesn't really worry me, though, because there's something much more powerful than memorization: observation.

By observing what's happening, understanding how it works, and applying that knowledge to new situations, you can build up higher-level thinking skills that generalize much better than a long list of memorized facts. You can use this power of observation to derive solutions to problems without having to remember the solution or needing to look it up. Let's look at a few examples to get a better idea of what I'm talking about.

#### The Legend of Gauss

Every time I hear this story, the teller notes that it's unclear whether it's true or not, so consider this your notice, and let's assume it's true for this discussion. Legend has it that when Carl Friedrich Gauss was in elementary school, his teacher was having a trying day and wanted to keep the class busy for a while so that she could take a break. She gave the class the task of adding the numbers from 1 to 100, figuring that this would take them quite a while.

A couple minutes later young Gauss said he was done, and the solution was 5050. Unfortunately for the teacher, Gauss had seen through the problem to the underlying pattern of summing a series of consecutive numbers and figured out that a simple equation could be used to calculate the solution without doing 99 additions:

*∑_{i=1..n} i = n·(n+1)/2*

If taken at face value, a student may think that this is a nice story to show how smart Gauss was, and now we have this equation to memorize for adding a series of numbers that follows a specific pattern, but that's not what the story is about at all. The point of the story is that Gauss made an observation about the problem as a whole, namely that if you add the first and last numbers (1+100), it's the same as adding the second and second-to-last numbers (2+99), and so on until you have 50 additions of the same number (101), which is a single multiplication. That observation allowed Gauss to sidestep the perceived difficulty of the original task.
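Gauss's observation translates directly into a few lines of code. Here's a small sketch (the function name is just for illustration):

```python
# Gauss's observation: pair the first and last numbers, the second and
# second-to-last, and so on; every pair sums to the same value.
def pairwise_sum(first, last):
    count = last - first + 1            # how many numbers are in the series
    return (first + last) * count // 2  # count/2 pairs, each summing to first+last

print(pairwise_sum(1, 100))  # → 5050
```

Note that `(first + last) * count` is always even for consecutive integers, so the integer division is exact even when the series has an odd number of terms.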

That same observation would allow you to solve similar, but not quite equivalent problems where the above equation would provide little help. What if the series was from 100 to 200 instead? Now we're not starting at 1, and even worse, there are an odd number of numbers so they don't all pair up. You could memorize another equation that works for this generalized case:

*∑_{i=k..n} i = (n-k+1)·(n+k)/2*

But you could also observe that you could start pairing up numbers with the second and last ones (101+200), get 50 additions of 301, and add 100 to the result to get 15150. Saving one addition for later neatly sidesteps the problem of having an odd number of numbers. Now what happens when the series is
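The save-one-addition trick is easy to check; a brute-force sum confirms the pairing arithmetic:

```python
# Sum 100..200: set 100 aside, pair 101 with 200, 102 with 199, and so on,
# giving 50 pairs of 301, then add the 100 back at the end.
total = 50 * (101 + 200) + 100
print(total)                           # → 15150
print(total == sum(range(100, 201)))   # → True
```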

*2 + 4 + 6 + … + 96 + 98 + 100*

Ugh. We have to memorize a different equation. But wait, do we have to? This is the same type of problem, only with half the additions. There are 25 additions of 102, so the result is 2550. We can keep making slight modifications to the series that invalidate the equation we come up with for solving it, but we can still find a pattern to solve the problem quickly if we are observant. What if the series was
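The same observation carries over unchanged, which a quick check confirms:

```python
# 2 + 4 + … + 100: the same pairing gives 25 pairs of (2 + 100) = 102.
total = 25 * (2 + 100)
print(total)                           # → 2550
print(total == sum(range(2, 101, 2)))  # → True
```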

*2 - 4 + 6 - … - 96 + 98 - 100*

or even

*1 + 2 + 3 + 4 + 5 + 2 + 3 + 4 + 5 + 6 + … + 96 + 97 + 98 + 99 + 100*
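Both of these variations still yield to observation rather than a new memorized equation. Here's one possible pair of observations, each checked against a brute-force sum:

```python
# 2 - 4 + 6 - … + 98 - 100: consecutive pairs (2 - 4), (6 - 8), … each
# collapse to -2, and there are 25 such pairs.
alternating = sum((-1) ** (k + 1) * 2 * k for k in range(1, 51))
print(alternating == 25 * -2)  # → True

# 1+2+3+4+5 + 2+3+4+5+6 + …: each five-term run starting at k sums to
# 5*(k + 2), five times its middle term.
brute = sum(sum(range(k, k + 5)) for k in range(1, 97))
print(brute == sum(5 * (k + 2) for k in range(1, 97)))  # → True
```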

#### Linear Interpolation

I have a confession to make. I cannot, for the life of me, remember the equation for linear interpolation. It comes up all the time in the work that I do, but I can never remember exactly what the equation is. It doesn't matter, though, because what I'm trying to do with linear interpolation is connect a line between two points and find another point on that line. I do remember what the equation for a line is:

*y = m·x + b*

I know how this equation works. I know that I can find the slope with

*m = (y₁ − y₀) / (x₁ − x₀)*

Then I can find the intercept by solving the line equation for *b*:

*b = y - m·x*

Now that I know the equation for the line I'm using for interpolation, I can plug the *x* I want to estimate into the equation and get *y*. It may not be quite as fast as using the linear interpolation equation, but knowing the equation for the line and knowing how to use it is more generally applicable to a wider variety of problems. This method of problem solving is useful in programming as well.
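That derivation maps directly to code. A minimal sketch (the function name is mine, not a standard API):

```python
# Interpolate by rebuilding the line through (x0, y0) and (x1, y1):
# find the slope, solve for the intercept, then evaluate y = m*x + b.
def interpolate(x0, y0, x1, y1, x):
    m = (y1 - y0) / (x1 - x0)  # slope between the two known points
    b = y0 - m * x0            # intercept, from solving y = m*x + b for b
    return m * x + b

print(interpolate(0.0, 0.0, 10.0, 5.0, 4.0))  # → 2.0
```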

#### Patterns and Algorithms

Like any technical field, programming involves memorizing a ton of stuff. Programming languages, syntax, frameworks, libraries, methodologies, etc. all require some amount of memorization that you can't really avoid. However, there is plenty of room to still be observant and take advantage of programming wisdom already gained.

Our brains are giant pattern recognition engines, and much of programming is about recognizing patterns in the problems we're trying to solve so that we can apply known solutions, maybe with more or less extensive modifications. Innovation is mostly the application of old ideas to new problems, and software is an ideal area to leverage that technique.

The general ideas of software solutions have been packaged up as patterns and algorithms. Much like equations in mathematics, the best way to use these patterns and algorithms is not to try to memorize them and apply them blindly to defined problems, but to understand how they work and be observant of how they can be modified to apply to new situations.

For example, I work a lot with DSP algorithms, but I rarely apply an algorithm directly from a book to a problem. Normally, the algorithm needs to be tweaked before it will work for the specific problem at hand, and most of the time more than one algorithm could work. By understanding how the different algorithms work, and what their strengths and weaknesses are, I can build an algorithm from multiple components that will solve the problem better.

Software design patterns should be used in a similar way. Code bases start to get bloated and convoluted when a big pattern is squeezed into a problem that's too small for it. Most of the time a simpler version of the pattern will work better, provided the programmer understands how the pattern works and how to scale it down for smaller problems. Not every program needs an abstract factory for every object instantiation or a fully implemented state pattern to manage a small state machine.
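As one illustration of scaling a pattern down: for a small state machine, a plain transition table is often all you need instead of a class per state. The states and events here are made up for the example:

```python
# A scaled-down "state pattern": a transition table instead of state classes.
# Hypothetical states/events for a simple playback controller.
TRANSITIONS = {
    ("idle", "start"): "running",
    ("running", "pause"): "paused",
    ("paused", "start"): "running",
    ("running", "stop"): "idle",
}

def step(state, event):
    # fall back to the current state for events we don't handle
    return TRANSITIONS.get((state, event), state)

state = "idle"
for event in ["start", "pause", "start", "stop"]:
    state = step(state, event)
print(state)  # → idle
```

If the machine later grows per-state behavior, the table can be promoted to the full pattern; the point is to start with the simplest form that fits.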

Understanding the fundamentals of how patterns and algorithms work also helps you to remember more of them. You may not remember exactly how they're implemented, but knowing that a pattern exists that solves a certain class of problems and how it solves those problems in general is more than half the battle. It's much easier to search for how to implement a particular pattern or algorithm than it is to search for what algorithm is needed to best solve a particular problem. With the knowledge of how patterns and algorithms work, you can be more observant about where they can be applied. Memorization is necessary for some amount of basic knowledge in any field, but observation is where the real problem solving power lies.