Trade-Offs

Computing is all about trade-offs. The fundamental trade-off is always processing time vs. memory space. You almost always have the choice of computing things at run time or precomputing a table of expected results at compile time and looking the answer up when you need it. There are exceptions, of course. You can’t possibly anticipate every network message you will receive beforehand, just as you can’t anticipate the text a user will type into an arbitrary input field. But these interactive input cases are rarely the constraining cases in the design of high-performance software.
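To make the table-versus-computation choice concrete, here is a minimal C sketch; the function names and the bit-counting task are my own illustration, not anything prescribed above. It counts the set bits in a byte two ways: a loop that spends a few cycles on every call, and a 256-byte table that spends memory once so each call becomes a single lookup.

    #include <stdint.h>
    #include <stdio.h>

    /* Computed at run time: no extra memory, a few cycles per call. */
    static int popcount_loop(uint8_t x)
    {
        int count = 0;
        while (x) {
            count += x & 1u;
            x >>= 1;
        }
        return count;
    }

    /* Precomputed: 256 bytes of memory buys a single array lookup per call. */
    static uint8_t popcount_table[256];

    static void build_popcount_table(void)
    {
        for (int i = 0; i < 256; i++)
            popcount_table[i] = (uint8_t)popcount_loop((uint8_t)i);
    }

    int main(void)
    {
        build_popcount_table();
        uint8_t value = 0xB7;                        /* 1011 0111 -> 6 set bits */
        printf("loop:  %d\n", popcount_loop(value));
        printf("table: %d\n", popcount_table[value]);
        return 0;
    }

The same shape shows up whenever the input domain is small enough to enumerate: the loop trades time for zero extra memory, and the table trades a little memory for near-constant speed.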

Why would you care about how fast a program computes an answer? The example that immediately comes to mind is the software that controls the internal combustion engine in modern gasoline cars. There is a definite time budget for the software to decide when to engage the spark to ignite the fuel in the cylinder. If the spark is too early or too late, the results can be catastrophic.

On the other hand, why would you care about memory space? In this day and age, that is a good question. But there are still instances where resources are tightly budgeted. For instance, on spacecraft, every ounce of weight adds to the cost of launching the payload. In cases where timing is not critical, the programmer might choose to expend clock cycles to compute relatively infrequently used values rather than dedicate memory space to lookup tables. This buys added functionality at the expense of just a little more computation time.

I said that time vs. memory was the fundamental trade-off, but it is not the only one. Another common trade-off is abstraction vs. performance. Often the programmer faces the choice of programming in a higher-level language that provides features that more closely mimic the way humans think about problems. These languages often offer features specifically designed to help prevent the programmer from making subtle errors that would be hard to find.

Sometimes these languages do their magic at compile time and minimize their impact on run-time performance. Sometimes they are forced to do their checks at run time and end up impacting performance that way. Sometimes they add data to the memory footprint of the application, which also subtly impacts its performance. These languages are constantly being improved by the computer science community, to the point that these impacts are minimized.
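As a rough illustration of what a run-time check costs, here is a small C sketch of a bounds-checked array accessor of the sort a higher-level language might generate implicitly; the struct and function names are invented for this example. The checked path carries a length field in memory and pays a comparison on every access, while the raw path is just the load.

    #include <assert.h>
    #include <stddef.h>
    #include <stdio.h>

    /* Hypothetical checked view of an array: the length field is the extra
     * data in the memory footprint, the assert is the extra work per access. */
    struct int_array {
        const int *data;
        size_t     len;
    };

    static int checked_get(struct int_array a, size_t i)
    {
        assert(i < a.len);   /* run-time check: one comparison and branch */
        return a.data[i];
    }

    int main(void)
    {
        int raw[4] = { 10, 20, 30, 40 };
        struct int_array a = { raw, 4 };

        printf("checked:   %d\n", checked_get(a, 2));  /* check + load */
        printf("unchecked: %d\n", raw[2]);             /* just the load */
        return 0;
    }

When a compiler can prove the index is in range, it can drop the check entirely, which is exactly the compile-time magic described above.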

But sometimes you need to get as close to the native machine architecture as possible. In these cases, the languages of choice have long been C, C++, and assembly language. These languages give the programmer more control over how the application computes its answers, at the expense of managing more complexity and accepting less protection against subtle logic errors.

There will always be trade-offs in programming. It is the nature of the activity. Programming is so flexible, so open to whatever the programmer can imagine, that no language can do more than provide a framework within which that imagination can create the next great application.


Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.