The Forest for the Trees

It is slightly unnerving to discover that in spite of no particular planning on your part you have ended up living in one of the best places to live. Of course there are all sorts of caveats on that statement. If you mess with the criteria enough you can say that about any place. But when national magazines write articles about the ten best places to live and work in the United States and your city keeps showing up on the list it’s hard to deny that there is something to it.

There are several ways that this might have happened and then there is the way that it actually happened. The latter is as good a story as any. It was 1975 and I was a college student. My wife was pregnant and I needed a job. I took the civil service exam for postal worker and made top marks on it. The problem was, veterans were given a 10 point lead over non-veterans. So, someone who made a 95 on the test ended up with an adjusted score of 105 and was given preference for the job.

I looked for work for weeks but I didn’t know how to look for a job. The only jobs I’d ever had were the result of knowing someone that knew me or my parents. My job experience was somewhat limited. I had been a guitar player and gunfighter in a western theme park and I had been a probationary supply clerk for the Illinois Central Gulf Railroad thanks to my father-in-law. The gig with the railroad fell apart when I had a minor wreck and was out of work for a week.

It occurred to me that perhaps if I joined the Army I could expand upon my job skills, and at the very least I would be a veteran and thus eligible for the preferential hiring policy at the post office. I talked with the recruiter and told him that I wanted to enlist for the longest school that had training in fixing digital computer hardware. He suggested Pershing Missile Repairman and I embarked on an adventure that would lead me to Huntsville, Alabama.

I spent nine months in Pershing school where I learned to repair two different computers and related peripheral hardware. Part of that peripheral hardware was the guidance system of the Pershing missile. It was an exciting time. After I graduated from the school, I was sent to Neu-Ulm, Germany to practice my newly learned trade. After an adventurous two years there, I got sent back to Huntsville to be an instructor in the Pershing school.

After I got out of the Army, I knew I wanted a career in computers. After an abortive start with a small startup in Birmingham, I returned to Huntsville once again. I didn’t plan to live in Huntsville. There were just a lot of good jobs that required my skills with computers. I started out at Intergraph, a rapidly growing Computer Aided Drafting startup. I had several jobs in the aerospace industry including a twenty-five-year run with one of the leading airplane manufacturers.

Life has been good. But now I find myself looking around for something new. I want to use my experience with computers but I also want to explore my newly developed writing skills. I also want to change my work hours some. I’m tired of getting up before dawn to get my writing done and get to work by eight o’clock. I’ve always been more of an afternoon person anyway.

This certainly didn’t go the way I expected it to but four thirty comes early tomorrow and I’m still committed to my current job. Consequently, I don’t plan on scrapping this post, or rewriting it. I will tuck it in bed, tag it, write a title for it, and head for bed myself.

Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.

Blog Struggles Continue

Something interesting happened to me on the way to work this morning. I thought of a good idea for a blog post. I intended to jot down a reminder to myself when I got to work but I forgot. This evening when I sat down to write my blog, I couldn’t remember my idea. That is the very epitome of frustration.

I thought perhaps if I wrote about the event, it would jog my memory. So far, that hasn’t worked. I think that by the time I do any necessary errands after work, pick up dinner, and come home, I’ve run out of stamina to do anything else. My mind is a total blank.

I spent a little bit of time reading Wikipedia. I have occasionally found that I am inspired to write something by reading arbitrary articles that I find interesting there. That works better when I’m not at the end of a hard day. I’d actually prefer to write my blog post first thing in the morning. The problem is, I don’t have time to do that before I go to work.

I have started giving some serious thought to looking for another job and retiring from my current job. I have considered trying to write for a living. I’m not confident in my ability to do that yet. Jumping in the deep end on faith is a young man’s game. I may not consider myself that old but I’m definitely not a young man any longer.

I am interested in writing about the history of computers, the history of computer science, and the history of computer languages. As I browse Wikipedia and search the internet with Google, I discover there is a lot of studying to be done before I know enough about it to tell a coherent story. And after all, the most important part of history is story.

Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.

The Purposes of Computer Programming Languages

Computer programming can be viewed on many levels. Superficially it is a means for specifying how a computer should perform a given task. If that were all it was, we could simply enter a long sequence of numbers that correspond to the machine instructions to accomplish the task.

Modern computer languages go much further. They provide platforms for communicating not only the specific tasks to accomplish but also our assumptions about them, the structure of the data that they will manipulate, and the algorithms that will accomplish them.

But even beyond that, they provide a common language for people to talk to each other about programming tasks. As such, they evolve with the growth of our understanding of the activity of programming, its attributes, and the algorithms that are the final product of the activity.

To summarize, we write programs to enable the automated solution of computational problems by computers, but also to communicate with each other about these computational processes. In the interest of clarity of communication, programming languages have trended toward higher and higher levels of abstraction, with an ever increasing investment in compiler technologies that translate these abstractions into efficient executables that perform the tasks we set out to specify in the first place. It is ultimately this communication that offers their greatest benefit.

Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.

Programming Principle to Ponder

In my years as a programmer I have discovered a number of simple facts about computers that aren’t obvious at first. I thought that I’d share a few of them with you.

The first is what I call the fundamental theorem of Computer Science: in any system, you can trade processing for storage and vice versa. An example may serve to help illustrate what I mean. Say, for instance, you need a function that returns the sine of each integer between 0 and 89 degrees. You can either write an algorithm that computes the sine on demand, or you can keep an array of 90 floats preloaded with the sines of those 90 angles.

The first will be more expensive in terms of the time that it takes to return a result. The second will be more expensive in terms of the memory that it takes to store the table. The correct choice will depend on whether you need a fast answer or a memory efficient one.
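As a sketch of the tradeoff, here is a minimal Python version of both choices (the function names are mine, for illustration):

```python
import math

# Trade storage for processing: compute each sine on demand.
def sine_computed(degrees):
    return math.sin(math.radians(degrees))

# Trade processing for storage: precompute all 90 values once,
# then every call is a single array lookup.
SINE_TABLE = [math.sin(math.radians(d)) for d in range(90)]

def sine_from_table(degrees):
    return SINE_TABLE[degrees]
```

Both return the same answers; the first costs a few floating-point operations per call, while the second costs 90 floats of memory held for the life of the program.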

Another fundamental principle of programming I learned from a book entitled The Pragmatic Programmer by Andy Hunt and Dave Thomas. They call it the DRY principle and it stands for Don’t Repeat Yourself. This principle was first espoused by database gurus in the form of first normal form.

The idea is that if you store the same value in more than one place in your program, you run the risk of changing it in one of those places and forgetting to change it in the others. Keeping a single authoritative copy is a simple discipline, but it helps avoid hard-to-find bugs.
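A tiny Python illustration of the idea (the tax-rate example is mine, not from the book):

```python
# Risky: the same literal lives in two places.
# Change one and forget the other, and the program quietly disagrees with itself.
def subtotal_with_tax(price):
    return price + price * 0.07

def shipping_with_tax(cost):
    return cost + cost * 0.07

# DRY: the value is defined exactly once, and every use refers to it.
TAX_RATE = 0.07

def with_tax(amount):
    return amount * (1 + TAX_RATE)
```

When the rate changes, the DRY version has exactly one line to edit.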

One more and I’ll call it a night. It was first brought to my attention by David Heinemeier Hansson (or DHH, as he is commonly known in the community), the original architect and author of Ruby on Rails. He calls it convention over configuration. To explain, I need to describe how people handled configuration of their programs before.

There were two popular approaches. One was to specify the configuration of your program with so-called command line options. These usually consisted of symbols, either single letters or entire words, that were associated with a value for the option.

This soon got rather cumbersome if there were a lot of options. The first attempt to simplify the specification of options was to create a file with a special syntax that made it easy for a program to associate each option specifier with the value to be assigned to it. The problem was that configuration file syntaxes were often complex and error-prone. XML, for example, was a popular syntax for configuration files.

And once people started using configuration files, they proliferated, such that every new library you adopted in your program would have its own configuration file.

DHH observed that a large percentage of the things that are configured by configuration files can be established by having conventional configurations. For example, a common configuration parameter was the specification of the directory where particular files of interest to the application could be found. Instead, DHH established a default directory layout that the application used and documented it.

He asserted that software should be opinionated. It should expect things to be done a particular way and reap the benefits of simplification that these assumptions enabled.
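A sketch of the idea in Python (the directory name and function are hypothetical): the application assumes a conventional location, and configuration is only needed to depart from it.

```python
# The documented convention: templates live in ./templates.
DEFAULT_TEMPLATE_DIR = "templates"

def template_dir(override=None):
    # Most applications never pass an override; the convention just works.
    # Configuration becomes the exception rather than the rule.
    return override if override is not None else DEFAULT_TEMPLATE_DIR
```

The common case needs no configuration at all, and the unusual case remains possible.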

I think the thread that runs through these principles is that the most important thing a programmer needs to do is think about the problem they are trying to solve and the ways they can solve it, instead of what most of us do, which is to try to reuse techniques that were successful on previous projects.

This is only bad if it is done without careful thought about the project at hand. Are you trying to drive a nail with a monkey wrench? Programmers are often too quick to start coding instead of taking the time to think about the problem.

Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.

Kellie’s Professional Origin Story

When I was a teenager, I was interested in electronics. I liked listening to AM radio broadcasts from all over the country. I read Popular Electronics magazine. I built small electronic kits and took apart old inoperable television sets for their parts but mostly to learn how to solder and desolder.

When I was in my first year of college Popular Electronics published a construction article on how to build your own personal computer. I had always been intrigued by computers and avidly read science fiction books and watched science fiction TV shows and movies. I wanted a computer. But the $600 price tag was way beyond my meager student finances.

I found a Plato terminal in the library and obtained an account on it. Plato was a time-sharing system that provided computer-based instruction in everything from psychology and physics to literature and even computer programming. In particular, there were lessons on the Tutor language, the computer language in which all of the instructional material on Plato was implemented. I pursued it with great relish and wrote short animated presentations with it.

Time passed. I got married. We were extremely broke students. Inevitably my wife got pregnant and I had to look for a job. It was during a recession and I had no marketable skills. I decided to remedy that situation and spoke with an Army recruiter. I told him I wanted to enlist for the longest school that taught computers. I figured that they wouldn’t spend any more time than necessary on training and consequently the longest school would have the most content. I was right.

For the next year I learned every circuit in the commercial minicomputer that served as the ground control computer of the Pershing missile system. Along the way, I learned a little bit about programming from my course work and a lot about programming from magazines like Byte, Kilobaud, and Compute! to name just a few.

I tell this story to explain that my perspective on computers has always been two-pronged. That is, I have an appreciation for both the hardware that comprises the computer and the software that runs on it. Most people in the computer business specialize in either computer hardware or computer software. I decided early in my career that I liked to write software, but I also enjoy understanding how the hardware works so that I can make the computer do things that other people might not imagine it was capable of.

Another of my long-term interests has been Artificial Intelligence. But that is a topic best left for another post. Dinner and the weekend beckon and I have managed to fulfill my daily writing goals early today.

Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.

More Rant

As my colleague Danny Cutts pointed out in a comment on my post yesterday, I criticized the status quo in software development without making any constructive suggestions for how language selection ought to be done.

The short answer to that question is that it is a topic that sorely needs research. There are, in fact, people in academia all over the world who are investigating these issues. There are many interesting approaches. I have been impressed by the results obtained by using languages that enforce constraints against mutable data and encourage programmers to avoid side effects.

I am an admitted fan of Lisp and I think that Clojure has done the best job of all the modern Lisps of advancing the state of the art of Lisp-like languages. Not only has it made data immutable by default, it has also unified the operations upon collections of all sorts. It has also baked in thread safety to the point that it’s hard to get concurrency wrong.

And the final aspect boosting Clojure over the top in the comparison of modern Lisp implementations is the fact that it is built on top of the JVM and provides direct access to all those incredible libraries available for the Java platform. It is truly the best of both worlds.

Another language that is oft maligned but is far better than it is widely thought to be is JavaScript. It has long suffered from lack of respect, due largely to being forced out the door prematurely for marketing reasons and then forced to live with its unfortunate choices due to its widespread adoption as the universal web scripting language.

Modern implementations, Node.js on the server, and the evangelism of Douglas Crockford have all gone a long way toward improving JavaScript’s reputation, not to mention its attractiveness as a generic platform for application development.

Languages should be chosen to best address the needs of the problem domain. That is much easier said than done. We are suffering from too many options. People try to overcome that problem by artificially constraining the list of choices. Perhaps they would do better to use the prescription that the Unix community suggests (sometimes attributed to Kent Beck):

  1. Make it work.
  2. Make it right.
  3. Make it fast.

What that means is, first hack out a naive solution, sometimes referred to as the simplest thing that might work. Then, refine the solution to cover all the edge cases that you overlooked in the prototype. Finally, instrument the code to find out where it is spending most of its time and concentrate on optimizing that part of the code.
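For the last step, Python’s standard library includes a profiler that shows where the time actually goes; a minimal sketch of measuring before optimizing (the function being profiled is just a stand-in):

```python
import cProfile

def naive_sum(n):
    # "Make it work": a deliberately simple, unoptimized implementation.
    total = 0
    for i in range(n):
        total += i
    return total

# "Make it fast" starts with measurement, not guessing:
# profile the naive version to see which calls dominate the runtime.
profiler = cProfile.Profile()
profiler.enable()
naive_sum(100_000)
profiler.disable()
profiler.print_stats(sort="cumulative")
```

Only after the profile identifies the hot spot is it worth rewriting anything.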

Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.

In Which I Discover A History of the Personal Computer

Lately I have been thinking about the early days of personal computers. I was curious about the timeline of when the various computers were introduced. I had a fairly good idea about most of the early makes, but there was one I didn’t know much about: a line of computers made by a company called Ohio Scientific, originally Ohio Scientific Instruments. The reason I was interested was that it was the computer sold by the company I went to work for when I got out of the Army.

I looked Ohio Scientific up on Wikipedia and one of the references at the end of the article led me to a book called A History of the Personal Computer: The People and the Technology. Someone, hopefully with permission of the copyright holder, had converted each chapter to PDF and made it available on the web.

It has proven to be a gold mine of details about the early days of personal computing. I will be commenting on it as I read it with personal experiences that occurred contemporaneously with events described in the book. I recommend the book to anyone that is interested in the history of computers through 2001 when the book was published.

Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.