More Rant

As my colleague Danny Cutts pointed out in a comment on my post yesterday, I criticized the status quo in software development without making any constructive suggestions for how language selection ought to be done.

The short answer is that this is a topic that sorely needs research. There are, in fact, people in academia all over the world who are investigating these issues, and there are many interesting approaches. I have been impressed by the results obtained by using languages that enforce constraints against mutable data and encourage programmers to avoid side effects.
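As a small illustration of what that style looks like in practice, here is a JavaScript sketch of my own (the function names are invented for the example): the second version avoids side effects by returning a new array instead of mutating the one it was given.

```javascript
// Mutating update: the caller's array is changed as a side effect,
// which is exactly what immutable-by-default languages discourage.
function addScoreMutating(scores, value) {
  scores.push(value);
  return scores;
}

// Side-effect-free update: build and return a new array,
// leaving the original untouched.
function addScore(scores, value) {
  return [...scores, value];
}

const original = [90, 85];
const updated = addScore(original, 70);
console.log(original); // [ 90, 85 ] -- unchanged
console.log(updated);  // [ 90, 85, 70 ]
```

Because nothing ever changes in place, code written this way can be shared across threads or callbacks without locking, which is the property languages like Clojure bake in by default.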

I am an admitted fan of Lisp, and I think that Clojure has done the best job of all the modern Lisps of advancing the state of the art of Lisp-like languages. Not only has it made data immutable by default, it has also unified the operations upon collections of all sorts. It has also baked in thread safety to the point that it’s hard to get concurrency wrong.

And the final aspect boosting Clojure over the top in the comparison of modern Lisp implementations is the fact that it is built on top of the JVM and provides direct access to all those incredible libraries available for the Java platform. It is truly the best of both worlds.

Another language that is oft maligned but far better than it is widely thought to be is JavaScript. It has long suffered from a lack of respect, due largely to being forced out the door prematurely for marketing reasons and then being forced to live with its unfortunate choices because of its widespread adoption as the universal web scripting language.

Modern implementations, Node.js on the server, and the evangelism of Douglas Crockford have all gone a long way toward improving JavaScript’s reputation, not to mention its attractiveness as a generic platform for application development.

Languages should be chosen to best address the needs of the problem domain. That is much easier said than done. We are suffering from too many options. People try to overcome that problem by artificially constraining the list of choices. Perhaps they would do better to use the prescription that the Unix community suggests (sometimes attributed to Kent Beck):

  1. Make it work.
  2. Make it right.
  3. Make it fast.

What that means is: first hack out a naive solution, sometimes referred to as the simplest thing that might work. Then refine the solution to cover all the edge cases that you overlooked in the prototype. Finally, instrument the code to find out where it is spending most of its time and concentrate on optimizing that part of the code.
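To make the three steps concrete, here is a toy example of my own (not from the original post), in JavaScript: a naive recursive Fibonacci is the "make it work" step, input validation is "make it right", and memoization, the kind of fix profiling would point you toward, is "make it fast".

```javascript
// 1. Make it work: the naive recursive definition.
function fibNaive(n) {
  return n < 2 ? n : fibNaive(n - 1) + fibNaive(n - 2);
}

// 2. Make it right: reject inputs the naive version silently mishandles.
// 3. Make it fast: cache results, since profiling would show the same
//    subproblems being recomputed an exponential number of times.
function fib(n, memo = new Map()) {
  if (!Number.isInteger(n) || n < 0) {
    throw new RangeError("n must be a non-negative integer");
  }
  if (n < 2) return n;
  if (memo.has(n)) return memo.get(n);
  const result = fib(n - 1, memo) + fib(n - 2, memo);
  memo.set(n, result);
  return result;
}

console.log(fib(40)); // 102334155
```

The point of the ordering is that the memoization only earns its complexity once measurement shows the naive version is actually too slow.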


Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.

A Short Rant from My Soapbox

Everyone has preferences. Chocolate over vanilla, rock music over country music, and so on. Nowhere is that more apparent than in programmers’ preferences for programming languages. Often it is for the language that they learned first, but sometimes you find a programmer who has found another language that is so much better for the task at hand that they become a fanatical advocate for it.

I am a connoisseur of programming languages, and after forty years of learning as many new languages as I could get my hands on, I have finally had a revelation. Programming languages all have strengths and weaknesses, and these vary greatly from language to language. The problems that programmers set out to solve have a wide variety of attributes. Some languages are better for solving some problems than others.

Now, any given programmer has a particular set of languages that they know and problems that they are comfortable solving. This influences their language preferences. They will, in fact, be more productive using a language that both fits the problem they are trying to solve and is familiar to them.

However, it is the mark of a master programmer to examine the problem at hand and pick the language for the solution based on the characteristics of the problem. This is particularly rare because many companies allow managers, who often have only a vague understanding of the issues involved in writing a significant piece of software, to determine which programming language the programmers will be required to use.

I tried to come up with some pithy analogy to illustrate how wrong this is but it is so uniquely and totally wrong that I could think of no analogy that came close to illustrating my point.

The root of this problem is the mistaken belief that there is one language that is best for programming all problems. I think that is fundamentally wrong. Kurt Gödel proved, somewhat paraphrased, that a sufficiently powerful formal system cannot be both consistent and complete. Programming languages are by definition consistent. Therefore, they cannot be complete, which is to say that no one computer language can express the solution to all problems.

Management means well. Most people do. They want to do their job to the best of their ability. They think that by insisting that their programmers use this or that language, they are reducing the risk of failure. In fact, they are not. They are increasing the risk of failure by limiting the domain from which solutions can be gleaned.

This is not to give the programmer carte blanche to use any language they like just because it is exciting or they are comfortable programming in it. Rather, the programmer needs to be trained to recognize the relevant attributes of problems and how to evaluate programming languages to find an appropriate one for solving them. In short, they need to be experienced professionals.

The situation is made even worse by the fact that, while we wouldn’t dream of having an automotive engineer design an airplane, we somehow have the idea that any joker can write software. It’s true. Anyone can write a program. The quality of the program they write will vary greatly with their experience and aptitude, though.

When I joined the company where I work some twenty-five years ago, managers did not have computers. Anything with a keyboard was considered secretarial work and beneath their station. This has changed as the old-school managers have retired one by one and been replaced by a spreadsheet-wielding younger generation.

I think the next big paradigm shift that needs to happen is the realization that programming is an engineering discipline deserving the respect of any of the other engineering disciplines. Not only that, but it needs to be understood that being trained in one of the other engineering disciplines doesn’t automatically make you a good programmer.


Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.

Desperately Awaiting AI

I have been reading about the history of the development of the electronic programmable computer recently. Even when you consider the development of Colossus, which wasn’t acknowledged by the UK until the 1970s, the origin of the programmable electronic computer was around 1943. That means that computers are approximately seventy-three years old. That is longer than I’ve been alive, but in terms of the maturity of the field, I think it is still early days.

We have learned a number of important principles of computing, but there are plenty more to be discovered. It would be incredibly shortsighted to think that we have discovered more than the tip of the iceberg when it comes to algorithms. The age of distributed computing has barely started. Artificial intelligence is still relegated to dank holes in the ground where it can be ignored.

That is, except for such organizations as Google, which continue the search for the elusive artificial intelligence that we have been waiting for. We’ve waited since the late fifties straight through into the eighties. It seems it was always going to arrive in approximately five years. Never have so many estimated so poorly.

We have had AI winters and AI springs. We have seen AIs in movies and television shows. We are told by some experts that AI emerged quietly in the nineties. Ray Kurzweil says it is due twenty years from now. I am running out of time. I need a Singularity sooner rather than later, or at least functional immortality so that I can outwait the coming of AI.


Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.

Anglophilia

When I got up this morning, I discovered that my wife had built a fire in the fireplace and found an app on Apple TV that allows us to watch live streams of broadcast television from around the world. We particularly enjoy British television, and so we have spent the day watching our favorite British shows as they are being shown in the UK. I realize that this brands us as utter Anglophilic geeks, but we don’t care.

This is the final straw that will result in our cutting the cable and relying on internet feeds and local broadcast feeds for all of our television. This will save us a chunk of money and, in the long run, result in our watching more of what we enjoy and less of the garbage that we watch because it’s there. It may even help inspire us to turn off the TV and do other things instead.

In any case, it has been a good day. I’ve only been out of the house once to take a bag of garbage to the dumpster. It is still cold out there. I think I’ll leave it out. The fire is still crackling and the second episode of Sherlock is on later. Stay warm, be careful, and enjoy the rest of your weekend.


Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.

Did I Mention It’s Cold Outside?

Well, it’s cold as hell in Huntsville, Alabama, and southerners don’t handle cold very well. Friday there was a prediction of between half an inch and an inch of snow, and the whole city preemptively shut down. We got nothing more than a flurry here and a flurry there, but the wind was biting. I used the time to pack up my office for my move to my new assignment at Marshall.

I ventured out today to pick up some lunch. Traffic was light. I expect most people were staying in and avoiding the cold. I enjoyed the drive after the heater warmed up. It’s been years since that has been a major issue, as the temperatures have been so mild that just being in the car was sufficient to be comfortable.

I spent the bulk of the afternoon intending to write this blog post but instead being distracted by various websites. In my defense, when I am having trouble coming up with a topic to write about, I sometimes browse around until I read something that gets me started. Today that didn’t work.

What finally got me started was noticing my two dogs in their matching pajamas napping on the couch next to me. They usually make a big deal of escorting my wife to the bathroom, but today they were content to just stay by me. It seems the floor was too cold.

Stay warm, be safe, and enjoy your weekend.


Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.

In Which I Discover A History of the Personal Computer

Lately I have been thinking about the early days of personal computers. I was curious about the timeline of when the various machines were introduced. I had a fairly good idea about most of the early makes, but there was one that I didn’t know much about: a line of computers made by a company called Ohio Scientific, originally Ohio Scientific Instruments. I was interested because it was the computer sold by the company that I went to work for when I got out of the Army.

I looked Ohio Scientific up on Wikipedia, and one of the references at the end of the article led me to a book called A History of the Personal Computer: The People and the Technology. Someone, hopefully with the permission of the copyright holder, had converted each chapter to PDF and made it available on the web.

It has proven to be a gold mine of details about the early days of personal computing. I will be commenting on it as I read it, adding personal experiences that occurred contemporaneously with events described in the book. I recommend it to anyone who is interested in the history of computers through 2001, when the book was published.


Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.

Origins of the Maker Culture

When I was in high school, I subscribed to Popular Electronics magazine. Popular Electronics was a spin-off from Popular Mechanics. Both magazines were created to feed the voracious appetites of do-it-yourself hobbyists. Popular Mechanics focused on cars, woodworking, and home repair. Popular Electronics focused on radio, TV, and high-fidelity audio equipment. Both magazines featured several construction projects in each issue.

By the time I graduated from high school, the microprocessor had been invented and the embryonic personal computing movement had started. Early computer hobbyists had to build their own computers. There were a few kits and some so-called development systems intended to help designers learn how to program the novel single-chip CPUs. There were local clubs in places like Silicon Valley and Boston where computer hobbyists showed off their creations and helped each other master this fascinating new hobby.

In Peterborough, NH, Wayne Green, a ham radio operator and magazine impresario, started a magazine called Byte: The Small Systems Journal. It covered both hardware and programming topics and featured construction articles and programming articles in every issue. More importantly, it had ads from all the various suppliers of parts, kits, assembled computers, and accessories.

An entire generation of computer hobbyists learned how to build and program personal computers reading Byte and other magazines like Kilobaud, Compute!, and Dr. Dobb’s Journal. Some focused on the hardware. Others concentrated on printing programs that could be typed directly into your personal computer. None was as pivotal in the education of computer hobbyists as Byte.

I later got a B.S. degree in Computer Science, but I learned about computers from my instructors at the Army Pershing Missile school and Byte magazine.

When I see the new generation of hobbyists programming Raspberry Pis and Arduinos and reading the new crop of educational magazines that have popped up to support them, I have hope that we will have innovative computer hobbyists in the future.

One thing that today’s computer hobbyists have that we didn’t is the internet. We had to send off for spec sheets and buy books to learn about our hobby. While that is still a productive approach, most people today make Google their first stop when they need to read up on some detail for a project.

Computer hobbyists now call themselves Makers or DIYers or even hackers, although that last term has come to have bad connotations in some circles.

I write articles like these because I think that these new hobbyists deserve the opportunity to find out about the history of their hobby. I’m not sure what kind of computers they’ll be using but I’m sure that there will be computer hobbyists for centuries to come.


Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.

Science Fiction Ideas

Those of you who have been reading this blog for any length of time know that I am both an avid follower of developing technology and an aspiring science fiction writer. Today I was discussing the possibilities for near-term space missions. In particular, we were talking about the fact that although a manned mission to Mars would be exciting, a more practical mission would be to capture an asteroid and transfer it to one of the Earth-Moon Lagrange points.

Lagrange points are places where the gravitational forces of two large bodies combine to equal the centrifugal force on a smaller body. There are five Lagrange points in such a system. The smaller body doesn’t need any propulsion to maintain its position at a Lagrange point, which makes it an ideal place to build a space station or perhaps even a colony. The material from asteroids would be the most cost-effective stuff to use to build such a colony. It is very expensive, in terms of cost per pound, to lift material from the surface of the Earth. Moving the same amount of material from place to place in space is relatively much cheaper.
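For a sense of scale, here is a back-of-the-envelope JavaScript sketch of my own. It uses the standard first-order approximation for the distance of the L1 and L2 points from the smaller body, r ≈ R·(m/3M)^(1/3); the constants are round published values, and higher-order terms shift the real L1 and L2 by a few thousand kilometers.

```javascript
// First-order approximation for the distance of the L1/L2 Lagrange
// points from the smaller of two bodies separated by distance R:
//   r ≈ R * (m / (3 * M))^(1/3)
function lagrangeDistanceKm(separationKm, massRatio) {
  return separationKm * Math.cbrt(massRatio / 3);
}

const earthMoonKm = 384400;          // mean Earth-Moon distance, km
const moonToEarthMassRatio = 0.0123; // Moon mass / Earth mass
const l1Km = lagrangeDistanceKm(earthMoonKm, moonToEarthMassRatio);
console.log(Math.round(l1Km)); // roughly 61,500 km from the Moon
```

In other words, these points sit close to the Moon itself, only a few days’ travel from Earth, which is what makes them so much more reachable than Mars.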

A mission to Mars, on the other hand, requires much longer exposure to the dangerous cosmic radiation of space, a dangerous landing on Mars, and practical isolation from the rest of humanity due to the expense and length of such missions. Where a trip to Mars takes about nine months at its shortest, a trip to a Lagrange point takes only days.

That is not to say that I don’t want us to mount missions to Mars. Rather, I think we would have a much better chance of mounting successful missions to Mars if we practiced the required skills closer to home. Also, if we can demonstrate the economic value of space with less expensive, near-Earth missions, we will be more likely to interest investors in the larger investments involved in such activities as terraforming Mars so that it could support a human colony.

The discussion turned to why we would want to colonize Mars in the first place, given that it is on the edge of the so-called Goldilocks zone where surface temperatures are tolerable for human habitation. I mused that it was the only candidate that our technology was capable of reaching within a reasonable timeframe. I mentioned that even at a substantial fraction of the speed of light, it would take decades or centuries to reach the nearest stars, making such trips effectively one way.

At that point it occurred to me that there might be a way short of superluminal drive technology. If we are able to develop functional immortality, that is, if we learn to cure all diseases and suspend or reverse aging so that the only way people die is by accident, then the length of trips between stars might become tolerable to individuals.

The discussion triggered a lot of productive ideas for possible future science fiction stories. I will add them to my list of ideas for future exploration.


Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.

Hacking is Rampant

The internet is both a wonder and a nightmare. It is hard enough to avoid being compromised by a hacking or phishing attack yourself, but now it has gotten even harder to render assistance to relatives who are less computer savvy than you are. The problem is that there are so many different ways that hackers can attack. If you take someone like me, who uses Apple computers at home and Windows computers in a corporate environment at work, it is difficult to know what to do when someone has been hacked, especially when they have been hacked with a phishing attack.

Phishing is a type of attack in which the attacker gets your computer to put up a message that looks like a legitimate error message. Then they give you a phone number to call, supposedly for a legitimate company, but in actuality it is the phone number of the hackers mounting the attack. They then take you through the process of “fixing” the problem over the phone. What they are actually doing is getting you to make your computer vulnerable to a more serious attack that will either allow them access to any sensitive information you have stored on the computer or possibly lock your data so that you can’t access it until you pay them a ransom.

When we detect a hack at work, we are told to immediately call the security team to contain it. When I got a virus on my Windows machine a long time ago, I wiped the disk and reinstalled the operating system. That was Windows 95, so you can tell how long ago it was. It is one of the reasons that I use Apple computers. They aren’t totally immune to hacking attacks, but there are far fewer of them on Macs than there are on PCs.

I am supposed to be the computer expert. It is frustrating when I have to say, “I don’t know what to do about your problem.” My wife said I should have reassured her that she hadn’t done anything wrong. I understand the sentiment but I didn’t think of it in that fashion. I was worried about telling her not to worry and it turning out that there was a reason for her to worry.

Someone more familiar with the setup of her computer is having a look at it. I am hoping that it was a tempest in a teapot and that there is nothing irreparably wrong with her computer. If not, I hope she hasn’t lost anything important. I feel so helpless.


Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.

Work Hard and Improve

Today I watched an interview with Steven Moffat. He was telling the interviewer that he was rubbish. I’m not British, so I may be missing some subtle shading of that phrase, but I took it to be sincerely self-deprecating. He then went on to say that the one thing he would take credit for is working hard. He said he had to work all that much harder to overcome the fact that he was rubbish.

Creativity is a strange quality of human intelligence. It can’t be forced, but it can be courted. You can’t sit around and wait for the muse to inspire you. Instead, you have to sit down and write as if you were already inspired. You have to fake it until you make it. If you are putting words on the page, you are a lot more likely to write something good than if you are just staring at it. Writing is hard. You have to do whatever is necessary to get something down for a first draft. Then you read what you’ve written and decide if there is anything there that you can salvage. If so, you pull it out and work on it. If not, you just keep on writing.

Creativity is more about taking a different perspective on things and using your judgement to recognize when you have written something worth improving upon. The old adage that enough monkeys typing randomly on typewriters will eventually write a Shakespearean play isn’t too far from the truth. When you sit down to write, you aren’t typing randomly, but otherwise the chances of any given session producing something worth pursuing are slim. You can improve your odds by paying attention, though.

Notice the things that work for you. Perhaps you work better while listening to music. Perhaps you prefer silence or the television playing in the background. I have found that whatever I have supplying the soundtrack to my writing process is better if it isn’t too interesting. It needs to sink beneath the wave of words flowing from my subconscious onto the page.

I also find that I have times when I have lots of ideas and other times when I can’t think of a single one. So, I try to capture the ideas on lists when they are plentiful so that I can browse through them when they aren’t. Often the act of browsing those lists will inspire new ideas to add to the list. Remember, you have two jobs: principally, to keep a stream of words flowing onto the page, but secondarily, to keep the pump primed by capturing new ideas and stockpiling them.

So, the answer to the perennial question of where a writer gets their ideas: they get them from the notes they so laboriously took when they were inspired. Or, as Robert Heinlein, a favorite science fiction writer of mine, once said, “There Ain’t No Such Thing As A Free Lunch,” sometimes abbreviated TANSTAAFL.


Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.