Quo Vadis?

I am building a lifestyle for myself. It is based on a practice of writing daily. I write a thousand words in my journal every morning. That is an opportunity to record things that I want to remember and to limber up my mind for the work of the day. Sometimes I undertake actual writing projects in my journal. For example, when I participate in NaNoWriMo (National Novel Writing Month), I write my daily quota of fiction in lieu of a journal entry.

This is largely a matter of schedule. If I wrote a thousand-word journal entry and seventeen hundred words a day on a novel, it would take me about two hours a day. Since I already work eight hours a day, five days a week, that would take a pretty heavy toll on my time.

My other daily practice is writing in my blog. It is a different kind of endeavor. A post is not measured in terms of how many words I write, although one usually runs between three hundred and a thousand words. Rather, it is however long it needs to be to get across whatever idea I’m exploring in that particular post.

In short, my blog is an exercise in writing coherent essays for my readers, whoever they might be. I have tried, to no avail, to decide what the theme of my blog is. It appears to be a blog about whatever interests me at the moment. The theory there is that I can’t hope to interest anyone else if my topic doesn’t interest me in the first place.

I have written about writing a lot. I have written character sketches and “still life”-like sketches of locations. I have serialized a science fiction story; to be fair, it was a draft of a story. I have written a good bit about programming, because I am passionate about it. I have written short memoirs of my youth.

Judging by the comments that I get, I have managed to capture the interest of some people. I try to pay attention to which posts get comments and write more like them. The point of this blog is to learn to write for other people to read.

I guess I’ll end with a tip of the hat to the individuals who inspired me to blog in the first place. First, there is Dave Winer, a man so committed to personal commentary that he helped invent the blog. And then there is Paul Graham, a renaissance man who taught me about lisp and startups and essay writing, all by example. Thanks to them I have found a place from which to start. Only time will tell what this blog becomes.


Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.

Love of Lisp

I have an on-again, off-again love affair with a language called lisp. It is the second-oldest high-level computer language; only Fortran is older. It is deceptively simple at its core. It wasn’t even meant to be an actual computer language when it was created. It was a notation devised by John McCarthy in 1958 to talk about Church’s lambda calculus. Shortly after he published a paper about it, one of his graduate students, Steve Russell, implemented it on a computer.

Lisp distinguishes itself by being built from a half dozen or so primitive functions, out of which the entire rest of the language can be derived. Just because it can be derived that way doesn’t mean that it should be, so most modern lisps compile to either machine code or virtual-machine byte code. This typically results in a considerable performance boost.
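
To make that concrete, here is a minimal sketch in Scheme of how derived operations fall out of the primitives. The names are my own, prefixed with my- so they don’t collide with the built-ins.

    ; Given a handful of primitives (car, cdr, cons, eq?, cond),
    ; the rest of the list machinery can be derived:
    (define (my-null? x) (eq? x '()))      ; emptiness test, from eq?
    (define (my-cadr p) (car (cdr p)))     ; "second element", from car and cdr
    (define (my-append a b)                ; concatenation, from cons and recursion
      (cond ((my-null? a) b)
            (else (cons (car a) (my-append (cdr a) b)))))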

Lisp was heralded as the language of artificial intelligence. That was probably because it had the novel property of homoiconicity. That is to say, the structure of a lisp program can be faithfully and directly represented as a data structure of the language. This gives lisp the singular ability to manipulate its own code, which was often thought to be one of the necessary, if not sufficient, capabilities for a machine that could reason about its own operation.
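
Here is a tiny illustration of what homoiconicity buys you, sketched in an R5RS-style Scheme where eval takes an explicit environment (the details vary a little from one implementation to the next):

    ; A lisp program is just a list, so code can build and run code.
    (define expr (list '+ 1 2 3))            ; the program (+ 1 2 3), held as data
    (define doubled (list '* 2 expr))        ; a new program: (* 2 (+ 1 2 3))
    (eval doubled (interaction-environment)) ; => 12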

While this was intriguing, the thing that drew me to lisp was the conciseness of expression that it facilitated. Programs that took hundreds of lines to express in other programming languages were often expressed in four or five lines of lisp.
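
As a hedged illustration of that conciseness (a toy of my own, not one of the programs I have in mind above), here is a complete sort in a handful of lines. It assumes the common filter procedure, which is built into Racket and available elsewhere via SRFI-1.

    ; Quicksort in five lines: partition around a pivot and recurse.
    (define (qsort lst)
      (cond ((null? lst) '())
            (else (let ((pivot (car lst)) (rest (cdr lst)))
                    (append (qsort (filter (lambda (x) (< x pivot)) rest))
                            (list pivot)
                            (qsort (filter (lambda (x) (>= x pivot)) rest)))))))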

Lisp was also the first dynamic language. It allows the programmer to continue writing code for execution even after the original program has been compiled and run. The distinction seemed important enough to McCarthy that he termed lisp a programming system instead of a programming language.
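
In practice that means the listener, or REPL, is always live. A hypothetical session might look like this, with definitions added and even replaced while the system runs:

    > (define (greet name) (string-append "Hello, " name))
    > (greet "world")
    "Hello, world"
    > (define (greet name) (string-append "Greetings, " name)) ; redefined on the fly
    > (greet "world")
    "Greetings, world"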

I have always found lisp an excellent tool for thinking about data, processing, and the interactions between them. Most other programming languages require a great deal of translation from the design to the finished implementation.

And so, I find myself reading and studying a book called How to Design Programs. It is a text on program design written using DrRacket, a programming environment for a language descended from the Scheme dialect of lisp. It is interesting to see the ways that the authors approach certain topics. I hope to get the chance to apply their insights by teaching a class with the book as its text.


Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.

A Web Site for Developing Web Sites

Back in January of 1993, Marc Andreessen and his team released the Mosaic web browser. It captured my imagination for several reasons. First, it brought the promise of a platform-independent means for sharing information across the internet. It was not only a hypertext system but a hypermedia system.

At that point, the network barely had enough bandwidth to support the transmission of photographs, much less video. But Mosaic didn’t restrict the type or size of content. It was built so that the browser could be extended to support new media types and protocols as they were developed. That was the inspiration for the name of the program: it was a mosaic of protocol engines and renderers.

The second feature that captured my imagination was the description of the input mechanisms provided by that early version of HTML. I reasoned that if this browser could be evolved to accept arbitrary input as well as to render new kinds of output as they were developed, then it was for all intents and purposes a platform-independent graphical user interface (GUI).

This came at a time when users were arguing over which operating system would dominate the world of desktop workstations. There were three major contenders. First, there was the Macintosh. Then the PC running Windows. Bringing up the rear were Unix and Linux, both running the X Window System.

Here, presenting itself in the guise of a humble hypertext reader, was a potential answer to the Tower of Babel situation that we found ourselves in. Realizing that vision has taken the better part of ten or fifteen years.

The technologies that made this possible are Cascading Style Sheets, JavaScript, HTML5, the Apache web server, and Node.js. These are not the only technologies that contributed to this web application platform, but they are the most significant ones.

At this point we have the means to make web development easy and platform independent, but we lack the resolve to implement a web development tool that runs in the cloud and is simple enough that mere mortals (and managers) can use it to maintain their information on the web.

There are actually several packages that come close to providing the cloud-centric development that I am talking about. The one that has captured my imagination is called XWiki. It allows you to create content interactively, using the same kind of tools that you use to browse a web site.

The place where XWiki falls short at present is in its lack of an obvious way to import a complex brand identity framework and use it as a template upon which to implement the actual content of the site. It should be possible to import content from other programs or files, as well as to create it dynamically in the framework.

I’m sure such frameworks exist. They’re just not open source or as simple to use as I would want them to be. I’m still intrigued by XWiki, but it has fallen back in my estimation of its ability to be easily extended to support the kind of web site development that I’m trying to foster.

I haven’t really talked much about my vision for this tool. That may be because I am still fleshing it out in my mind. I will give it some thought, take some notes, and tomorrow make another stab at specifying the tool that I’ve been dancing around.


Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.

Unconditional Positive Regard

My father was a professional teacher. For the most part he taught Speech, English, and Drama in high school. He was a big believer in lifelong learning. Every summer he and my mother both enrolled in graduate-level college courses. My brother and I attended summer school programs at the university school.

I remember one summer, while he was studying for a master’s degree in guidance counseling, he was taking some course or another in psychology. He would often share some of the interesting things that he learned with my mother, my brother, and me. On this occasion, he was telling us about a counseling technique that was based on showing the client unconditional positive regard.

Unconditional positive regard is not unconditional approval. Rather, it is an attitude that one takes toward the person, as distinct from the things that the person may have done. By giving the client your positive regard without making it conditional on anything that they do or refrain from doing, you open up the potential for dialog with them.

I took this technique to heart and used it in my personal relationships. Consequently, I made friends with people who were outcasts. In some cases, they were actually pathological liars; that is, they actually believed the tall tales that they told. They were so hungry for friendship that they were extremely loyal to me. The problem was that they tended to put off other people who weren’t so willing to accept them at face value.

I always used the following criterion when deciding whether to accept the things that these people told me: “Would it hurt anyone or anything to accept what they are saying?” When I say accept, I mean that I didn’t argue with them or openly contradict them. I wouldn’t stand by and let them tell lies about other people, but if they wanted to tell me that they had been taken for rides in UFOs, I was willing to take them at their word.

What I didn’t realize was that, as a result of this practice, many people came to consider me an outcast too, because I had friends who were outcasts.

I’ve been giving this a good bit of thought lately. I think it is a contributing factor in why it has taken me so long to learn accepted social behavior. Thank goodness my wife is so good at such things. She keeps me from making too many social gaffes.


Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.

The Making of a Programmer, Part IV

I finally finished my first bachelor’s degree and realized that I was going to have to change jobs to see any monetary benefit from it. I did some networking and managed to find a job working on the so-called Star Wars program, officially known as the Strategic Defense Initiative, or SDI. I was part of a group that had a charter to apply Artificial Intelligence technology. The trouble was that we didn’t have any projects that needed AI technology.

Consequently, I found myself writing networking software to support distributed simulations of SDI systems. It was very similar to the work that I had been doing. It wasn’t AI but it paid the bills.

My next career move was to take a position as a system administrator at the Huntsville Operations Support Center (HOSC) at Marshall Space Flight Center (MSFC). This was a very challenging but rewarding time in my career. The principal challenge was that we were grossly understaffed. By the time I got that message across to management, I was totally burned out. Reluctantly, I took a job with a less demanding schedule.

Then I got a rare opportunity to join an Artificial Intelligence laboratory at a major aerospace company. I jumped at it. I haven’t done much AI work, but I have had a challenging, varied, and ultimately rewarding career there. I had no idea I would still be there twenty-five years later.


Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.

The Making of a Programmer, Part III

I worked at Intergraph for six years. When I started there, it was a startup in its exponential growth phase. It was exhilarating to come to work every day. I was working on cutting-edge technology. My first job at Intergraph was as a technician. I tested and repaired specialized computers that were optimized to search graphics files for patterns. My Army training had prepared me for just such a job.

I enjoyed my work for six months or so. Then I got tired of working with broken computers all the time. One of the engineers discovered that I knew how to program and gave me some tasks to do. Before I knew it, I was working for him and he was the head of the newly formed networking department.

I was given the task of writing a program to copy files from one computer to another over the network. Originally the plan was to write the program in Pascal, a high-level, structured programming language. I was uncomfortable with that. I kept finding problems with the implementation of Pascal that would have to be worked around. I finally suggested that we implement the program in assembly language.

Programs on the PDP-11 were often given three-letter names. I called my file transfer program IFM and told my boss it stood for Intergraph File Management. I told my friends the truth. It stood for It’s F%#$ing Magic.

After making IFM work between two PDP-11 computers, my next challenge was to add VAX-11 computers to the mix; that is, I had to transfer files from PDP-11s to VAXes, VAXes to PDP-11s, and VAXes to VAXes. Luckily, VAX-11 assembly language was a superset of PDP-11 assembly, so the translation went smoothly.

The next challenge came when Intergraph decided to build an intelligent desktop workstation that ran Unix. I was provided an early prototype of the workstation, serial number 8, to get file transfer working. This time the underlying file system was different from the one on the DEC machines. I had to start from scratch. I decided to use C, the system programming language made famous by Unix.

My new file transfer program was called FMU for File Management Utility. I leave it to the reader’s imagination what that actually stood for. C proved to be a powerful language and I learned to employ all kinds of heuristics to determine how to preserve the semantics of the files while transferring them from one type of file system to another.

It was during this time that I went back to college. I had over two years of credit under my belt, and the college that I was attending didn’t have a Computer Science degree program at the time. So I took Physics and Calculus and all the computer science classes that they offered. I ended up getting a Bachelor of Science degree in General Studies.

I worked full time while getting that degree. The last term before I graduated, the college announced that they were going to start offering a Computer Science degree. I asked what it would take to get a second degree in Computer Science. They looked at my transcript and said that all I would have to do was take forty more hours to establish residency and I would have the degree.

By this time I was friends with most of the professors in the Computer Science department. I arranged to teach one class and take two every term until I finished my second degree. I taught Operating Systems a number of times, along with 8086 Assembly Language and Artificial Intelligence. It was a great time, but all good things must come to an end.

One of the other colleges in the area had a number of Computer Science professors with PhDs who didn’t get tenure. The college that I was attending snapped them up and didn’t renew the contracts of any of the adjunct professors with less than a PhD. I took my second bachelor’s degree, in Computer Science, and called it a day.

Next installment I’ll talk about my experiences working on the Star Wars program, working for NASA, and landing my dream job at an AI lab.


Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.

The Making of a Programmer, Part II

When we left off, I was talking about my experiences circa 1980. I had been writing Computer Aided Instruction (CAI) software for the Army in BASIC. In particular, I was writing code for the Commodore Pet. It ran a particularly nice version of Microsoft BASIC, complete with support for both audio cassette storage and disk drives connected via the IEEE-488 GPIB interface standard.

Personal computers of this era rarely had hard drives, but those disk drives made developing software for the Pet relatively nice. It was while working there that I discovered that it was possible to write self-modifying code on the Pet. That was, to my mind anyway, a necessary, if not entirely sufficient, requisite for creating Artificial Intelligence.

During a Christmas leave we went home to Murphysboro, Illinois, to visit my parents. My dad was a high school teacher and was negotiating the teachers’ salaries for the next school year. He had access to a Radio Shack TRS-80. I wrote a BASIC program, essentially an early forerunner of a spreadsheet, that allowed him to analyze the effect of salary adjustments on the overall cost of a given proposal. He could run two or three scenarios in the time that it took the school board to analyze one. I was proud of my impromptu hack.

After I got out of the Army, I went to work for a little company in Birmingham that specialized in selling personal computers to small businesses. They were particularly appreciative of my ability to go back and forth between building and troubleshooting hardware and writing software.

My big achievement there was a program that allowed a person with a blueprint of a sheet metal part to describe the part to the computer, so that the computer could generate a paper tape to control the machine that automatically punched out the part. The paper tape was called a Numerical Control (or NC) tape, so I called my program an NC Compiler. I had to write an assembly language driver to control the paper tape punch that was hooked up to the computer.

It is important to say that I wasn’t learning how to program in a vacuum. For my entire four years in the Army, and for years afterwards, I subscribed to Byte magazine. Byte was completely devoted to personal computer hardware and software. It published schematics of hardware and listings of software. Every August it published its annual computer language special issue, which featured a different computer language each year.

Byte is where I learned about Pascal, Lisp, Smalltalk, Forth, Modula-2, Ada, Prolog, and other languages that I don’t even remember off the top of my head. It also published reviews of the various personal computer hardware and software products. It was the only magazine I ever subscribed to in which I read the advertising as diligently as I read the articles.

There were other influential computer magazines, like Kilobaud and Dr. Dobb’s, but Byte was the best of the lot. I wonder how kids today learn about computers, but then I remember that they have something that we didn’t: the internet. If you want to learn something about programming today, you have your choice of articles, books, or even videos telling you how it’s done. For that matter, you have the complete back catalogs of Byte magazine and Popular Electronics at your fingertips. Of course, they are a bit outdated now; they are interesting from a historical perspective, I guess.

When I left the small startup in Birmingham, they still owed me several months’ pay. I was finally able to negotiate a swap for some flaky computer hardware in lieu of the back wages that I had little hope of ever seeing. Subsequently, I spent many a frustrating hour investigating the operating system of the little computer by translating the numerical operation codes back into their assembly code mnemonics so that I could analyze them, a process called disassembly.

It was about this time that I decided to go back to college and finish my bachelor’s degree. In the next installment I will talk about the languages that I was learning, and some of my experiences working for Intergraph.


Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.

The Making of a Programmer

I sometimes think about the way that I learned about computers. I started out with a subscription to Popular Electronics magazine. At the time, I didn’t know much about electronics. I quickly learned about Ohm’s law, which describes the behavior of electrical current in a resistive circuit. I picked up various details about other types of circuits and components by reading articles about them.

My dad’s cousin, Jim Shrewsbury, had written an introductory book on radio. I read it cover to cover. I decided that I wanted to get an amateur radio license, so I bought a copy of the ARRL Radio Handbook. I had trouble mastering Morse code, so I didn’t obtain a license until I was an adult. But I learned a lot about electronics.

In January 1975, Popular Electronics published the first installment of a two-part article on building a small computer. Up until then, computers were large machines that took up lots of space and required lots of power. The idea of an individual owning one was rather novel. My imagination was sparked. I wanted one.

I read everything I could find about computers. I got access to a multi-user computer called Plato at the college library. When I was desperate for a job I enlisted in the Army for training in digital computer repair. After a struggle to pass basic training, I found myself in Huntsville, Alabama at Redstone Arsenal to learn how to fix the computers in the Pershing missile system.

The first part of the course taught us about digital logic, the basic building blocks of computers. Next, we learned about the ComTran-10, a small computer built specifically to teach how computers worked. I later learned that it was patterned closely after the Digital Equipment Corporation PDP-8 minicomputer. This was my first experience with writing assembly language programs for a computer.

Assembly language is the native language of a computer. It varies between computer models. Each assembly language instruction corresponds to an instruction that the central processing unit (CPU) of the computer executes natively. Once you’ve learned how to write programs in assembly language, there is, in theory, no program that you can’t write.

That is of course a bold statement. It assumes that you can break down the problem into simple enough pieces that it can be expressed in assembly language. It also assumes that when you finish breaking down the problem, you have enough hardware resources, memory, CPU time, etc., to actually execute the program.

Most modern programs are written in higher-level languages. A higher-level language is implemented by a program that translates code that is easier for humans to understand into the raw binary numbers that represent the instructions the CPU can directly execute.
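
To give the flavor of what such a translator does, here is a toy sketch in Scheme (a sketch of the idea only, nothing like a real compiler). It walks an arithmetic expression that a human can read and emits instructions for an imaginary stack machine.

    ; Translate expressions like (+ 1 (* 2 3)) into
    ; instructions for an imaginary stack machine.
    (define (compile-expr e)
      (cond ((number? e) (list (list 'push e)))        ; a literal: push it
            (else (append (compile-expr (cadr e))      ; left operand first
                          (compile-expr (caddr e))     ; then right operand
                          (list (list 'apply-op (car e))))))) ; then the operator

    ; (compile-expr '(+ 1 (* 2 3)))
    ; => ((push 1) (push 2) (push 3) (apply-op *) (apply-op +))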

The process of translation is magical. The first higher-level languages were written in assembly language. But then subsequent higher-level languages could be written in the earlier higher-level languages. In modern times, nearly the only people who actually know assembly language are the people who design CPUs and the people who write higher-level languages. (Even though the higher-level languages are written in other higher-level languages, they still have to generate assembly code as output.)

In the rest of the course, we studied the two computers that actually comprised the Pershing system, the guidance computer and the launch computer. We studied every component in the entire system. We learned to troubleshoot the system down to the individual component level. In the end, we knew how those computers worked.

When I got back to Redstone Arsenal from my tour in Germany, it was to spend the balance of my enlistment as an instructor in the school where I had learned about computers. By this time the Commodore Pet computer was being used to teach students about digital logic. The Pet was the forerunner of the famous Commodore 64 computer that many kids of a younger generation than me cut their computing teeth on. The Apple II and the Radio Shack TRS-80 were also on the market by then.

BASIC was the language that all of them were programmed in. It was the language that I used to become a journeyman programmer. It was exciting. It was more powerful than any language I’d programmed in before (with the possible exception of the Tutor language that the Plato system used). For all that, BASIC was an awful language. But it was the best thing that we had at the time.

That was a turning point in my career as a programmer and it will serve as a good stopping point in my story. In the next installment, I’ll give some examples of the kinds of programs that I wrote in BASIC and the languages that eventually replaced it.


Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.

A Word or Two About Apple

I am an unabashed Apple fanboy. I have owned Apple products for years. I am well and truly invested in the Apple ecosystem of content and applications. I still believe that Apple produces the best products that are on the market today. That’s not to say that I don’t have concerns about Apple’s future.

The Apple of today is different from the Apple of a few years ago. A few years ago they were still executing Steve Jobs’s vision. Apple is an outstanding engineering company. They have proven time and again that they can design and build the best devices on the planet.

Apple has never succeeded by competing with the other companies in their market. They have done so by defining new markets and new products. They have repeatedly imagined totally new ways to deliver wondrous experiences that have taken would-be competitors years to understand, much less replicate.

Apple still builds outstanding products. What they need is a visionary. Until they find one, the competition will continue to catch up with them until they have totally lost their edge. I hope they find one soon. I hope they return to the days of reality distortion fields and products that have to be touched and seen to be appreciated.

For now, I love my new MacBook Pro. I think the Touch Bar is entertaining if not indispensable. I think the USB-C ports are brilliant. It is without a doubt, the best computer I have ever owned. Some people insist on driving the best cars. I insist on owning the best computers. This one lives up to my standards.


Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.

Time Is a Relative Phenomenon

One of the peculiar attributes of human beings is our ability to think about time. We have a perhaps unique awareness of time passing that is based on our ability to remember events that happened before the present and to anticipate events that will happen later. Consequently, we have a clear concept of both the future and the past.

Furthermore, we divide time up into intervals, from short intervals of a second or less, through moderate intervals of minutes and hours, up to longer intervals of days, weeks, months, years, and beyond. A strange phenomenon that accompanies our ever-evolving perception of time intervals is that the older we get, the shorter a given interval seems to us. This is because our perception of time is based on comparing any given duration against the total time that we have been alive.
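
To put rough numbers on that theory (my own back-of-the-envelope model, not a result from the psychology literature), suppose the felt weight of an interval is its share of your life so far:

    perceived weight of a year ≈ 1 / age
    at age 10:  1/10 = 10% of the life you remember
    at age 50:  1/50 =  2% of the life you remember

On that model, a year at fifty really does feel five times shorter than a year at ten.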

I recently spent some time thinking about what it would be like if we could perceive time at a radically different scale, for instance at the scale of billionths of a second, or nanoseconds as they are called. That is the approximate duration of the operations that occur in the central processing unit of a computer.

I was writing a science fiction story in which one of the characters merges their mind with that of a computer. It occurred to me that one of the communication barriers to be overcome between artificially intelligent entities and humans would be their vastly different perceptions of time. As a consequence, large portions of the story that happen inside the machine take place in the blink of an eye on the human scale.

I have always had an intuition that artificial intelligence would not be created by programmers writing programs that were intended to behave in intelligent ways but rather would emerge as a federation of systems that were composed of parts that were programmed by people. Perhaps this has already happened but these artificially intelligent entities perceive time on such a vastly different scale that they are unable to imagine life at any other scale.

Imagine if you will that Sequoias were intelligent beings that perceived time on the scale of years. Would we be able to fathom their intelligence? Or even more extreme, imagine that stars are intelligent creatures that perceive time on a scale of millennia. How would we communicate with them?

The world is a strange place. The more you think about it, the more you realize that it is even stranger than you can possibly imagine. That is not to suggest that you shouldn’t try. On the contrary, stretch your imagination at every opportunity. It is what makes mankind great.


Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.