Behavioral Norms

A wise woman named Anne Lamott said, “Every single thing that happened to you is yours and you get to tell it.” At this point in life, I have plenty to tell. A lot of what I have to tell is about people at their most vulnerable. I have often been the quiet observer on the edge of dramatic situations. Given all that, you would expect me to understand people better. But that is far from the case.

I grew up around actors. Stage actors are good at figuring out why people do the things they do and feel the way they feel. Film or TV actors are good at remembering their lines, keeping a neutral expression on their faces, and hitting their marks. In both cases they are presenting the characters as written by the playwright.

I started life in a bassinet back stage every night. My mother was the lighting operator for a summer theater production. My father was away in the Army at the time. I don’t have any conscious memories of that summer but I expect it had some effect on my early development.

When I was seven, my mother and father and I were members of the cast and crew of another semi-professional theatrical company. The production was a play called Stars in My Crown, which told the story of the taming of the Tennessee River by the TVA. It was actually much more entertaining than it sounds. There were songs and dances and an interesting story line.

I was much too young to understand the backstage intrigue that went on nightly among the adult members of the company. Actors are often vain, emotional, and self-centered. It was fairly confusing for me. But I soaked it up like a sponge.

By the time I moved into my teenage years, I had internalized all of the archetypal artistic behavior. I participated in orchestra, choir, plays, debate, and speech competitions. I knew how to act like a tortured artistic soul. So much so that I sometimes wonder if there was any relationship between the way I acted and who I really was.

After I graduated from high school I worked in the summers at a western theme park, with gunfights, saloon shows, magic shows, and Kentucky long rifle demonstrations. The cast and crew were mostly college students working there during the summer. The typical actor behavior was the norm there as well.

I was consequently at somewhat of a disadvantage when it came to knowing what normal human relationships were like. My parents were both school teachers and led rather bohemian lifestyles, or so I thought. It turns out they were actually fairly normal.

I haven’t associated much with theater people during the last half of my life. I find that sad; I enjoyed their company. I have often thought about getting involved with local theater productions, but I know what hard work that is and can’t bring myself to commit to that level of effort on top of a full-time day job.


Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.

Teaching an Old Dog New Tricks

I’ve always been a fan of simple text editors when it comes to writing. Word processors have too many knobs. It is too tempting to get distracted by appearances and spend your writing time fiddling with formatting instead of doing what you set out to do: write.

But as I get better at maintaining my focus on writing, I find that there are tools available to writers beyond traditional word processors. These tools are designed to help writers manage the mass of words they create and easily view them in different ways.

One such tool is a program called Scrivener. It is the Cadillac of such tools in many writers’ opinions. Among its features are support for outlining, a cork board view of the elements of a piece, a version management facility, and tools to help collect references and organize them outside of the main flow of the work in progress. There are many more features, so many that learning to use them effectively is overwhelming in its own right.

I got a copy of Scrivener for Christmas last year. I’ve used it off and on since then, but as with many power tools it isn’t the first thing that comes to mind when I have a fast-approaching deadline. I usually don’t want to spend a lot of time rediscovering how to do simple things with it. Until tonight.

Tonight I was reading my email and various articles that caught my attention in some of the newsletters to which I subscribe. One of the articles described a process for writing a novel by the seat of your pants using Scrivener.

The idea was to write lots of scenes of approximately five hundred words apiece. Each scene is stored in a scene element that has an associated synopsis card. As you collect more completed scenes, you assemble them into a story using the cork board. When you have an idea for a scene, you create a synopsis card for it. When you sit down to write, you scan the cork board for unwritten scenes and choose one to work on next.
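The workflow can be modeled as a simple data structure. This is just an illustrative sketch in Python; the class and field names are my own invention, not Scrivener's actual file format.

```python
# A toy model of the scene/synopsis-card workflow described above.
from dataclasses import dataclass, field

@dataclass
class Scene:
    synopsis: str          # the index card pinned to the cork board
    text: str = ""         # the ~500-word scene itself, empty until written

    @property
    def written(self) -> bool:
        return bool(self.text)

@dataclass
class CorkBoard:
    scenes: list = field(default_factory=list)

    def add_idea(self, synopsis: str) -> None:
        """Capture an idea as a synopsis card with no text yet."""
        self.scenes.append(Scene(synopsis))

    def unwritten(self) -> list:
        """Scan the board for cards still waiting to be drafted."""
        return [s for s in self.scenes if not s.written]

board = CorkBoard()
board.add_idea("Hero finds the letter")
board.add_idea("Flashback to the flood")
board.scenes[0].text = "Five hundred words of draft..."
print(len(board.unwritten()))  # one card still unwritten
```

The point of the model is the separation it enforces: the synopsis card exists before the scene does, so collecting ideas and writing scenes become two independent activities.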

I can see how this would make writing something like a novel easier. Then as I thought about it some more I realized that it would be a good way to collect ideas for blog posts. It would help me tackle ideas that I wanted to spend time researching and polishing for maximum effect.

So here is the first attempt at teaching this old dog a new trick or two. I have high hopes for using it to take this blog to the next level. I may even achieve my long time goal of having several finished posts in the wings ready to publish on days when I am otherwise pressed for time.


Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.

Dreams Do Come True

I used to dream of owning a computer. Starting when Popular Electronics ran the article on the Altair 8800, I yearned for my own computer. As it turned out, the Altair 8800 was pretty much a hangar queen (aviation slang for a plane that spends more time in the hangar for repairs than in the air). As originally configured, the only input devices were the toggle switches on the front panel, and the only output devices were the lights that corresponded with the switches.

It didn’t take long for MITS to offer a serial card that allowed the 8800 to talk to a teletype machine. That gave it a keyboard and a printer and, on fancy teletype models, a paper tape punch/reader. Bill Gates and Paul Allen wrote the first version of Microsoft BASIC for that machine.

I followed the blossoming of the personal computer hobby, largely by reading magazines like Byte, Kilobaud, and Compute. As time passed, the market grew and there were multiple ready-made personal computers available. They always exceeded my budget by a considerable sum. Each year the capabilities of the latest models grew by factors of two or three while the cost remained essentially constant. For many years, the machine that I wanted cost approximately $1000. It was a different, more capable machine each year, but the cost was constant.

The first computer that I owned was itself somewhat of a hangar queen. It was an Ohio Scientific C1P, a 6502-based computer, as were the Apple II and the Commodore PET. The particular machine I owned was given to me by a former employer in lieu of back wages that he owed me. It had been sitting in the shop for years with hardware bugs in it that none of us had been able to totally exterminate. It was better than no computer at all, but just barely.

Soon after that, I got my first real computer system. It was a Kaypro II. It was euphemistically called a luggable computer. It was too bulky and heavy to really be considered portable. It had two floppy drives, a Z80 processor, a keyboard, and an 80-column by 25-line display, and it ran CP/M. I was ecstatic.

I’ve owned many computers since then, some of them expensive, some of them incredibly inexpensive. I have several Arduinos that cost less than $20. I own several Raspberry Pis that are in the same general price range. Cell phones are more powerful computers than corporate data center mainframe computers were in the sixties. I can only imagine what people will think of our computers in twenty years.

But when it comes right down to it, I use my current fancy Apple laptop for the same thing I used that Kaypro II for, to write programs and to write prose. I occasionally use the graphics or sound capability that the Mac has and the Kaypro didn’t. But mostly, I write. Oh, and I surf the web. But that’s a story for another blog.


Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.

Don’t It Always Seem To Go…

We are heading toward a truly mind-boggling future. It won’t happen abruptly, but like the frog placed in a pot of slowly heating water, it will happen before we are aware of what is happening. I am talking about the increasing connectivity that will evolve into technologically enabled telepathy.

We have demonstrated in the past that anything that we can imagine we can build. What is more important, not only can we build it, but there is no effective way to prevent it from being developed. Furthermore, the speed with which we can develop new technology is accelerating at an exponential rate.

This has the potential for unprecedented good, but at the same time it has almost as great a potential for unbelievable evil. We can only hope that the quality of communication increases such that empathy tips the scales in favor of good outcomes. The challenge we may find ourselves facing is how to deal with people who are empathy impaired. And who hasn’t found themselves so overwhelmed by the suffering of others that they withdraw, empathy suspended for a time as a kind of self-defense mechanism?

This leads to another concern. When we are so interconnected that we can communicate with potentially everyone else, how will we keep from being overwhelmed by the sheer volume of information flowing to, from, and through us? Will we be aware, as individuals, of the larger issues that the collective mind is contemplating, or only of the issues being considered locally?

We don’t understand how our brains organize the billions of neurons that comprise them into an organized whole that embodies a sense of self. How can we possibly conceive of the meta-self that will arise from the swarm mind enabled by a direct brain-machine interface?

And this doesn’t even acknowledge the fact that with computers in the mix as mental prostheses, we may find ourselves facing the brave new world of non-human emergent artificial general intelligence. Will we control it or will it control us? Or, more likely, will we merge with it to become an even more powerful meta-intelligence?

These sound like the crazed ravings of a science-fiction-obsessed nerd, but the fact is that if we don’t think about these issues now, we will find ourselves waking up to the emergent reality. Just as we can’t put the genie of the internet back in the bottle, we won’t be able to put the genie of high-bandwidth human-machine interfaces back in the bottle either. We need to have these conversations now, while we can still steer the outcome in a favorable direction.


Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.

The Devil is in the Details

I finally figured out how to solve the most frustrating misfeature on my MacBook Pro. I have one of last fall’s MacBook Pros, a fifteen inch model with the Touch Bar. I have loved it since it arrived early last December. All except for one thing. The default position of the Siri icon is right above the delete key. I am a touch typist and do not look at the keys while I type. When I am writing, I accidentally hit the Siri key two or three times during a typical half hour session.

When I do, my first inclination is to reach up and touch the x on the Siri window, which would make it go away if this machine had a touch screen like the iPad. Instead, I have to use the trackpad to move the cursor to the x and dismiss the Siri window. This disrupts my train of thought and slows my writing down. It frustrates me greatly.

I knew that the Touch Bar was configurable but I hadn’t bothered to figure out how. Tonight, I decided to ask my close personal friend Google about it. Within minutes I was on the Touch Bar configuration screen. I got there by pressing the Customize Control Strip button on the Keyboard panel in System Preferences. It was actually quite easy to substitute the Notification icon for the Siri icon. The Notification icon is a toggle. Touch it once, it pops up the Notification panel. Touch it again and it dismisses the panel. The Siri icon, by contrast, pops up Siri and then toggles between two modes within the Siri window on alternate touches. Problem solved.

This is the kind of detail that seldom slipped past Steve Jobs. He was the ultimate user advocate. He wanted the software to be as frustration free as possible so that he could sell the hardware for a healthy markup. The purpose of the software was to enchant the user so that they loved using the machine. It was a tactic that worked well.

Don’t get me wrong, Apple still makes the best computers around. They have the most innovative operating environments on the market. I love my MacBook Pro. They are just getting a little sloppy on a few small details here and there. They need someone with vision to relentlessly insist on the best experience for the users.

They also need to rehearse their keynotes and other special events until they are flawless. These public events are where the reality distortion field kicks in, if you’ve done your homework and rehearsed your presentation adequately. I think they need to pay more attention to writing the script. I got the impression there wasn’t a script at Monday’s WWDC keynote. Just an outline on the back of an envelope. If there was a script, it needed more work.

Apple is about design and appearances. They need to remember their strengths and play to them. And continue making excellent hardware and the best software on the market.

Disclaimer: I own Apple stock.


Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.

Shiny Happy Apple Products

Apple opened its Worldwide Developers Conference today, affectionately known as WWDC by the Apple crowd. The keynote address is legendary as the major event where revolutionary new products are announced. This year’s keynote did not disappoint.

It was not as tightly presented or executed as it was when Steve Jobs was alive and still at the helm. But it was filled with glimpses of amazing products, some available immediately but most only available to developers in beta so that they can have a head start developing their products that take advantage of the new features announced today.

Three products stand out in my mind. The first is the sneak preview of the iMac Pro, a smoking new machine available next Christmas. It isn’t really intended as a consumer machine, but there will be many professional programmers who will buy one anyway. At the high end it features an 18-core Xeon processor and a display like none ever built before. It borrows cooling technology from the MacBook Pro line to make it the coolest running desktop in its processor class. They are offering two terabytes of SSD storage as a high-end option. It is truly a breathtaking machine. And at $4999 for the entry-level machine, it is modestly priced. Did I mention it supports up to 128GB of RAM?

The second product that impressed me was iOS 11 for the iPad. They are finally turning it into a machine that can do most everything a desktop can do. It is certainly a premier appliance computer. It isn’t much of a developer’s machine, but then all the developers will be clamoring for an iMac Pro. They have added an app that lets you access the file system. They have finally made multitasking useful. And they have added drag and drop between apps. They’ve added a bunch more features, but those were the ones that stood out in my mind.

The third product that wowed me was the HomePod smart wireless speaker system. Once again, Apple has taken their time and gotten the product right instead of rushing to be the first to market with a new product. By waiting until they got it right, they have guaranteed their dominance of the category.

I know I said there were only three things that grabbed my attention, and for the most part that is true. But I do want to comment on three new libraries that they made available to developers today. The first is Metal 2, their graphics library. When combined with the other two, the Core ML machine learning framework and the ARKit augmented reality API, they have set themselves up to be the premier platform for immersive environment development. This market is growing exponentially and may single-handedly keep Macs relevant in the shadow of the huge mobile device market that accounts for most of Apple’s revenue.


Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.

Problem Solving with Modern Computer Languages

There are different ways of arriving at the same solution to a problem. You can be good at recognizing patterns and matching aspects of the problem at hand with problems that you have seen in the past. You can exhaustively enumerate the potential solutions to the problem, judging each one to decide if it solves the problem and how well. You can imagine the problem solved and work backwards toward the conditions from which you are starting. You can conduct experiments to determine what methods might be useful in solving the problem and which would not.

The list of ways to solve a problem goes on and on. The way to choose one of them depends upon how your mind works. For instance, you might be better at remembering things than you are at thinking of all the possible solutions. You may instinctively recognize patterns. Probably your style is some combination of methods. It may even vary from problem to problem.
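One of those strategies, exhaustive enumeration, is easy to make concrete. Here is a toy Python sketch of my own (the problem and function name are invented for illustration): generate every candidate solution, judge each one, and stop when one works.

```python
# The "exhaustively enumerate" strategy: try every candidate pair of
# numbers and judge each one against the goal.
from itertools import combinations

def find_pair(numbers, target):
    """Return two distinct entries of `numbers` that sum to `target`."""
    for a, b in combinations(numbers, 2):  # enumerate every candidate
        if a + b == target:                # judge: does it solve the problem?
            return (a, b)
    return None                            # no candidate worked

print(find_pair([3, 9, 14, 20], 23))  # → (3, 20)
```

Brute force like this is the least clever strategy, but it is often the first one worth trying: it is simple to write, easy to judge for correctness, and a fine baseline to measure smarter approaches against.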

I suspect that we will discover the same characteristics are evident in Artificial General Intelligence (AGI). We will find that as we automate all of the various approaches to solving problems we will be producing a catalog of techniques that will serve the AGI as well as it does us. We are already building such catalogs. They are called library packages.

Every computer language has them. Most recent languages have centralized repositories that make these packages available, and sophisticated dependency management software that assists the programmer in ensuring that the versions of the libraries loaded to support any given application are compatible with each other.
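The heart of that compatibility check is a versioning rule. As a rough sketch, here is the caret-style semantic versioning convention many package managers use (the function names here are my own; real tools like npm or Cargo are considerably more elaborate):

```python
# Minimal sketch of the compatibility rule a dependency manager applies.
# Caret-style semver: an installed version satisfies a requirement if it
# shares the same major version and is no older than the requirement.

def parse(version: str) -> tuple:
    """Split '1.2.3' into a comparable tuple (1, 2, 3)."""
    return tuple(int(part) for part in version.split("."))

def caret_compatible(required: str, installed: str) -> bool:
    req, inst = parse(required), parse(installed)
    return inst[0] == req[0] and inst >= req  # same major, no downgrade

print(caret_compatible("1.2.0", "1.4.7"))  # True: minor upgrade is fine
print(caret_compatible("1.2.0", "2.0.0"))  # False: major bump may break
```

The rule encodes a social contract: a major version bump signals breaking changes, so the tool refuses to substitute it silently, while minor and patch upgrades are presumed safe.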

Some cutting edge languages are building the package management facilities right into the syntax of the language. This acknowledges the importance of dependency management while simplifying the task for the programmer. Instead of having to look for an external dependency management package, and perhaps choose between several competing solutions, the programmer just uses the built-in facility.

Another feature of the modern computing environment is the ease with which languages interoperate with each other. Whether through a Foreign Function Interface (FFI), an Application Programming Interface (API), or some other mechanism, the modern programmer often faces an embarrassment of riches when it comes to ways of implementing commonly used library facilities.

Combined with the exponential increase in raw computing power, this means today’s programmer can spend more time coming up with abstract solutions to the problem at hand and less time trying to conform those solutions to digital representations.

My major hope is that we don’t forget how all the underlying infrastructure works. It would be a shame if we got so sophisticated that no one knew how to fix the basic system software that interacts directly with the hardware. Of course, as time goes by, that will become abstracted into the design of the hardware as well.


Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.

Gödel’s Incompleteness Theorem

One of my favorite mathematical theorems is Gödel’s Incompleteness theorem. There have been several books written on it. My favorite is Gödel, Escher, Bach: An Eternal Golden Braid by Douglas Hofstadter. Just recently one of my favorite channels on YouTube, Numberphile, posted several videos on the topic: Gödel’s Incompleteness Theorem, Gödel’s Incompleteness Theorem (extra footage 1), and Gödel’s Incompleteness Theorem (extra footage 2), all interviews with Professor Marcus du Sautoy. They do a better job than I can of describing what the theorem is, but I’ll give it a shot. If I confuse you, I refer you to the videos or Hofstadter’s book.

First, I draw your attention to the paradox represented by the following sentences:

The next sentence is true. The previous sentence is false.

If you think about it for a minute you will see that neither sentence can consistently be either true or false. If the first sentence is true, then the second sentence is true, and the second then tells us the first is false, a contradiction. If instead the first sentence is false, the second must be false, but a false second sentence means the first is true, again a contradiction. As you can see, we can’t come to a consistent conclusion about the truth or falsity of these sentences.
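The case analysis above is small enough to check mechanically. This little Python sketch (my own illustration, not part of any formal treatment) enumerates all four possible truth assignments and keeps only the consistent ones:

```python
# s1 claims "s2 is true"; s2 claims "s1 is false". An assignment is
# consistent only if each sentence's truth value matches its claim.
consistent = [
    (s1, s2)
    for s1 in (True, False)
    for s2 in (True, False)
    if s1 == s2 and s2 == (not s1)  # s1's claim holds and s2's claim holds
]
print(consistent)  # → [] : no assignment works, hence the paradox
```

The empty result is the whole point: there is no way to label the two sentences true or false without contradicting one of them.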

What Kurt Gödel did was assign every mathematical statement a unique number, so that claims about provability became claims about numbers that the system itself could express. He then constructed a statement that asserts, in effect: “there is no proof of this statement.” If there is a proof of the statement, the statement is false and the system proves a falsehood. If there isn’t a proof, the statement is true but unprovable. Either way, there is a true statement that cannot be proved within the constraints of the mathematical system.
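In the standard notation, Gödel’s self-referential sentence is usually written like this, where Prov is the arithmetic predicate “the statement with this Gödel number has a proof” and the corner brackets denote the sentence’s own Gödel number:

```latex
% G asserts its own unprovability within the formal system.
G \;\leftrightarrow\; \neg\,\mathrm{Prov}\!\left(\ulcorner G \urcorner\right)
```

The trick is that the sentence talks about itself only indirectly, through its number, which is what lets a system about arithmetic end up making a claim about its own proofs.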

This has profound philosophical implications. It implies that there are things that are true but unprovable. Another disturbing thought is that there may be no such thing as objective reality: reality may be a product of our interpretation of our sensory input, with everyone perceiving the world differently.

So then we think about some of the more practical aspects of this idea. Take, for example, programming languages. Programming languages are formal systems for expressing procedural statements, and they are subject to the same kind of analysis as other formal mathematical systems: for any particular language, there are true statements about its programs that cannot be established within it.

Taken to the extreme, all sufficiently sophisticated languages can be shown to be computationally equivalent; this is called being Turing complete. And for this entire class of languages there are well-defined questions, such as whether an arbitrary program ever halts, that no program can answer in general.

Further, it can be extrapolated that since our minds are finite electrochemical systems, there are some truths that cannot be understood by the human mind. Perhaps that is as close to a proof of the existence of a superior being as we’ll ever be able to understand.


Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.

Ancient Fairies

I’m going to write this post using the language favored by the writers of the Ancient Aliens series. I find the show always entertaining and occasionally thought provoking, but its choice of language does much to undermine any confidence I have in anything it says. Some of it may actually be true, but I will be surprised if and when we discover that it is.

What if we’ve got the source of the ancient aliens all wrong? What if, instead of the stars, they come from alternate universes? The whole concept of alternate universes sidesteps the question of how they overcome the immense distances between Earth and the stars.

Is it possible that the ancient aliens and the fairies are one and the same race? As Arthur C. Clarke observed, sufficiently advanced technology will appear as magic to sufficiently primitive peoples. Are we sufficiently primitive compared to the ancient aliens?

Did these entities reveal themselves to the administration of Dwight Eisenhower only to be suppressed by J. Edgar Hoover and the head of the CIA? How did the fiasco at Roswell even happen? Why did the US government spend seventy years denying the existence of aliens? Could it be that there were no aliens, only highly evolved humans from the future?

The key to the genre is to ask intriguing questions and propose fantastic answers while claiming that they are the only possible answers. They aren’t. They aren’t even the most likely answers, but you gloss over that fact.


Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.

The Future’s So Bright…

If I’ve learned anything from reading, and watching, science fiction all my life, it’s that anything we can imagine we can eventually accomplish. For example, in the late sixties a show called Star Trek featured hand-held communication devices that the crew of the Enterprise used to routinely communicate with each other wirelessly. OK, so this wasn’t that far-fetched. We had radio transceivers that could do that job. They were called walkie-talkies.

The thing that was missing was the infrastructure to allow the solution to scale to the point where everyone could have their own communication device, we call them cell phones, and could talk to practically anybody else on the planet. Not only that, but these cell phones now embody computers, cameras, music players, as well as no less than three radios on different bands for different purposes.

The cell phone exists because Gene Roddenberry imagined it and some engineer decided that he could build it. What’s next? We already have self driving cars. We are well on our way to building fleets of space ships with which to colonize the solar system.

Our electronics get smaller and smaller. Intel just announced a CPU chip with eighteen cores on it. And while I sit here wondering what we’ll do with eighteen cores on a single chip, another part of my brain remembers the old adage that applications expand to consume all the resources available to them. The question isn’t whether we’ll use eighteen cores, but what they will make possible.

Artificial Intelligence is another area of rapid advancement. We have been predicting the advent of Artificial General Intelligence in science fiction for many decades. Now, it seems like it is actually going to happen in the near future. We already have an entire stable of specialized AI applications: face recognition, sound recognition, and pattern recognition, to name a few.

We have Siri, Alexa, and their kin, all listening to hear what we are asking and searching the web, one of the more surprising developments of the last decade or so, for any and all knowledge known to mankind. Gone are the days of bar bets about who recorded what song and when, or who played Bill Mumy’s big sister on Lost in Space (Angela Cartwright).

In short, all of these wonders, even search engines, started life in the fertile imaginations of science fiction writers. If you want to know what new product tomorrow’s headlines are going to announce, just dig out your back issues of Fantasy & Science Fiction or Analog. At the rate we’re going, we’ll have a hard time remembering what the world was like ten years ago, much less fifty.


Sweet dreams, don’t forget to tell the ones you love that you love them, and most important of all, be kind.