Yeah, I’ve Still Got It!

It has been an exhilarating day. I configured Jenkins to build a Groovy application that I wrote at work, using the Gradle build script that I wrote on Monday. Then I figured out how to import the Grails web app that I developed with NetBeans into JetBrains IntelliJ IDEA. Both of these were accomplished with minimal reference to the great oracle, the internet. This reassured me that I am still a competent developer even if I’m not twenty-something any more.

I get an adrenaline rush from figuring out these problems. The thing that was missing when I was working on the Grails web app earlier this year was a good development tool. NetBeans is a good development tool for what you pay for it (it is free), but it is designed for developing Java applications and hasn’t kept up adequately with the evolution of the JVM ecosystem of languages and developer tools. Eclipse (another free development tool) is better, but again it is very Java-centric.

IntelliJ IDEA, on the other hand, is written for paying customers. It has a consistent development model across the languages it supports and is actively evolving along with the platforms it targets. Even the free community edition outshines NetBeans and Eclipse.

The proof is in the ease with which I accomplished tasks in hours or days that had taken me weeks or months when I was developing with NetBeans. I am now firmly convinced that one should not scrimp on one’s tools. And now it is time to say good night.

My New Taylor GS Mini-e Koa Guitar

Pam bought me this Taylor guitar as an early Christmas present. It is made of rare Hawaiian Koa wood. It sounds and plays like a dream, not to mention how beautiful it is. The size is perfect for an apartment; I can comfortably sit on the couch and play it without taking up more than a single seat’s worth of space.

It has a strong, bright tone. The strings sit right above the fretboard, making it very easy to play. The intonation is perfectly in tune everywhere on the neck. I have never owned such a finely crafted guitar. It may be small, but it is loud, and if I need it to be louder, or I want to record with it, it has a built-in pickup with volume and tone controls.

Koa wood is endemic to Hawaii. The tree grows at altitudes between 330 and 7,500 feet and requires between 33 and 197 inches of annual rainfall. It is a tonewood, which is to say it is prized for the musical tone it imparts to instruments made from it.

I suppose I sound a little bit like a fanatic about it, but it’s hard to describe how it sounds to someone who hasn’t heard it played. I love my new guitar and my sweet wife who bought it for me.

Experiments with Gradle in the IntelliJ IDEA IDE

I’ve recently started using JetBrains’ IntelliJ IDEA software to develop software written in Groovy at work. It is an excellent package, well worth its modest price. I have been investigating the differences between the Ultimate edition that my employer purchased for me at work and the community edition that JetBrains offers for free. While the extra features available in the Ultimate edition are nice, I have found my experience with the community edition as good as or better than my previous experiences with Eclipse and NetBeans.

Lately, I have been experimenting with using the Gradle build tool within IDEA. I have figured a lot of things out about it, but I am having trouble getting it to build an executable jar file. I’m sure it is just a matter of configuring something correctly within the jar task, but I haven’t figured out how to do it yet.

I have learned a lot about IDEA and Gradle but I am just going to have to keep studying until I figure out how to get things to work the way that I want them to. I’ll write a post when I figure out how it’s done.

UPDATE: I figured it out. You just need to add the following lines to your build.gradle file. This includes all of your dependent jars in your executable jar. By the way, substitute the name of your main class for ‘org.me.package.Main’.

task fatJar(type: Jar) {
    manifest {
        // Tell java -jar which class to launch
        attributes 'Main-Class': 'org.me.package.Main'
    }
    // Name the artifact <project>-all.jar to set it apart from the plain jar
    baseName = project.name + '-all'
    // Unpack each dependency jar and merge its contents into this one
    from { configurations.compile.collect { it.isDirectory() ? it : zipTree(it) } }
    // Include everything the standard jar task would produce
    with jar
}
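
With that in place, running the fatJar task (gradle fatJar on the command line, or the equivalent task node in IDEA) should drop the combined jar under build/libs with a name based on your project, something like myproject-all.jar, which you can then launch with java -jar.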

A Word or Two Before I Call It a Day

Today was a busy day. I was on my own, and I got most of the things on my TODO list done. I am so tired I can barely think, but I made a commitment to blog every day, so I’m going to write something.

It has been a while since I worked so single-mindedly on a project like this. I organized the stuff that we have stashed in our spare bedroom so that I could vacuum the carpet and then shampoo it. When I finished that, I assembled two plant shelf units for my wife to use in her plant business. I also made lunch for myself and went to the store for dog food.

It doesn’t sound like I did much but it filled my day. I was planning to spend some time building a radio kit that has been riding around in my pocket for several weeks now. I don’t think I could keep my eyes open to read the instructions much less solder. At least you can’t burn yourself while writing a blog.

Of Gradle, Groovy, and How I’ve Come to Love Build Automation

I finally got my project at work to build using Gradle. Gradle is a build tool, something like make or Ant, except that it is implemented as a Domain Specific Language (DSL) built on top of Groovy. Groovy is a remarkable language in its own right. It is a dynamic language that compiles to Java bytecode, so it runs on the Java Virtual Machine (JVM). It can freely call code written in Java, and Java code can call code written in it. This gives Groovy an enormous head start in terms of the variety of libraries that it can take advantage of right out of the box.
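
To make that interoperability concrete, here is a minimal sketch of my own (not from any particular project) of Groovy using plain Java classes directly:

import java.text.SimpleDateFormat

// SimpleDateFormat is an ordinary Java class from the JDK;
// Groovy can instantiate and call it with no glue code at all
def fmt = new SimpleDateFormat('yyyy-MM-dd')
println fmt.format(new Date())   // java.util.Date, also straight from Java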

What is so great about Groovy, anyway? Well, it is a lot less verbose than Java, for one thing. You rarely need to use semicolons in Groovy; usually, it knows where the end of a statement is without your having to tell it explicitly. Another thing Groovy is good at is figuring out the types of variables without being told explicitly. This makes it easy to define a variable with the def keyword and let Groovy infer its type from what you assign to it. Groovy is touted as a scripting language, and it serves in that capacity very well, but it can also be used to write very succinct and flexible object-oriented code, like Java. Another place where Groovy saves typing is with imports: all of the more commonly used library packages are included by default.
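
Here is a small illustration of those points (the names are made up for the example):

// No semicolons, and the types are inferred from the assignments
def greeting = 'Hello, Groovy'   // inferred as a String
def answer = 42                  // inferred as an Integer
def now = new Date()             // java.util is among the default imports
println "$greeting ($answer) at $now"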

Groovy also adds a clean syntax for entering Map literals. This makes creating key/value data structures much easier; they are very useful for collecting information such as configuration parameters. There are lots of other neat features that Groovy brings to the table, but to get back to Gradle: it is an application, written in Groovy, specifically for managing the build process.
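
For instance, a set of configuration parameters can be expressed as a map literal like this (keys and values invented for the example):

// A Groovy map literal: key/value pairs in square brackets
def config = [host: 'localhost', port: 8080, debug: true]
println config.host     // entries can be read with property-style access
config.timeout = 30     // new entries can be added on the fly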

Gradle makes the build process a lot more expressive. It is more concise while at the same time being more flexible. It is easily extended, both in an ad-hoc fashion, by writing code specific to the build at hand, and in a more general fashion, through plug-ins that can be shared among many different projects.
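
As a quick sketch of the ad-hoc side (the task name and message are invented for the example), a one-off task can be written directly in build.gradle alongside the standard ones:

// An ad-hoc task defined right in the build script
task printInfo {
    doLast {
        println "Building ${project.name}"
    }
}

Running gradle printInfo executes it, while a plug-in packages up the same sort of logic so it can be applied to many builds with a single line such as apply plugin: 'groovy'.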

Using Gradle to automate my build process has turned a tedious job into one that is as exciting for me as writing the rest of the code in my application. If you are developing in Java or Groovy, or any other language for that matter, I suggest that you give Gradle a look.

The Annual ARES Christmas Dinner

Tonight was the annual Huntsville/Madison County ARES Christmas Dinner. ARES is the Amateur Radio Emergency Service. We get together one night a week to practice setting up an emergency communication network, and once a month to discuss emergency communications topics. We often have a presentation on one aspect of emergency radio or another at these monthly meetings.

Once a year, we have a Simulated Emergency Test (SET). The leadership gets together and concocts an emergency scenario, and we react to the simulation as it unfolds. It involves deploying radio operators to various served agencies, the Red Cross for example, where we help coordinate a rational response to the simulated emergency. It gives us an opportunity to see how complete our “go bags” are and to practice our skills in a more realistic environment.

Anyway, the Christmas Dinner is a once-a-year opportunity to get together socially and meet each other’s wives and husbands. It was a nice dinner. I enjoyed it. It was a practical demonstration that even nerds can be social upon occasion.

Another Facet of the Blogger Emerges

The theme for today seems to be Data Analysis/Data Science. Three separate times today the topic of Data Analysis has come up. As I sat down to write my blog post tonight, it struck me as interesting that the topic had woven itself through my day so thoroughly. I took it as an opportunity to introduce this facet of my interests to the readers of my blog. If this isn’t your cup of tea, check back next post. My interests are many and varied and I intend to write on all of them. In other words, if you don’t like this post, come back for the next one. It’s bound to be different.

As may already have become obvious, I’ve recently embarked on a journey of exploration of Data Analysis and Data Science. If you read the Wikipedia articles on those topics, you will see that the field of Data Analysis is very broad and the field of Data Science is somewhat controversial. While initially they seem to be different but related fields, the more I try to characterize the difference between them, the more I realize that they have a lot in common.

I think the problem with trying to differentiate between them is that they both appeal to the naive interpretation of their names, which in both cases is incredibly broad. I am reminded of the problem that the field of Artificial Intelligence has struggled with for its entire existence, namely that there isn’t an unambiguous definition of Artificial or Intelligence that rigorously captures what the practitioners of the field intuitively understand it to mean.

Getting back to the inspiration for this post: the first time the subject came up today, I was getting a demonstration of the work some of my colleagues were doing with the Oculus Rift and the Unity development environment. We ended up discussing the fact that the customers for whom they were developing applications had started by capturing their working data using Microsoft Office applications like Excel, Access, and PowerPoint. Over time, their data had grown so large that these applications had become unwieldy. My colleagues had taken the data captured with these legacy processes and imported it into a new application, and had thus been able to provide a more unified way to manage the data.

One of the things learned along the way was that the customer had come to love their existing processes. Consequently, the application being developed to supersede those older tools had to retain much of their look and feel in order to gain acceptance from the customer. This was a very important realization. Earlier in my career, I had personal experiences where customer acceptance was never achieved because of an aversion to the perceived difficulty of learning a new interface. Thus, the first observation I gleaned about large collections of data is that the owners of the data have to be accommodated when you are trying to evolve their use of their ever-growing collection of relevant data.

A little bit later, I had a conversation with a colleague about my understanding, naive as it is at this stage, of what Data Analytics is and how it is relevant to the large aerospace company for which we both work. Strangely enough, the conversation soon turned to the fact that the first thing we, as would-be practitioners of Data Analysis, would have to do is educate the engineering and business staff about the benefits that could accrue from the data that is already being collected, while at the same time being careful to respect their perspective on the data and the ways they are currently using it.

Then, when I got home and was reading my personal email, I came across a link to Big Smart Data, the blog of Max Goff. I met Max a number of years ago in Seattle while he was working for Sun Microsystems. He was a Java Evangelist, and at the time I was struggling to persuade the software developers where I worked of the benefits of “write once, run anywhere,” the battle cry of Java at the time. I followed his career as he left Sun and started a consultancy in Tennessee. Somewhere along the line, I lost track of where he was blogging. I was thrilled to find his latest blog and also excited that he was exploring the same aspects of big data that form the core inspiration of my interest in Data Analysis.

A former boss of mine once said something to the effect that you could tell when an AI application was going mainstream when it had a database behind it. I think there is a lot of wisdom in that observation. Access to a large data store is necessary but not sufficient for emergent AI. I believe we are on the cusp of the emergence of Artificial Intelligence, ambiguous as the definition of it may be. I believe that Big Data, Data Analysis, and Data Science are going to be instrumental in this emergence.

When I first came to work at the aforementioned big aerospace company, it was because I was given the opportunity to work in their AI laboratory. AI winter soon set in and I spent the intervening years doing what I could to help the company solve hard problems with computers. Along the way, I have worked on some amazing projects but I have always longed to pursue the goal of emergent artificial intelligence. I am beginning to see the potential for pursuing that goal arising again. Things just naturally come around in cycles. And so that was my day from the perspective of Data Analysis and Data Science.

Electron is Awesome

I finally got the current version of Netlog, my program to help me create logs of the ARES Training Net, moved over from being a web app to being a desktop app in the Electron framework. I had to require jQuery explicitly and schedule the init function to run 100 milliseconds in the future instead of depending on the apparently non-existent onReady event of the document. Figuring this out took me several minutes, but it really wasn’t that difficult at all. I suspect that getting it to run as an app on Windows and Linux will be even easier. I wouldn’t be surprised if getting it to run on Android and iOS turned out to be fairly easy as well.

I suspect there will be a bunch of applications that work this way in the near future. I might even get them to let me write an app in CoffeeScript at work. I doubt it, though; it’s a little bit too freewheeling for the corporate environment. I guess that’s my main problem: I’m too much of a rebel to excel in the corporate environment.

I spent all of my time yesterday learning about Photon and Electron and forgot about writing my blog post. Well, in the spirit of moving on, here is my blog post for today. Tomorrow is another day. I hope I can get my momentum back and post again tomorrow.

Pondering Blogs and Blogging

I have missed a couple of days posting here but I am not going to let that discourage me. I am determined to continue making blog posts as frequently as possible. I am writing them for two reasons. First, I am posting to become more proficient at expressing myself in writing. I find that when I write my thoughts down, I can more easily examine them and evaluate them.

The second reason I am writing is to share my thoughts with others. I have noticed that there are a few people that have subscribed to my blog. They get notified when I make a new post. I can only assume that at least some of them read what I’ve written. I’ve also set up a utility that copies my blog posts to Facebook. I’m sure that I get a few readers there on occasion.

Which brings me to the point of this post. If you have read this far, take a moment to make a comment, whether here or on Facebook. Let me know what you like about my blog. Let me know what you don’t like. I don’t guarantee to change what I post but I’m certainly interested in what you think.

If you’ve got something you’d like to discuss at length, set up your own blog, make a post, and share the link in the comments here. Establishing a conversation is another goal of my blog. I know, I said there were only two reasons, but now we’ve both learned something.

Code Practice the YouTube Way

I got my code practice in a little bit differently today. I was looking at YouTube to see what kind of videos they had of people with QRP (low power) CW (Morse code) radios. There were several videos of people building the kit that I have, the Pixie 40m ham radio.

Another video featured the MFJ Cub, and in addition to showing how it went together, the videographer showed his first contact with it. I was able to copy most of it. He had put subtitles on the screen, but I tried to copy the code without looking at them. I’m not quite there yet, but I’m becoming confident that I can do this.

The key thing I’ve learned this time around (I’ve attempted to learn Morse code on numerous occasions) is the importance of learning the code at speed. Morse code sounds different at twenty words a minute than it does at five words a minute. At five words a minute, you get all hung up in dots and dashes. At twenty words a minute, you hear the sound of entire letters. That is when you can start to recognize the sound of entire words. And that is when you can copy code straight off the radio.

It is an exciting thing to learn a new skill. I am getting excited by the prospect of getting on the air old school with nothing but a key and headphones between me and my radio.