Of Gradle, Groovy, and How I’ve Come to Love Build Automation

I finally got my project at work to build using Gradle. Gradle is a build tool, something like make or Ant, except that it is implemented as a Domain Specific Language (DSL) built on top of Groovy. Groovy is a remarkable language in its own right. It is a dynamic language that compiles to Java byte code, so it runs on the Java Virtual Machine (JVM). It can freely call code written in Java, and Java code can call code written in Groovy. This gives Groovy an enormous head start in terms of the variety of libraries it can take advantage of right out of the box.

What is so great about Groovy, anyway? Well, for one thing, it is a lot less verbose than Java. You rarely need semicolons in Groovy; it usually knows where a statement ends without being told explicitly. Groovy is also good at figuring out the types of variables on its own, so you can define a variable with the def keyword and let Groovy infer its type from what you assign to it. Groovy is touted as a scripting language, and it serves in that capacity very well, but it can also be used to write very succinct and flexible object-oriented code, like Java. Another place where Groovy saves typing is with imports: all of the more commonly used library packages are imported by default.
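
To give you a taste, here is a tiny made-up snippet (not code from my project) that leans on all three of those conveniences:

    // No semicolons required; Groovy knows where each statement ends.
    def greeting = 'Hello from Groovy'   // def lets Groovy infer the type (a String here)
    def answer = 42                      // inferred as an Integer

    // java.util is one of the packages imported by default,
    // so Date can be used without an explicit import statement.
    def today = new Date()

    println "$greeting! The answer is $answer and today is $today"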

Groovy also adds a clean syntax for writing Map literals, which makes creating key/value data structures much easier. These are very useful for collecting information such as configuration parameters.
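
Here is a small made-up example of the syntax; the configuration values are invented purely for illustration:

    // A Groovy Map literal: square brackets with key: value pairs.
    def config = [host: 'localhost', port: 8080, debug: true]

    println config.host          // dot notation       -> localhost
    println config['port']       // subscript notation -> 8080

    config.timeout = 30          // adding an entry is just an assignment
    println config.size()        // -> 4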

There are lots of other neat features that Groovy brings to the table, but to get back to Gradle: it is an application, written in Groovy, specifically for managing the build process. Gradle makes the build process a lot more expressive. It is more concise while at the same time being more flexible. It is easily extended, both in an ad-hoc fashion, by writing code specific to the build at hand, and in a more general fashion, by supporting plug-ins that can be shared among many different projects.
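
To give a flavor of what that looks like, here is a minimal sketch of a build.gradle for a Groovy project. The plug-in, repository, and version numbers are placeholders for illustration, not my actual build file from work:

    // build.gradle: a minimal sketch, not my actual build file
    apply plugin: 'groovy'        // a shared plug-in that supplies compile, test, and jar tasks

    repositories {
        mavenCentral()
    }

    dependencies {
        compile 'org.codehaus.groovy:groovy-all:2.3.6'   // placeholder versions
        testCompile 'junit:junit:4.11'
    }

    // An ad-hoc task written just for this build.
    task hello {
        doLast {
            println 'Hello from Gradle'
        }
    }

Even a sketch like that shows the appeal: the whole build is described declaratively in a few lines, and anything out of the ordinary can be handled with a bit of plain Groovy code.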

Using Gradle to automate my build process has turned a tedious job into one that is as exciting for me as writing the rest of the code in my application is. If you are developing in Java or Groovy or any other language for that matter, I suggest that you give Gradle a look.

The Annual ARES Christmas Dinner

Tonight was the annual Huntsville/Madison County ARES Christmas Dinner. ARES is the Amateur Radio Emergency Service. We get together one night a week to practice setting up an emergency communication network, and we meet once a month to discuss emergency communications topics. We often have a presentation on one aspect of emergency radio or another at these monthly meetings.

Once a year, we have a Simulated Emergency Test (SET). The leadership gets together and concocts an emergency scenario, and we react to the simulation as it unfolds. It involves deploying radio operators to various served agencies, the Red Cross for example, where we help coordinate a rational response to the simulated emergency. It gives us an opportunity to see how complete our “go bags” are and to practice our skills in a more realistic environment.

Anyway, the Christmas Dinner is a once a year opportunity to get together socially and meet each other’s wives and husbands. It was a nice dinner. I enjoyed it. It was a practical demonstration that even nerds can be social upon occasion.

Another Facet of the Blogger Emerges

The theme for today seems to be Data Analysis/Data Science. Three separate times today the topic of Data Analysis has come up. As I sat down to write my blog post tonight, it struck me as interesting that the topic had woven itself through my day so thoroughly. I took it as an opportunity to introduce this facet of my interests to the readers of my blog. If this isn’t your cup of tea, check back next post. My interests are many and varied and I intend to write on all of them. In other words, if you don’t like this post, come back for the next one. It’s bound to be different.

As may already have become obvious, I’ve recently embarked on a journey of exploration of Data Analysis and Data Science. If you read the Wikipedia articles that I linked to in the first paragraph above, you will see that the field of Data Analysis is very broad and the field of Data Science is somewhat controversial. While initially they seem to be different but related fields, the more I try to characterize the difference between them, the more I realize that they have a lot in common.

I think the problem with trying to differentiate between them is that they both appeal to the naive interpretation of their names, which in both cases is incredibly broad. I am reminded of the problem that the field of Artificial Intelligence has struggled with for its entire existence, namely that there isn’t an unambiguous definition of Artificial or Intelligence that rigorously captures what the practitioners of the field intuitively understand it to mean.

Getting back to the inspiration for this post, the first time the subject came up today, I was getting a demonstration of the work that some of my colleagues were doing with the Oculus Rift and the Unity development environment. We ended up discussing the fact that the customers for whom they were developing applications had started by capturing their working data using Microsoft Office applications like Excel, Access, and PowerPoint. Over time, their data had grown so large that these applications had become unwieldy. My colleagues had taken the data that had been captured with these legacy processes and imported it into a new application, and had thus been able to provide a more unified way to manage the data.

One of the things learned along the way is that the customer had come to love their existing processes. Consequently, the application being developed to supersede those older tools had to retain much of their look and feel in order to gain acceptance from the customer. This was a very important realization. Earlier in my career, I had personal experiences where customer acceptance was never achieved because of an aversion to the perceived difficulty of learning a new interface. Thus, the first observation I gleaned about large collections of data is that the owners of the data have to be accommodated when you are trying to evolve their use of their ever-growing collection of relevant data.

A little bit later, I had a conversation with a colleague about my understanding, naive as it is at this stage, of what Data Analytics is and how it is relevant to the large aerospace company for which we both work. Strangely enough, the conversation soon turned to the fact that the first thing we, as would-be practitioners of Data Analysis, would have to do is educate the engineering and business staff about the benefits to be gained from the data that is already being collected, while at the same time being careful to respect their perspective on that data and the ways they are currently using it.

Then, when I got home and was reading my personal email, I came across a link to Big Smart Data, the blog of Max Goff. I met Max a number of years ago in Seattle while he was working for Sun Microsystems. He was a Java Evangelist, and at the time I was struggling to persuade the software developers where I worked of the benefits of “write once, run anywhere,” the battle cry of Java at the time. I followed his career as he left Sun and started a consultancy in Tennessee. Somewhere along the line, I lost track of where he was blogging. I was thrilled to find his latest blog and excited to discover that he was exploring the same aspects of big data that form the core inspiration of my interest in Data Analysis.

A former boss of mine once said something to the effect that you could tell when an AI application was going mainstream when it had a database behind it. I think there is a lot of wisdom in that observation. Access to a large data store is necessary but not sufficient for emergent AI. I believe we are on the cusp of the emergence of Artificial Intelligence, ambiguous as the definition of it may be. I believe that Big Data, Data Analysis, and Data Science are going to be instrumental in this emergence.

When I first came to work at the aforementioned big aerospace company, it was because I was given the opportunity to work in their AI laboratory. AI winter soon set in and I spent the intervening years doing what I could to help the company solve hard problems with computers. Along the way, I have worked on some amazing projects but I have always longed to pursue the goal of emergent artificial intelligence. I am beginning to see the potential for pursuing that goal arising again. Things just naturally come around in cycles. And so that was my day from the perspective of Data Analysis and Data Science.

Electron is Awesome

I finally got the current version of Netlog, my program to help me create logs of the ARES Training Net, moved over from being a web app to being a desktop app in the Electron framework. I had to require jQuery and schedule the init function to run 100 milliseconds in the future instead of depending on the apparently non-existent onReady event of the document. Figuring this out took me several minutes, but it really wasn’t that difficult at all. I suspect that getting it to run as an app on Windows and Linux will be even easier. I wouldn’t be surprised if getting it to run on Android and iOS were fairly easy as well.

I suspect there will be a bunch of applications that work this way in the near future. I might even get them to let me write an app in CoffeeScript at work. I doubt it. It’s a little bit too freewheeling for the corporate environment. I guess that’s my main problem: I’m too much of a rebel to excel in the corporate environment.

I spent all of my time yesterday learning about Photon and Electron and forgot about writing my blog post. Well, in the spirit of moving on, here is my blog post for today. Tomorrow is another day. I hope I can get my momentum back and post again tomorrow.

Pondering Blogs and Blogging

I have missed a couple of days posting here but I am not going to let that discourage me. I am determined to continue making blog posts as frequently as possible. I am writing them for two reasons. First, I am posting to become more proficient at expressing myself in writing. I find that when I write my thoughts down, I can more easily examine them and evaluate them.

The second reason I am writing is to share my thoughts with others. I have noticed that there are a few people that have subscribed to my blog. They get notified when I make a new post. I can only assume that at least some of them read what I’ve written. I’ve also set up a utility that copies my blog posts to Facebook. I’m sure that I get a few readers there on occasion.

Which brings me to the point of this post. If you have read this far, take a moment to make a comment, whether here or on Facebook. Let me know what you like about my blog. Let me know what you don’t like. I don’t guarantee to change what I post but I’m certainly interested in what you think.

If you’ve got something you’d like to discuss at length, set up your own blog, make a post, and share the link in the comments here. Establishing a conversation is another goal of my blog. I know, I said there were only two reasons, but now we’ve both learned something.

Code Practice the YouTube Way

I got my code practice in a little bit differently today. I was looking on YouTube to see what kind of videos people had posted of QRP (low power) CW (Morse code) radios. There were several videos of people building the kit that I have, the Pixie 40 meter ham radio.

Another video featured the MFJ Cub and in addition to showing how it went together, the videographer showed his first contact with it. I was able to copy most of it. He had put subtitles on the screen but I tried to copy the code without looking at the screen. I’m not quite there yet but I’m becoming confident that I can do this.

The key thing I’ve learned this time around (I’ve attempted to learn Morse code on numerous occasions) is the importance of learning the code at speed. Morse code sounds different at twenty words a minute than it does at five words a minute. At five words a minute, you get all hung up in dots and dashes. At twenty words a minute, you hear the sound of entire letters. That is when you can start to recognize the sound of entire words. And that is when you can copy code straight off the radio.

It is an exciting thing to learn a new skill. I am getting excited by the prospect of getting on the air old school with nothing but a key and headphones between me and my radio.

Ham Radio High Jinx

I have been much more active in ham radio the last couple of years, and my activity has been accelerating this last year in particular. I have purchased a quad-band radio that has allowed me to branch out and start communicating on 6 meters as well as 2 meters and 70 centimeters. I don’t have an adequate antenna to use the 10 meter capability of my radio, though, and I wouldn’t have a place to mount one on my vehicle even if I did. 10 meter antennas are longer and more massive than antennas for shorter wavelengths like 6 meters and below.

I live in a two-bedroom apartment on the ground floor. I have not figured out a way to put up an antenna indoors. Consequently, my vehicle is my radio shack. That is a major reason why I haven’t ventured into HF activity (that’s 10 meters and longer). I am building a little low power (5 watt) radio that will transmit CW (Morse code) on 40 meters. When I say building, what I really mean is that I am carrying the kit of parts around in my pocket. I need to carve out some time and start assembling it.

When I have built it, I should be able to put up a simple antenna and get on the air. That is, if I learn to send and receive Morse code. To that end, I have been attending a class every Friday night and practicing as often as possible in between. I am beginning to be able to copy code if I listen to it over and over, and I expect to start getting a lot better at it soon. I have a little over a month before the class is over, and I’d like to have my little 40 meter Pixie (the radio I was talking about) built by then. I am drawing a line in the sand here and publicly stating that I intend to have it built and tested by the end of the CW class.