New Hosting Machine

I host this little web site at home on an Ubuntu Linux machine.  Just for kicks, I recently bought a new, beefier machine and migrated everything over from the old machine to the new one.

I use Let’s Encrypt certificates to provide HTTPS on this site.  (They’re FREE!)  When I was copying files from the old machine to the new one, I just copied the entire /etc/letsencrypt directory and went on my merry way.  Everything ran fine, and Apache was happily serving with the certificates at their specified locations.

Yesterday, however, my certificates expired.  When I ran “certbot” to renew them, I received a strange error.  First, though, I was warned that my certificates in /etc/letsencrypt/live were not symbolic links.  That was true; I had simply copied the certs from wherever they were into the “live” directory.  So I moved the certs somewhere else and made symbolic links to them.  That’s when the strange error started appearing whenever I tried to renew the certificates.  It looked something like this:

TypeError: '<' not supported between instances of 'NoneType' and 'NoneType'

Thankfully, certbot is written in Python, so I could take a look at the code.  After reading it over a bit, I figured out that the certificates really belong in the “archive” directory, with “live” holding symbolic links to them.  There were 14 numbered PEM files of each kind in the archive directory, and the latest cert was named “cert14.pem”, so I linked cert.pem in the live directory to that file.  I did the same for all four files used to support TLS, and when I reran certbot… BINGO!  All the certs were renewed and the site was back in working order.
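For my own future reference, here’s a sketch of the layout certbot ended up happy with, recreated with dummy files in a scratch directory (“example.com” is a stand-in for the real domain; the real thing lives under /etc/letsencrypt):

```shell
# Recreate the directory layout certbot expects, using a scratch
# directory and empty placeholder files.
root=$(mktemp -d)
mkdir -p "$root/archive/example.com" "$root/live/example.com"

# certbot keeps numbered copies of each file under archive/...
for f in cert chain fullchain privkey; do
    touch "$root/archive/example.com/${f}14.pem"
    # ...and live/ holds a symlink pointing at the newest numbered copy.
    ln -s "../../archive/example.com/${f}14.pem" \
          "$root/live/example.com/${f}.pem"
done

ls -l "$root/live/example.com"
```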

Long Time Since Update

It’s been a long time since I’ve written here.  During this time, I got a new job and have been working on an Android application for Soft-Pak, along with its back end.  Basically, the Mobile-Pak project has been in my hands for close to a year now.  I had never worked on an Android application prior to this job, so it’s been a bit of a steep learning curve.  My feet have also been held to the fire several times, because Mobile-Pak is critical to the operations of the customers.  If Mobile-Pak isn’t running, trash trucks don’t go out.  That’s bad.

Using REST Assured for Testing

I decided to start testing my Dropwizard RESTful interface with REST Assured.  After poking around a bit to figure out how to test my service as deployed, I settled on REST Assured, having seen it referenced in quite a few places.

After I figured out the basics of its “given/when/then” syntax and how to set headers, I wondered how to avoid setting the headers – including Content-Type – on every single request.  It turns out the way to do that is with a RequestSpecBuilder.

I also found out a little bit about JSONPath while working on this aspect of my project.
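To make that concrete, here’s roughly the shape I ended up with.  The base URI, endpoint, and JSON fields below are made up for illustration, and the rest-assured dependency is assumed to be on the classpath:

```java
import io.restassured.builder.RequestSpecBuilder;
import io.restassured.http.ContentType;
import io.restassured.path.json.JsonPath;
import io.restassured.specification.RequestSpecification;

import static io.restassured.RestAssured.given;
import static org.hamcrest.Matchers.equalTo;

public class WidgetServiceTest {

    // Build one specification with the common headers so each request
    // doesn't have to repeat Content-Type etc.
    static RequestSpecification spec = new RequestSpecBuilder()
            .setBaseUri("http://localhost:8080")   // hypothetical service
            .setContentType(ContentType.JSON)
            .addHeader("Accept", "application/json")
            .build();

    // Would be annotated @Test under JUnit.
    public void getWidgetReturnsName() {
        given().spec(spec)
        .when()
            .get("/widgets/1")                     // hypothetical endpoint
        .then()
            .statusCode(200)
            .body("name", equalTo("sprocket"));
    }

    public void jsonPathOnItsOwn() {
        // JsonPath can also be used directly on a JSON string,
        // outside of any request/response cycle.
        String json = "{\"name\": \"sprocket\", \"sizes\": [1, 2, 3]}";
        JsonPath path = new JsonPath(json);
        System.out.println(path.getString("name"));       // sprocket
        System.out.println(path.getList("sizes").size()); // 3
    }
}
```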

Investigating Dropwizard

I was over at /r/java scanning the posts when I ran into a question about Java microservices and what people are using.  Two comments mentioned Dropwizard, so I thought I’d take a look at it.

It’s similar to Spring Boot, but perhaps with a heavier emphasis on simply banging out RESTful web services.  Dropwizard uses Jetty under the covers to handle all of the socket work and whatever other servlet functionality it needs.
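The programming model is mostly plain JAX-RS.  A minimal resource of the kind Dropwizard serves through Jetty might look like this – the path and payload are made up for illustration, and dropwizard-core is assumed on the classpath:

```java
import java.util.Collections;
import java.util.Map;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;

// A tiny JAX-RS resource.  In a real Dropwizard app this class would
// be registered in Application#run() via
// environment.jersey().register(new HelloResource()).
@Path("/hello")
@Produces(MediaType.APPLICATION_JSON)
public class HelloResource {
    @GET
    public Map<String, String> sayHello() {
        // Jackson serializes this map as {"greeting": "hello"}.
        return Collections.singletonMap("greeting", "hello");
    }
}
```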

Within the Reddit thread, there’s also quite a bit of talk about Docker and how people are using it.  Dropwizard introduced me to the idea of “fat jars”, which I’d never heard of previously – and which seem, in a way, to compete with using Docker to package up your application.  Basically, a fat jar is one giant jar containing all of the class files from every dependency of your application, alongside your own.  Here’s a web resource on the topic from 2012.
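If you’re wondering how a fat jar actually gets built, one common approach in Maven-land is the Shade plugin – roughly the pom.xml fragment below (the plugin version and main class here are illustrative, not something from my project):

```xml
<!-- pom.xml fragment: merge every dependency's classes into one jar -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <transformers>
          <!-- Set the entry point in the merged jar's manifest -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
            <mainClass>com.example.app.Main</mainClass>
          </transformer>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>
```

After `mvn package`, the result runs with a plain `java -jar`, no classpath wrangling required.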

Doing Some Coursera Work

I’ve enrolled myself in “Algorithms, Part I” from Princeton University on Coursera.  I’ve put a hold on the textbook through my San Diego Public Library iOS app and expect it to be at my local branch by the weekend.  What a time we live in!

I hope to go all the way through the course and not get distracted by the many other topics which interest me.  Gotta stay on course!

Hadoop Time

It’s finally time for me to look into Hadoop a little more closely.

I cloned the Hadoop 2.7.3 repo and built it without too much difficulty.  I even got the daemon running locally in single-node mode.  After running through a few of the examples, I wondered what Amazon Web Services had to offer for running a Hadoop cluster.  That’s when I discovered Amazon EMR.  After looking into the documentation – which is as excellent as I expected from the Amazon folks – I decided I would run through their example and incur the $1.05 or so it would cost to bring up a cluster and play with it.  Hopefully another post will be forthcoming on the experience…
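For reference, my local run looked roughly like the standalone example from the Hadoop setup docs (this assumes $HADOOP_HOME points at the 2.7.3 build, and the jar path matches that release):

```shell
# Run the bundled grep example over Hadoop's own config files,
# which serve as a convenient input corpus.
cd "$HADOOP_HOME"
mkdir input
cp etc/hadoop/*.xml input

# MapReduce job: find every string matching the regex in the input.
bin/hadoop jar \
    share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.3.jar \
    grep input output 'dfs[a-z.]+'

# Results land in the output directory as part-r-* files.
cat output/*
```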

Building S2N

s2n is Amazon’s new TLS/SSL implementation.

I retrieved the git repository and tried to build it, only to have it crash while running a unit test.  I put some debugging statements in the test, but they never showed up.  The test was exiting with status 139 – that’s 128 plus 11, i.e. killed by SIGSEGV – and initial searches indicated it might have been a gcc problem.

Turns out, my stack was blowing up during the initialization of an array, and I needed to increase my stack size with “ulimit -s 16284”.  An interesting little problem for me, since I haven’t worked in the C world for quite some time.
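For anyone curious, here’s a small sketch of what that exit code means and what the fix looks like (using 16384 KB as an example limit; the subshell keeps the change from leaking into your session):

```shell
# "Error 139" is the shell reporting death by signal: 128 + 11,
# where 11 is SIGSEGV.  Reproduce the status deliberately:
status=0
sh -c 'kill -s SEGV $$' || status=$?
echo "exit status: $status"

# The fix: raise the soft stack limit (ulimit -s reports kilobytes).
# Shown in a subshell here; for a real build you'd run ulimit in the
# shell that launches the tests.
limit=$( (ulimit -s 16384 2>/dev/null; ulimit -s) )
echo "stack limit inside subshell: ${limit}K"
```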