I graduated

1.5 months ago. I guess it should feel like something? Clearly not, as I've been in no hurry to blog about it in more depth. And it only took 8.5 years :d

That is also probably the reason. I've been working almost full-time for about 4.5 years now, and my studies were already mostly complete when I started working. During that time they progressed slowly and sluggishly, but relatively surely. The two larger challenges were passing Swedish and then of course the thesis. Everything has just been one long marathon, with the goal of just reaching the finish line.

And now I did. Sure, it's an accomplishment, but is it really anything that special? The goal was always there, and reaching it didn't really come as a surprise; there was no great, overwhelming feeling upon reaching it. The race just ended, and there was little more to it than that.

* * *

But actually, there is a bit more. In order to get to that, let's talk a bit about the thesis itself. Like my Bachelor's, my Master's was about something I was researching at the time. For many years I've been interested in backend development. Now all of that is finally paying off so much functionality-wise that it's worth investing in deployment. What good is being able to write backends with little effort when making them available and easily accessible is still uncharted territory?

So for my thesis I wanted to look into the process of creating backends in a way that makes them easier to deploy. I didn't want to bite off more than I could chew, so I purposefully didn't seek an automated deployment pipeline. I knew better than that. But the real friends are the ones we made along the way, so buckle up ;)

I had previously written a simple restaurant menu parser for my Nokia e51 in PHP. When the page was requested, the server fetched the menus from preset restaurants and rendered a slimmed-down HTML-only page with just the menu contents. This was very advantageous compared to opening the sites of the restaurants manually, waiting for needless data transfer over a slow mobile connection, and then waiting again for the slow rendering of complex content and scripts on the low-power mobile device.

Anyway. When the parser was in need of an upgrade, I ported it to this new thing called .NET Core and hosted it on a now-retired iteration of usva. Then at one point I also dockerized it and made it run on Azure. I did this because it was the only application running on usva, and afterwards I could do other things with the server (for example turn it off :p). But anyway. That was the first step on the journey toward improved hosting.

Several years passed before the thesis. I needed an application to use as its base, and after making a list of possibilities, the menu parser was kinda the only reasonable choice. The alternatives would have been far too work-intensive in their implementation alone. I needed a proven concept so that I could focus on just researching deployment and hosting (and common design things). But of course I couldn't not upgrade the application, so for the thesis I took the menu parser logic of the old system and wrote a whole new site, including a history database for the foods and even some user-specific stuff as an example. I also added an admin API and some other features, such as centralized structured logging.

While the academic goal was to present some design guidelines for containerized applications, the practical goal was to create the basic building blocks that could later be used for more involved DevOps concepts, namely build and deployment automation. At the start I was not sure how these would end up looking, or even whether I'd ever get to see them.

But ultimately I ended up with something I can be quite satisfied with! I could use the Dockerfile to automate building the application, and then use Docker Compose to define the runtime: namely the exposed ports, the configuration and the whole set of services (application, updater-parser, database and log storage). What still remains to be improved is how the database is handled, as some manual steps remained for database migrations. But everything else was automated quite far.
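To give a rough idea, a Compose file for a setup like this could look something like the sketch below. The four services mirror the ones listed above, but every image name, port and variable here is a made-up placeholder rather than the actual thesis configuration.

```yaml
# A minimal sketch, not the actual thesis setup; all names are placeholders.
version: "3"
services:
  app:                          # the web application itself
    image: menuparser/app
    ports:
      - "8080:80"               # the only port exposed to the outside
    environment:
      - DB_HOST=db              # hypothetical configuration variable
    depends_on:
      - db
  updater:                      # the updater-parser fetching the menus
    image: menuparser/updater
    depends_on:
      - db
  db:                           # history database for the foods
    image: postgres:12
    volumes:
      - db-data:/var/lib/postgresql/data
  logs:                         # centralized structured log storage
    image: grafana/loki         # stand-in; the actual log storage may differ
volumes:
  db-data:
```

With a file like this, a single `docker-compose up -d` brings up (or updates) the whole set of services in one go.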

The goal was to simplify hosting and deployment, and it really did get a lot simpler. Just one command to build the app, another to copy it over to the target server, and one final command to update the running instances. But what is even more important is that it is the same three commands no matter the app. This opens the way to efficiently building automation in the future by replacing just those commands. But it also makes the manual work easier.
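The post doesn't spell the commands out, but assuming a plain Docker + Compose setup they could look something like this (the image, server and directory names are placeholders):

```sh
# Illustrative only; image, server and directory names are placeholders.
docker build -t menuparser .                               # 1: build the app
docker save menuparser | ssh myserver docker load          # 2: copy the image to the target server
ssh myserver "cd /srv/menuparser && docker-compose up -d"  # 3: update the running instances
```

Swap any one of those three lines for a CI job and you have the beginnings of the automation mentioned above.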

* * *

The work done on the thesis is just a part of a larger continuum of backend development. While the thesis (and graduation in general) is a great step, the work never ends. And I'm already busy with the "next" steps - as indicated by the surprising number of blog posts as of late. There is always so much more work to be done to get better. This same train of thought extends to my other hobbies, too. In the shadow of the continuum, there just doesn't seem to be any time to feel the pride and accomplishment of the past.

At least alerting works...

I've also done some work in setting up infrastructure monitoring. There's still work to do - and do better - but at least I have one alert. And it works.

I think I was just about to do something more leisurely when Grafana sent me a Telegram message that there was something wrong with NATS response times. I open the link and see that there is no data, meaning that the instance is probably down. But there's no other data either. Fuck. Everything going up in flames this soon?

But wait a sec. There is data, momentarily. Then it disappears again. I bring up the logs for InfluxDB and see the error "panic: keys must be added in sorted order". I spend quite a while trying to figure out what exactly is wrong and how to proceed, almost giving up. It seems that a lot of the tooling for fixing and managing the files has been removed or made internal-only. But then I find an up-to-date guide for rebuilding the index and decide to try it.

Because my installation is dockerized, and there seemed to be some issues with the rebuild command, I had to chown the data directory to "some user", then run the repair command, and then chown the files back. And yay! It works again. For reference, the docker command I used:

```
docker run --rm --user 1000 -v /path/to/influxdb-data/:/data influxdb:1.7.9 influx_inspect buildtsi -v -datadir /data/data -waldir /data/wal
```

At least it works again. But just as I thought everything was going nicely... Maybe the problem is the server itself? It served as my desktop earlier, but I moved away from it due to constant crashes with GTA V, and much rarer crashes at other times. Maybe I have to invest in some proper hardware :o We'll see. Maybe it'll work without issue for a long time. Pls :s

Testing the stack with CircuitPython


Like I stated earlier, one of the reasons why I've lately put so much effort into improving my stack is the fact that I backed the Meadow F7 a while back. I kinda want to maximize my productivity with it, so I've been doing what I can beforehand (and while the mood lasts).
I already talked about how this work included setting up Grafana and other data collection facilities. I also 'teased' about using NATS for registering some long-running ad-hoc jobs. I've been calling the NATS-based thing Sumu. More about it later. But anyway, these were now put to a somewhat unexpected test when I ordered a bunch of preparatory stuff from Adafruit and got a Circuit Playground Express as a freebie.
I'm trying to keep this post short, so I'll just state that getting some samples running with it couldn't really be any simpler. The basic documentation is very thorough, there's a bunch of sample code available, and a bunch of sensors are included on the board itself. As kind of an embedded Hello World, I moved on to the built-in temperature sensor after blinking an LED and playing some drum samples.
The code on the device reads the temperature once a second and prints the averaged value every 10 seconds over the serial console. On the PC I have a LinqPad script registering itself to Sumu (with a health check), reading the serial console, and pushing data to Postgres while also serving the current (or next) temperature value via Sumu to browsers (via Node-Red). There's even error handling and retrying in case the serial console gets disconnected for some reason (for example if the device is unplugged). All this in 180 lines (with the majority of it being serial console stuff :p) and maybe an hour or two! Feeling good about things!
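For flavor, here's roughly what the device side could look like. This is an illustrative sketch rather than my exact code, assuming the adafruit_circuitplayground library that ships with the Circuit Playground Express:

```python
# Sketch of the device-side loop; not the exact code from the post.
import time
from adafruit_circuitplayground.express import cpx

samples = []
while True:
    samples.append(cpx.temperature)  # on-board sensor, degrees Celsius
    if len(samples) >= 10:
        # print() goes to the USB serial console, where the PC script picks it up
        print(sum(samples) / len(samples))
        samples = []
    time.sleep(1)                    # sample once a second
```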

Let's hope that this is just the beginning, and not the peak :s