I love scripting.  Back in the '90s, after my brother taught me HTML 4, I took my knowledge further and started programming in PHP, but since leaving that field to become a Linux systems administrator, my coding has shifted toward Python.  Still, I rarely publish scripts of my own, and the reason for that is maintenance.

Now, sometimes, depending on the complexity of your script, you can get away without updating it for several years.  My original backup script was in use for over five years without modification, and even when I did modify it, it wasn't because it had broken, but because I wanted to reorganize how the files were saved offsite.  However, the same cannot be said for my weather script.

Since September 2018 I have been recording the temperatures for Ottawa and Montreal using the OpenWeatherMap API.  The script recorded a few interesting data points:

  • Date and time (as an EPOCH timestamp)
  • Name of the city
  • Country of the city
  • Latitude of the city
  • Longitude of the city
  • Sunrise for the date
  • Sunset for the date
  • Humidity at the time
  • Temperature at the time
  • Minimum temperature for the date
  • Maximum temperature for the date
  • Air pressure at the time
  • Cloud conditions at the time
  • Wind speed at the time

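For concreteness, here is a hypothetical table layout for those fields, sketched with Python's sqlite3 so it can be exercised in memory.  The real database is MySQL, and the column names and types here are my own guesses, not taken from the post:

```python
import sqlite3

# Hypothetical schema covering the recorded fields; the actual column
# names and types in the author's MySQL database are not shown in the post.
SCHEMA = """
CREATE TABLE weather (
    recorded_at  INTEGER NOT NULL,  -- EPOCH timestamp of the reading
    city         TEXT NOT NULL,
    country      TEXT NOT NULL,
    latitude     REAL,
    longitude    REAL,
    sunrise      INTEGER,           -- EPOCH timestamp
    sunset       INTEGER,           -- EPOCH timestamp
    humidity     INTEGER,           -- percent
    temperature  REAL,              -- temperature at the time
    temp_min     REAL,              -- minimum for the date
    temp_max     REAL,              -- maximum for the date
    pressure     INTEGER,           -- air pressure, hPa
    clouds       INTEGER,           -- cloud cover, percent
    wind_speed   REAL
)
"""

conn = sqlite3.connect(":memory:")
conn.execute(SCHEMA)
cols = [row[1] for row in conn.execute("PRAGMA table_info(weather)")]
```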
The script ran for four years, and then it suddenly stopped.

Now, I did not know this at the time, because I had never set up any verification or email alerts for when something went wrong.  It was a simple enough script: make an API call, read the JSON values, and put them into the database.  Nothing complex.  But there were some other issues at play, which I did not discover until I fixed the script and realized it still would not run.
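The script itself is not shown in the post, but the call-read-insert step it describes might look something like this minimal sketch, assuming OpenWeatherMap's standard current-weather JSON layout; the URL template and helper names are my own, not the original code:

```python
import json
import urllib.request

# Endpoint template per OpenWeatherMap's current-weather API; the real
# script's exact request parameters are an assumption.
API_URL = ("https://api.openweathermap.org/data/2.5/weather"
           "?q={city}&appid={key}&units=metric")

def row_from_payload(data):
    """Flatten an OpenWeatherMap current-weather JSON payload into the
    tuple of fields the script records, in the order listed above."""
    return (
        data["dt"],                    # EPOCH timestamp
        data["name"],                  # city name
        data["sys"]["country"],
        data["coord"]["lat"],
        data["coord"]["lon"],
        data["sys"]["sunrise"],
        data["sys"]["sunset"],
        data["main"]["humidity"],
        data["main"]["temp"],
        data["main"]["temp_min"],
        data["main"]["temp_max"],
        data["main"]["pressure"],
        data["clouds"]["all"],         # cloud cover, percent
        data["wind"]["speed"],
    )

def fetch(city, key):
    """Make the API call and parse the JSON response."""
    with urllib.request.urlopen(API_URL.format(city=city, key=key)) as resp:
        return json.loads(resp.read())
```

In the real script, the tuple from `row_from_payload` would then be handed to a `mysql.connector` INSERT.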

So let's start with the basics: what started it all?  How did I come across the issue in the first place?  After all, the article is titled "The Value of Checking on Your Custom Scripts Often", which of course implies I don't, and really… I don't.  So how did I know it wasn't working?  It was all by chance, and I mean lucky chance.

Every time there is an update to WordPress, I get a notification stating my websites have been updated.  One of these sites is no longer maintained as a WordPress site, but it was still active, so I uninstalled WP from the domain, and, as is always good practice, deleted the associated database as well.  While looking through my account databases, I realized I had far too many: I only have a handful or so of active sites, but almost 20 databases.  So I decided to do some spring cleaning.  While cleaning up dead DBs I came across my weather DB, remembered I had a script running in the background collecting data, and decided to check when the last entry was.  That is when I found out it had not recorded anything for over two years.

This hit me hard.  This year we have had a fairly warm February compared to most, and with five years of data I could have compared this February to the last few and seen how strange it has been.  But with a two-year gap, and no temperatures for this month, how could I do that?  I was stunned and disappointed that I had not verified my script more often.

So the first thing I did once I realized the gap was test my script.  I ran it with python3 (as I was sure I had coded it for Python 3 and not Python 2) and got an error back that urllib2 could not be found.  After some research, I found it could not be installed because it does not exist in Python 3; its functionality was folded into the standard library's urllib, with the third-party urllib3 as another option.  I converted my script to use urllib and reran it, at which point it told me mysql.connector was not installed.  After installing this package and testing once more, it worked.  But this got me thinking: why were these packages not installed?  Well, this was 100% human error and my fault.  You see, in March 2022, when my script stopped working, I moved from VestaCP to HestiaCP and from CentOS to Ubuntu.  This of course made a lot of sense: I had forgotten to install the libraries when I moved everything over.
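For anyone facing the same conversion: in Python 3, urllib2's `urlopen` lives in `urllib.request`, and for a simple GET the change is nearly mechanical.  A sketch (the original script's error handling is not shown, and the helper name is mine):

```python
# The Python 2 version would have looked like:
#     import urllib2
#     response = urllib2.urlopen(request_url)
#
# In Python 3, the same call comes from urllib.request:
import urllib.request

def get_json_text(url):
    """Fetch a URL and return the response body as text."""
    with urllib.request.urlopen(url) as response:
        return response.read().decode("utf-8")
```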

So I tested my script after switching to urllib and installing mysql.connector, and everything seemed fine, but there was still the issue of the two years of missing data.  I was still bummed.  So I did a quick search online for historical data, to see if I could find a source with the same info my bihourly (every two hours, not every half hour) query collects.  Lo and behold, OpenWeatherMap has a historical data branch.  Not free, mind you, but not too expensive: I was able to get the two years of data for $20 USD ($10 per location).  The historical data is missing the sunrise and sunset for each day, but to be honest, my interest has always been in the temperature, not the sun's position.

As an afterthought, something I might look into in the future is broadening the scope of my data.  As I said, I started recording the data in 2018, but found that I could request over 40 years of historical data.  So I could pretty much backfill 40 years of data for another $20 USD and keep it updated with my script.  And since I do not care about sunrise and sunset, I could remove them from my script altogether and not record them.  I think I might tackle this in a few weeks and start on the PHP page to display it…

Anyway, back on track.  After ordering the JSON for the missing data, I got ChatGPT to help me generate a simple insert script to add all the missing rows.  I tested it against a dummy DB with the same structure, so I could confidently add the rows to the real database once I had confirmed the script did what I wanted.  I then ran my original script to pull the latest data and make sure everything was working again.
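The generated insert script is not reproduced in the post, but the core of such a backfill is a bulk insert.  A sketch, tested here against SQLite for convenience (the real database is MySQL via mysql.connector, where the placeholder is `%s` instead of `?`); the export is assumed to be a JSON array of records, and the `"dt"`/`"city"`/`"temp"` keys and table name are my own guesses:

```python
import json

def backfill(conn, json_path):
    """Bulk-insert a historical JSON export into the weather table.

    Assumes each record carries "dt", "city", and "temp" keys; the
    actual field names in OpenWeatherMap's export may differ.
    """
    with open(json_path) as fh:
        records = json.load(fh)
    rows = [(r["dt"], r["city"], r["temp"]) for r in records]
    conn.executemany(
        "INSERT INTO weather (recorded_at, city, temperature)"
        " VALUES (?, ?, ?)",
        rows,
    )
    conn.commit()
    return len(rows)  # number of rows inserted
```

Testing against a dummy database with the same structure, as the post describes, is exactly what makes a one-shot script like this safe to point at the real data.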

So, in the end, I learned the importance of verifying, on a regular basis, that everything is working as it should.  I could have saved myself some trouble, and $20 USD, had I simply checked on my custom script regularly.
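That check does not have to be manual.  A sketch of the kind of freshness test that would have caught the gap within hours, using hypothetical table and column names (`weather`, `recorded_at`) and SQLite for demonstration; in practice this would run from cron against the MySQL database and send an email instead of returning a boolean:

```python
import time

# The script inserts bihourly, so anything older than a few intervals
# means the collector has stopped.  The threshold is my own choice.
STALE_AFTER = 6 * 3600  # seconds

def last_entry_age(conn, now=None):
    """Seconds since the newest entry, or None if the table is empty."""
    now = time.time() if now is None else now
    (latest,) = conn.execute("SELECT MAX(recorded_at) FROM weather").fetchone()
    return None if latest is None else now - latest

def is_stale(conn, now=None):
    """True when the collector appears to have stopped recording."""
    age = last_entry_age(conn, now)
    return age is None or age > STALE_AFTER
```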