Home Sensor Network Part 4 - Setting Up An MQTT Server

May 16, 2023 · 1088 words · 6 minute read

Now that we have a server up and running we can start using it as a collection point for readings from sensors around the home. To do that we're going to install an MQTT broker, plus a script that will subscribe to the broker and write the incoming data into a database.

We don't have any sensors yet, but before we worry about that (in the next thrilling instalment) we need somewhere to send that data, and store it. That's what we're setting up now.

Mosquitto MQTT Broker

First we need the MQTT broker itself. For this we'll use Mosquitto, the simple, go-to choice, which is also present in the Raspberry Pi repositories, so it can be installed with nothing more complicated than an apt-get. Depending on your Linux distribution (this is the case for Debian-based ones) you'll probably also want to install the 'mosquitto-clients' package, as it provides command line tools for testing the server. (On Manjaro, these were included in the main mosquitto package.) With them you can run commands like mosquitto_pub and mosquitto_sub to test publishing and subscribing to your running server.
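As a quick smoke test of a fresh local install, you can subscribe to a topic in one terminal and publish to it from another. The topic name and message here are arbitrary examples, not anything the rest of the setup depends on:

```shell
# Install the broker and the client tools on a Debian-based system
sudo apt-get install mosquitto mosquitto-clients

# Terminal 1: subscribe to a test topic on the local broker
mosquitto_sub -h localhost -t 'home/test'

# Terminal 2: publish a message to the same topic;
# it should appear in the subscriber's output
mosquitto_pub -h localhost -t 'home/test' -m 'hello'
```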

Before setting the configuration in /etc/mosquitto/mosquitto.conf, it's probably worth thinking a little about security.

MQTT has a few security options, but for this I'm going to stick with a username/password combination. That's reasonably easy to manage with my clients, and avoids having to do anything complicated with certificates. It does mean all our messages will be sent in plain text, unencrypted, but I think there's little harmful information in knowing how warm different rooms in our house are. That, combined with the options in the script, should prevent any accidental reading of MQTT messages not intended for this server.
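As a sketch, the relevant part of /etc/mosquitto/mosquitto.conf for this setup might look something like the following (the port is the MQTT default, and the password file path matches the one used below):

```
# Listen on the standard MQTT port, unencrypted
listener 1883

# Refuse clients that don't supply a username/password
allow_anonymous false

# File of username:hashed-password entries
password_file /etc/mosquitto/passwords.txt
```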

When adding usernames and passwords, remember to hash the password file with:

mosquitto_passwd -U passwords.txt

(In the Ansible role you have to make sure that you reset the content of the /etc/mosquitto/passwords.txt file after every run, otherwise you will end up hashing, and rehashing, the same file repeatedly, and the hash will no longer match the password that you've set.)
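Before hashing, the passwords file is just plain username:password pairs, one per line (the names here are made up):

```
sensor-livingroom:supersecret
collector:anothersecret
```

Running mosquitto_passwd -U replaces each password in the file with its hash in place, which is exactly why running it twice over the same file hashes the hash.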

Once the configuration is set, you can enable mosquitto via systemd, and it will then start every time the computer boots.
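On a systemd-based distribution that's just the following (assuming the package installed the usual mosquitto.service unit):

```shell
# Start the broker now and on every boot
sudo systemctl enable --now mosquitto

# Check that it came up
systemctl status mosquitto
```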

Saving Data to the Database

Having the MQTT broker running is all well and good, but once the data comes rolling in we want to save it somewhere. That's almost all that the Python script store-mqtt-data does.

The initial version only has a few tables, in addition to the tables created by the previous get-dwd-weather-data script. These could be stored in the same database or a different one, as there is no overlap. In my case I'm keeping everything in the same database, so there's only one to back up. It might make sense to split it in two should performance suffer from trying to read or write too many things at the same time; but by then I'd probably have more general performance issues with my lowly Raspberry Pi anyway.

The initial data I'll be keeping is the temperature and humidity from all the rooms where there is a sensor, plus the readings from the gas meter. Anything else will require an update to this script later, and I don't want to bog myself down with adding lots of features at this point that I don't need (yet?).

Additionally there's a table called 'stations' that will map the various sensors to where they are placed. There were two options for doing this:

  1. Make the sensors location specific: give them a name, or MQTT topic to publish to, that's specific to where they are placed, or
  2. Make the sensors generic, and then map the sensor name to where it's placed.

I went for the second option, which I thought would be easier to manage. It means all the sensors can be identical (no sensor-specific code), and they can be moved between locations by simply updating an entry in this database, as opposed to firing up the Arduino IDE and uploading a customised version of the software.
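As a minimal sketch of that second option, using sqlite3 from the standard library (the table and column names here are my guesses, not necessarily those of the actual script):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Map a generic sensor name to the location it is currently placed in
conn.execute("CREATE TABLE stations (sensor TEXT PRIMARY KEY, location TEXT)")
conn.execute("INSERT INTO stations VALUES ('sensor-01', 'living room')")

# "Moving" a sensor is just an UPDATE, no firmware reflash needed
conn.execute("UPDATE stations SET location = 'bedroom' WHERE sensor = 'sensor-01'")

location = conn.execute(
    "SELECT location FROM stations WHERE sensor = 'sensor-01'"
).fetchone()[0]
print(location)  # bedroom
```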

One of the main things I'll want from the data is the latest value at each location, to get a 'what are the temperatures now?' view. Querying all the tables, filtering for all the active locations, will become quite a large ask of the Pi once the data starts to grow. That's why there is an additional 'lastUpdates' table, where the most recent value from each station is stored.

This table also means we can easily compare the current value to the previous one, and then only write a value into the main temperature and humidity tables when it has changed. In a house with a central heating system that should be trying to keep a steady temperature (at least when it's heating in the winter), this should cut down the number of values we store considerably, as for extended periods the temperature won't change. That saves disk space (less of a worry) and query time (more of a worry on the resource-limited Pi).
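The write-on-change logic can be sketched like this, again with sqlite3 and invented table names; the real script presumably does something similar per reading type:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE temperatures (station TEXT, value REAL, ts TEXT)")
conn.execute("CREATE TABLE lastUpdates (station TEXT PRIMARY KEY, value REAL, ts TEXT)")

def store_reading(station, value, ts):
    """Write to the main table only when the value changed; always refresh lastUpdates."""
    row = conn.execute(
        "SELECT value FROM lastUpdates WHERE station = ?", (station,)
    ).fetchone()
    if row is None or row[0] != value:
        conn.execute("INSERT INTO temperatures VALUES (?, ?, ?)", (station, value, ts))
    conn.execute(
        "INSERT INTO lastUpdates VALUES (?, ?, ?) "
        "ON CONFLICT(station) DO UPDATE SET value = excluded.value, ts = excluded.ts",
        (station, value, ts),
    )

# A steady temperature only produces one row in the main table
for ts, value in [("10:00", 21.5), ("10:05", 21.5), ("10:10", 21.7)]:
    store_reading("living room", value, ts)

rows = conn.execute("SELECT COUNT(*) FROM temperatures").fetchone()[0]
print(rows)  # 2
```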

The installation process for this script is similar to that of the previous weather data script: clone the script into a folder, create a dedicated Python virtual environment, and then run it.

One addition is that we want this script to run constantly, so it can subscribe to any message that comes into the broker, which means we can't just trigger it with a cron job. For this there's a systemd service file that can be installed; replace all the double curly bracket placeholders with the appropriate values. This is the first time I've had to create a service file, but it wasn't as hard as I'd feared. It means the script will be restarted after every reboot (and other interruption), and ensures that it doesn't start before the Mosquitto broker it depends on, which would cause confusion.
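A sketch of what such a unit file can look like, using the same double-curly-bracket placeholder style; the exact placeholder names, paths, and script filename here are mine, and the real role's will differ:

```
[Unit]
Description=Store incoming MQTT sensor data
# Don't start before the broker we subscribe to
After=mosquitto.service
Wants=mosquitto.service

[Service]
User={{ service_user }}
WorkingDirectory={{ install_dir }}
# -u keeps Python's output unbuffered so log lines reach the journal
ExecStart={{ install_dir }}/venv/bin/python -u store-mqtt-data.py
Restart=always

[Install]
WantedBy=multi-user.target
```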

One catch is that in the service file you have to call the script via Python with the -u flag for 'unbuffered'; otherwise log messages from the script won't appear in the systemd journal, causing much confusion when you're trying to debug it.

As with all the other parts in setting up this sensor network, I've created an Ansible role that collects all the steps so you can run it automatically.


Overview: Home Sensor Overview