Many developers vastly prefer writing code for Raspberry Pis over writing code for microcontrollers. The operating system on the Pi affords programming luxuries that aren’t available in the constrained, low-level environment of microcontroller development. But with all the talk about how developing for the Pi is just like writing code hosted on, say, Heroku, it’s easy to forget how much of the work on a Raspberry Pi is actually sysadmin-ing. Once you are honest with yourself about this being a significant percentage of the work you’re doing, it becomes clear that it’s time to get serious about tooling.

Writing idempotent scripts makes products more reliable

We built our tools on the principle of idempotency: the idea that an operation produces the same result whether it is executed once or many times. As an example, one thing you might do in a script is append a line to the end of a file.

echo "add this line" >> file

If you don’t check for the presence of that line, you will add another copy every time you run the script. Better is:

if ! grep -q "add this line" file; then
  echo "add this line" >> file
fi

In the process of developing you’ll often run the same scripts several times over. Sometimes you’ll be refining your tools and changing the script. Other times you may be scaling up, putting code on a number of different instances of the product. You may be great at administering your products, but there will always be a moment when you wonder whether you already ran a script on the current device. Instead of wondering and worrying, it’s much nicer to know that your script will behave itself regardless.
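The check-then-append pattern above can be wrapped in a small helper so every such step in a script stays idempotent. A minimal sketch (the function name is ours):

```shell
# append_once FILE LINE: append LINE to FILE only if FILE
# doesn't already contain it as a whole line.
append_once() {
  local file="$1" line="$2"
  # -x matches whole lines, -F treats the pattern as a fixed string.
  grep -qxF -- "$line" "$file" 2>/dev/null || echo "$line" >> "$file"
}
```

Running `append_once config.txt "dtparam=spi=on"` twice leaves exactly one copy of the line in the file.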

One task we automated was clearing the console on boot and shutdown of the device, which required changing a fair number of system files. If this process were done manually it wouldn't have been as reliable or as quick. We might miss a step! Scripting this process removed the chance for human error.
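The full script is linked below; as an illustration of the idempotent pattern applied to a system file, a step like adding a kernel parameter to the single-line /boot/cmdline.txt might look like this (the parameter shown is illustrative):

```shell
# add_kernel_param FILE PARAM: add PARAM to the single-line
# kernel command line in FILE, unless it is already present.
add_kernel_param() {
  # -w matches whole words, -F treats the pattern as a fixed string.
  if ! grep -qwF -- "$2" "$1"; then
    sed -i "s/\$/ $2/" "$1"
  fi
}

# e.g. suppress most boot messages (run as root on the device):
# add_kernel_param /boot/cmdline.txt quiet
```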

Get the script here (developed for Raspbian Jessie).

Tooling makes products more scalable and improves project documentation

Sysadmin work usually entails creating files, changing system files, installing tools and starting processes. To ensure that a sequence of steps we went through on one device was repeated in exactly the same way on other devices, we created several scripts. For example, we wrote install scripts to create systemd services, set hostnames and sync devices, making our process easier to scale.

This is our script for setting hostnames based on the device's MAC address:

When processes such as these aren't scripted, the best case is that they end up as a how-to in the project's README, with a long sequence of steps to follow. Your project's documentation will be significantly more readable when such details are scripted and only the script's high-level function is described in your documentation.

Tooling can reduce development time

A lot of the code you write can be simulated on your development machine. However, in an environment that uses sensors and outputs to, say, a screen without HDMI connectivity, you’ll eventually need to code and test directly on your device. A lot of time can be wasted on manually syncing, forgetting to sync, and then wondering why your code isn’t working. One way we greatly improved our development process on the Mac was with a combination of rsync and watchman. Rsync comes pre-installed on your Mac; watchman can be installed via Homebrew.

$ brew update
$ brew install watchman

The following script syncs our local directory with a remote device. This code assumes you haven’t renamed the default pi user.

This code loads a local .env environment file to determine the hostname, changes directory to the one containing the script, and then runs rsync in the current shell process. Rsync copies the current folder to the remote project path. The flags tell it to:

  • -r run recursively
  • -l preserve all symlinks
  • -p preserve permissions and
  • -t preserve modification timestamps

It excludes the byte-code files generated by Python and the local virtual environment from the sync. When files are deleted locally, rsync makes sure they are also deleted remotely thanks to the --delete flag.

The following code is a wrapper around a couple of watchman commands to clear any current watches, and to watch the directory of the project, telling it to trigger our sync script on file changes:
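A sketch of such a wrapper, assuming the sync script lives at ./sync.sh and using an illustrative trigger name:

```shell
#!/usr/bin/env bash
set -eo pipefail

# Clear any existing watches, watch the project directory, and run
# the sync script whenever a file in it changes. The trigger name
# "sync-on-change" and the ./sync.sh path are illustrative.
watch_and_sync() {
  local dir="$1" script="$2"
  watchman watch-del-all
  watchman watch "$dir"
  watchman -- trigger "$dir" sync-on-change '*' -- "$script"
}

# Only start watching when watchman is actually installed.
if command -v watchman >/dev/null 2>&1; then
  watch_and_sync "$(pwd)" "$(pwd)/sync.sh"
fi
```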

The line set -eo pipefail tells bash to exit as soon as any command fails (-e) and to treat a pipeline as failed if any command in it fails (-o pipefail).

With the sync process running automatically in the background developing on our local machines felt like developing directly on the device, without any of the complications. It also meant that the flow of solving the problem wasn’t ever interrupted by remembering or forgetting to sync the files. This small bit of code really turned out to be a massive timesaver.

Generally speaking, tooling is a bit of work upfront but well worth your while. If you’re creating products for the Raspberry Pi, you can’t afford to go without it.

If you want to read more about IoT product development on Raspberry Pi, I wrote another article called I love systemd.

With many thanks to Dan Brown and Eric George.
