Previous blogs

#1 - Introduction, the plan and materials preparation

In this blog post we describe the project's main concept and its initial high-level design.

#2 - Materials and casing assembly

Our journey with selecting proper materials, cutting and assembling the drawer and its casing!

#3 - Modelling, cutting, planting!

The title says it all: this part describes some modelling we did before cutting some holes in the drawer and finally planting some vegetables!

#4 - Plants, harvests, and fertilisers

First harvest, problems with plants and the fertilisers we used. All that went off-script for our plants in the artificial environment.

#5 - Piping and pumping - water and liquid fertiliser delivery system

Design and development of the water supply system.

#6 - Mix of fixes - various fixes to water supply, lighting and hardware

Polish and fixes of various components of the system, all the small things.

#7 - Ride the lightning - Wiring diagram, pinout and components discussion

Diagram of the circuit, pinout for most components, wire choices and their connections.

#8 - Pulling the strings

LED and fan control circuit, overview of BJTs and MOSFETs with a handful of useful general information.

#9 - 3D printing

Detailed description of 3D-printed components, their presentation and some tips.

Next posts

Envidrawer 11 - Summary

Last post before the end of the challenge!

 


Introduction

 

Hey all!

Time to write about the software that powers the Envidrawer. Since this project is meant to scale up and thrive after this competition, we decided to focus on scalability and a proper architecture. As some of you with a software background already know, building a proper architecture takes most of the time, and writing the actual logic takes much less (and, if done properly, so does debugging!).

 

I will include the most important code snippets here, and if they pique your curiosity, you can find the full code on GitHub. Links to both the Envidrawer and Envimonitor repositories are below:

 

Envidrawer GitHub

Envimonitor GitHub

 

This post assumes a moderate level of programming knowledge, including (but not limited to) OOP, processes and threads, synchronization, and the Python programming language.

Architecture overview

The good thing about writing such an architecture is that you can write it independently of most other things. The one limiting factor in this case was the choice of frameworks and our communication method. In the picture below you can see a high-level overview of the software elements of our project.

 

Envidrawer software overview

The Envidrawer script, Envimonitor and InfluxDB constitute the Envidrawer project

 

We will talk through each element of the software, elaborating where required. Feel free to ask if anything is unclear or the code is not self-explanatory (it should be, though :)).

Envidrawer

 

We chose to have a regular Python script running in an endless loop and collecting the data from sensors synchronously (with some asynchronous capabilities, of course). The Envidrawer script has one entry point, the main function, in which the endless loop runs. Communication is done mainly through the Python PubSub library, which implements a publisher-subscriber messaging scheme similar to ROS (Robot Operating System) and other multi-process systems. In this application we have only one process, with threads spawned for the Data Scraper, the Controller and short-lived callbacks.
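The real code relies on the PubSub library, but the scheme itself is easy to picture. Below is a minimal, self-contained sketch of a publisher-subscriber dispatcher in plain Python; the topic name and the callback are made up for illustration and are not taken from the Envidrawer code.

```python
from collections import defaultdict
from typing import Callable

# Minimal publisher-subscriber dispatcher illustrating the messaging
# scheme; the real script uses the PubSub library, and the topic name
# below is illustrative.
_subscribers: dict[str, list[Callable]] = defaultdict(list)

def subscribe(topic: str, callback: Callable) -> None:
    """Register a callback to be invoked for every message on `topic`."""
    _subscribers[topic].append(callback)

def publish(topic: str, **message) -> None:
    """Deliver a keyword-argument message to all subscribers of `topic`."""
    for callback in _subscribers[topic]:
        callback(**message)

# Example: a data-scraper-like listener collecting sensor readings.
readings = []
subscribe("sensor.temperature", lambda value, unit: readings.append((value, unit)))
publish("sensor.temperature", value=23.5, unit="C")
```

A publisher never needs to know who is listening, which is exactly what keeps the Scraper, the Controller and the sensors decoupled from one another.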

 

In order to maintain a clear separation from the system Python, we decided to use a virtual environment (venv for short) and install the packages only there. Because I (Jakub) have some experience writing software for embedded systems, I kept my habit of developing the code on the Host machine and flashing it to the Target for testing. If this were C code, it would require a separate toolchain for compilation, plus headers and various includes. Thankfully this is Python, which is interpreted, and the Target device is a Raspberry Pi, which is powerful enough to run it smoothly.

 

The deployment script is very simple: it just copies the required Python files over scp.
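For illustration, a deploy script of roughly that shape could look like the following; the user name, hostname and target directory are placeholders, not the actual ones from our setup.

```shell
#!/bin/sh
# Hypothetical deploy script: copy the Envidrawer sources to the target
# Raspberry Pi over scp. User, host and directory are placeholders.
TARGET_USER=pi
TARGET_HOST=raspberrypi.local
TARGET_DIR=/home/pi/envidrawer

scp ./*.py "${TARGET_USER}@${TARGET_HOST}:${TARGET_DIR}/"
```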


 

With its help I can quickly edit the code on my Host PC and transfer it to the RPi for execution. Venv helps with autocompletion, keeps the system pip unpolluted, and is just one step below having a container for Python application development, making the whole experience much faster. I cannot overstate how much I recommend it.

 

But enough of the environment setup; time to show the most important picture of this part, the class diagram. Lo and behold:

 

Class diagram of the Envidrawer

 

In it you can see that we have two main objects whose purposes are:

Scraper - subscribes to messages from sensors and groups the data, flushing it periodically when the poll method is called (when the time is right!)

Controller - subscribes to messages from sensors and actuates the DC motors when particular conditions are met
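As a rough illustration of these two roles, here is a minimal sketch; the class internals, the message format and the moisture threshold are my assumptions, not the actual Envidrawer code.

```python
# Illustrative sketch of the two main roles; the message format and the
# moisture threshold below are assumptions, not the real Envidrawer code.

class Scraper:
    """Buffers incoming sensor messages and flushes them on poll()."""

    def __init__(self):
        self._buffer = []

    def on_message(self, reading):
        self._buffer.append(reading)

    def poll(self):
        """Return the buffered readings and clear the buffer."""
        flushed, self._buffer = self._buffer, []
        return flushed


class Controller:
    """Switches a (hypothetical) pump on when the soil is too dry."""

    MOISTURE_THRESHOLD = 30.0  # percent; an illustrative value

    def __init__(self):
        self.pump_on = False

    def on_message(self, reading):
        if reading["sensor"] == "soil_moisture":
            self.pump_on = reading["value"] < self.MOISTURE_THRESHOLD


scraper, controller = Scraper(), Controller()
for reading in ({"sensor": "soil_moisture", "value": 12.0},
                {"sensor": "temperature", "value": 21.5}):
    scraper.on_message(reading)
    controller.on_message(reading)
```

Both objects react to the same stream of messages, but one only accumulates while the other only actuates, which keeps their responsibilities cleanly separated.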

 

Apart from them, each sensor or sensor group implements the ISensor interface (that is, it implements the methods the interface defines). Sensors are registered and created in the main loop for easy access, and are periodically probed for data, which is stored in the Data Scraper for later collection. Depending on the sensor, different logic is performed and different messages are submitted.
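To make the interface idea concrete, here is a hypothetical sketch of what an ISensor implementation could look like in Python; the class, the sensor group name and the fixed reading are illustrative, not the real driver code.

```python
from abc import ABC, abstractmethod


# Hypothetical ISensor interface and one implementation; the class and
# the fixed reading are illustrative, not the real driver code.
class ISensor(ABC):
    @abstractmethod
    def poll(self) -> dict:
        """Read the sensor and return a single measurement."""


class HumiditySensor(ISensor):
    def __init__(self, name: str, group: str):
        self.name = name
        self.group = group

    def poll(self) -> dict:
        # A real driver would talk to the hardware here; a fixed value
        # keeps the sketch self-contained.
        return {"sensor": self.name, "group": self.group, "value": 55.0}


sensors = [HumiditySensor("humidity", "enviro_hat")]
readings = [s.poll() for s in sensors]
```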

 

 

Most sensors perform their data communication synchronously in the poll method.

 

To ease the implementation, the data structure used for this message passing should be consistent and general. Hence the SensorData structure is used throughout the Envidrawer application for communication.

 

 

 

In a low-level language it would have to be carefully structured, with enumerations or bitflags to use memory more efficiently (it has to be copied around rather than moved in most cases...), but this is Python, so we can get away with strings and other heavy data structures. Most fields are self-explanatory; the sensor group field was added for grouping sensors on the Pimoroni HATs, and the time field for accurate time-stamped logging of the data.
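As a sketch of what such a structure could look like (the exact field names and types in the real code may differ), a dataclass along these lines would do:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


# A SensorData-like structure; the exact field names and types in the
# real Envidrawer code may differ.
@dataclass
class SensorData:
    sensor: str        # e.g. "temperature"
    value: float
    unit: str          # a heavy-but-convenient string, as noted above
    sensor_group: str  # added for Pimoroni HAT sensor grouping
    time: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )


reading = SensorData("temperature", 21.5, "C", "enviro_hat")
```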

 

With such a handy data structure, saving the sensor values to the DB is a cinch. I will later show how InfluxDB, being a time-series DB, stores this data. For now, rest assured that this preparation is not in vain.
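To give a feel for what ends up in the database, here is a hypothetical helper that formats a reading as an InfluxDB line-protocol record; the measurement and tag names are illustrative, and the real code uses the official client library rather than hand-rolling strings like this.

```python
# Hypothetical formatting of a reading as an InfluxDB line-protocol
# record ("measurement,tags fields timestamp"); measurement and tag
# names are illustrative, and the real code uses the official client
# library instead of building strings by hand.
def to_line_protocol(sensor: str, group: str, value: float, ns_time: int) -> str:
    return f"{sensor},sensor_group={group} value={value} {ns_time}"


line = to_line_protocol("temperature", "enviro_hat", 21.5, 1650000000000000000)
```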

 

 

The remaining classes in the preceding diagram are storage classes. I decided to once again make use of an interface and implemented both FileStorage and DBStorage. These classes handle serialization (for FileStorage) and saving to the DB: the data is formatted properly and flushed.
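A minimal sketch of this storage interface, assuming a JSON-lines format for FileStorage (the actual serialization format and method names may differ), could look like:

```python
import io
import json
from abc import ABC, abstractmethod


# Sketch of the storage interface; the serialization format and method
# names are assumptions, not the actual Envidrawer code.
class IStorage(ABC):
    @abstractmethod
    def flush(self, readings: list) -> None:
        """Persist a batch of readings."""


class FileStorage(IStorage):
    """Serializes readings as JSON lines into a text stream."""

    def __init__(self, stream):
        self._stream = stream

    def flush(self, readings: list) -> None:
        for reading in readings:
            self._stream.write(json.dumps(reading) + "\n")


buffer = io.StringIO()  # stands in for an open file
FileStorage(buffer).flush([{"sensor": "temperature", "value": 21.5}])
```

Swapping FileStorage for a DBStorage behind the same interface is what lets the rest of the application ignore where the data actually goes.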

InfluxDB

This is yet another major component of our application. Even though it might seem like overkill for a single reporting application instance, the database is vital. We went for a time-series DB instead of a relational one because of the scalability possibilities and simply to learn something new.

 

Unfortunately, on the default 32-bit Raspbian the only supported version of InfluxDB is 1.8, which has the old API (besides, why not go for the bleeding edge :), said every Arch Linux user). That is why we decided to go for the 64-bit version of Raspbian (not wanting to risk installing Arch Linux ARM or any other distro at the last minute). The installation went smoothly and we were able to install the newest InfluxDB binary from the official repositories; we only had to specify influxdb2 instead of influxdb. It is also available for download from the official InfluxDB website, or it can be built from source (but that takes sooo long on the RPi that I do not recommend it).

 

All the user has to do is create a bucket to which the data will be written and from which it will be read, generate a token (later used in the application) and leave the daemon running, or, even better, register it as a system unit with systemd's systemctl commands (beware of root privileges; I spent some time with InfluxDB's meaningless logs about that). And it is as simple as that: just read and write the data with an authorized application. InfluxDB 2.0 has its own query language, Flux, whose syntax is quite unlike regular SQL (which is a major breaking change between InfluxDB 1.8 and 2.0), but it is quite easy to learn, so don't frown at it.
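For a taste of how different Flux is from SQL, a query along these lines (the bucket and measurement names are illustrative, not our actual ones) computes the mean temperature over the last hour:

```
from(bucket: "envidrawer")
  |> range(start: -1h)
  |> filter(fn: (r) => r._measurement == "temperature")
  |> mean()
```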

Envimonitor

This brings us to the visualization and (probably, some time in the future) control application of the Envidrawer. Because most of this project was meant to teach us something, I decided to teach myself some web app development with Python. One of the obvious choices for this was Flask, a popular and mature Python web framework. Moreover, I needed some sensible and responsive visualization, and that is where Dash with Plotly comes in handy. Of course I could have plotted everything in matplotlib and been happy with it, but having some experience with that tool I wanted to try something more modern. And to my joy, I found working with Dash quite pleasant.

 

Because I wanted to do things the 'right' way, I decided to go for a proper integration of Dash with Flask, and these tutorials helped me greatly. Not only did I come to understand proper application structure, but I also saved myself some head-scratching from trying to do it solely on my own. I always recommend doing as much research on the topic as possible. Time spent on preparation is never time wasted! (Very true of proper architecture preparation, too.)

 

These tutorials go into great depth on the topic; check them out!

 

I will not go into details about the frontend side of this application, focusing instead on the backend and the connections between all components. The application is straightforward at the moment, displaying several sensor measurements over time. It can be easily expanded thanks to the architecture we have chosen, and because the Dash core is separated from the main Flask application, we can add a camera feed, user inputs and whatever other functionality we desire.

Free topics

Due to the lack of analog pins on the Raspberry Pi (such a pity) and no multi-channel ADCs on hand, we decided to make use of an Arduino Nano Micro as our interface between the sensors and the application. It can be seen in the UML diagram above as the ArduinoSerialInterface class. There we simply pass the messages in ASCII format, pack them into our SensorData objects and then serialize them to the DB.
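Assuming a simple name:value ASCII format (a guess on my part, not the documented protocol), the parsing side of such an interface could be sketched as:

```python
# Parsing the ASCII messages from the serial interface, assuming a
# simple "name:value" line format (a guess, not the actual protocol).
def parse_serial_line(line: str) -> dict:
    name, raw_value = line.strip().split(":", 1)
    return {"sensor": name, "value": float(raw_value)}


reading = parse_serial_line("soil_moisture:42.7\n")
```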

 

Moreover, we connected some sensors and made several tidy cables, reusing old wires we found lying around and asking for a second life. Envidrawer: more benefit every day!

 

Batch of sensors

Humidity, temperature and luminosity sensors next to a capacitance sensor and a soil moisture sensor

Batch of sensors v2

A lonely humidity and temperature sensor

Cables reused

Cables reused and nicely sleeved with heat-shrink tubes

Summary

As you can see, there is plenty to discuss and describe. I must admit that working with the software was really pleasant once the skeleton of the architecture was done. No need for corner cases and hacks, and everything is nicely decoupled. I cannot stress enough how important that is! But better yet, try it yourselves and bask in the glory of clean code (just kidding, it is never THAT clean).

 

But jokes aside, if I were writing this code in C it would most probably be less clear and would need to rely on some ugly defines and macros. I am not saying C is bad, but it is much less pleasant to work with when it comes to abstractions and application writing. Of course it is very fast and expressive (which is terrific for systems programming and drivers). I would very much like to write the software for the next such project in Rust, as it has recently enamoured me.

 

Wrapping up, writing Python code is a pleasant experience, given that the libraries are usually already written and easy to interface with; all that remains is to create a proper structure and connect all the elements meticulously so that nothing explodes (really, it can sometimes happen). Of course minor issues arose, but we were able to circumvent them quite smoothly.

 

 

Previous post: #9 - 3D printing
Next post: Final Presentation