
Raspberry Pi Projects

19 Posts authored by: Frederick Vandenbosch

This project is about a digital picture frame aimed at family members, such as grandparents.


The idea is that parents taking pictures of their children, can easily share those pictures with the children's grandparents by making them appear on the picture frame automatically. In turn, the grandparents can "like" the pictures, letting the children's parents know which pictures are their favourites.


By making use of a specific software platform called resin.io, multiple instances of this picture frame can be deployed for various family members, without hassle.






The project makes use of different services. Here's an overview:




The picture frame offers the following features:

  • simple user interface to navigate the pictures, start a slideshow or like a picture
  • periodically download pictures from a shared Dropbox folder
  • send push notifications whenever a picture is liked
  • turn the picture frame's display off every evening, and back on every morning
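The evening/morning display toggle in the feature list could be handled by a small scheduling helper. Here is a minimal sketch, assuming the frame uses the Pi's `vcgencmd display_power` command; the on/off hours are illustrative, not the author's values:

```python
# Sketch of the evening/morning display schedule (assumption: the frame
# toggles the screen via `vcgencmd display_power`; hours are illustrative).
import subprocess
from datetime import datetime

DAY_START_HOUR = 7     # turn the display on in the morning (assumed value)
NIGHT_START_HOUR = 22  # turn it off in the evening (assumed value)

def display_should_be_on(hour):
    """Return True when the display should be powered at the given hour."""
    return DAY_START_HOUR <= hour < NIGHT_START_HOUR

def apply_schedule(now):
    state = "1" if display_should_be_on(now.hour) else "0"
    # `vcgencmd display_power 0|1` switches the Raspberry Pi display output
    subprocess.call(["vcgencmd", "display_power", state])
```

Run periodically (e.g. from cron or the main loop), this keeps the display state in sync with the time of day.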


Let's take a closer look at the software and hardware for this project, and how you can build your own connected picture frame.




The following hardware components are used in this project:


Assembly is super easy; just follow these steps:

  1. Mount the Raspberry Pi 3 to the Raspberry Pi Touchscreen
  2. Connect the jumper wires from the screen's board to the Pi for power
  3. Slide the Touchscreen assembly through the enclosure's front bezel
  4. Screw everything in place

Do not insert the microSD card or power on the frame yet, as the software needs to be set up first.





The complexity of the project is in the software. Let's break it down. resin.io makes it simple to deploy, update, and maintain code running on remote devices. It brings the web development and deployment workflow to hardware, using tools like git and Docker to let users seamlessly update all their embedded Linux devices in the wild. resin.io's ResinOS, an operating system optimised for use with Docker containers, focuses on reliability over long periods of operation and easy portability to multiple device types.

To learn more about how resin.io works, be sure to check out this page: How It Works

Sign up for a free account and go through the detailed Getting Started guide. From there, you can create your first application.


Application Creation


Setting up a project requires two things:

  • application name: ConnectedFrame
  • device type: Raspberry Pi 3




After completing both fields and creating the application, a software image can be downloaded for the devices to boot from. The useful part is that the same image can be used for every device involved in the project. Select the .zip format, which will result in a file of about 400MB, as opposed to 1.8GB for the regular .img file.


Before downloading the image, connectivity settings can be specified, allowing the device to automatically connect to the network once booted. Enter the desired SSID and matching passphrase.


Flashing SD Card


Once the image specific to the application is downloaded, it needs to be flashed to a microSD card for the Raspberry Pi to boot from.


There is a tool available for doing just that, made by the same people behind resin.io, called Etcher. It works on macOS, Linux and Windows, is simple to use and gets the job done.



Launch Etcher and select the downloaded image file. Etcher should automatically detect the SD card; all that remains is to click the "Flash" button.


The SD card is ready to be inserted in the Raspberry Pi.


Configuration & Environment Variables


Some Raspberry Pi configuration changes are typically made by editing the /boot/config.txt file. resin.io allows users to do this via the user interface, by defining Device (single device) or Application (all devices) Configuration Variables.


In config.txt, pairs of variables and values are defined as follows: variable=value


Using a Device or Application Configuration Variable, the variable becomes RESIN_HOST_CONFIG_variable and is assigned the desired value.


For example, rotating the LCD touch screen is normally done by appending lcd_rotate=2 to /boot/config.txt. As a configuration variable, this becomes RESIN_HOST_CONFIG_lcd_rotate with value 2.
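The mapping described above can be illustrated in a couple of lines of Python (the helper function is mine, not part of resin.io):

```python
# Illustration of the mapping described above: a plain config.txt entry
# becomes a RESIN_HOST_CONFIG_* configuration variable.
def to_resin_config(line):
    """Turn 'variable=value' into ('RESIN_HOST_CONFIG_variable', 'value')."""
    variable, value = line.split("=", 1)
    return ("RESIN_HOST_CONFIG_" + variable, value)

# e.g. rotating the LCD touch screen:
print(to_resin_config("lcd_rotate=2"))  # ('RESIN_HOST_CONFIG_lcd_rotate', '2')
```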



Another type of variable is the Environment Variable, which can again be defined at application or device level.




These environment variables can be used by the operating system, such as "TZ" which is used to set the correct timezone, but also by scripts.


The following environment variables are used by the connected frame Python script:

  • DISPLAY: display to use for the Tkinter user interface
  • DROPBOX_LINK: link to dropbox shared folder
  • IFTTT_KEY: personal IFTTT webhooks key to trigger notifications
  • DOWNLOAD_INTERVAL_HOURS: interval in hours to download photos from the dropbox folder
  • CAROUSEL_INTERVAL_SECONDS: interval in seconds to automatically switch to the next photo
  • FRAME_OWNER: the name of the person the frame belongs to, used to personalise the "like" notification


Most are to be set at application level, though some variables, such as FRAME_OWNER, are specific to the device.

The link to the shared Dropbox folder ends with "?dl=0" by default. This has to be changed to "?dl=1" in the environment variable to allow the application to download the pictures.
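A minimal sketch of how a script might read these variables, including the "?dl=0" to "?dl=1" fix; the defaults are assumptions, not the author's values:

```python
# Sketch of reading the frame's settings (variable names from the list
# above; default values are assumptions, not the author's).
import os

def get_settings(environ=os.environ):
    link = environ.get("DROPBOX_LINK", "")
    # Shared Dropbox links end in "?dl=0" by default; "?dl=1" forces a
    # direct download, which is what the frame needs.
    if link.endswith("?dl=0"):
        link = link[:-1] + "1"
    return {
        "display": environ.get("DISPLAY", ":0"),
        "dropbox_link": link,
        "download_interval_hours": int(environ.get("DOWNLOAD_INTERVAL_HOURS", "1")),
        "carousel_interval_seconds": int(environ.get("CAROUSEL_INTERVAL_SECONDS", "30")),
        "frame_owner": environ.get("FRAME_OWNER", ""),
    }
```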


Application Deployment


I've been developing a Python application using Tkinter to create the graphical interface for the picture frame.

The layout is simple: four interactive buttons (two on each side), with the picture centralised.
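The layout described above might be sketched in Tkinter roughly as follows; the button labels, grid positions and widget choices are assumptions, not the author's actual code:

```python
# A rough sketch of the layout: picture in the centre, two buttons per side.
# Labels and geometry are assumptions, not taken from the project code.
try:
    import tkinter as tk  # standard library, but needs a desktop install
except ImportError:
    tk = None

BUTTON_LABELS = ["Prev", "Slideshow", "Like", "Next"]  # assumed labels

def build_ui(root):
    """Place the picture in the centre column, two buttons on each side."""
    tk.Label(root, text="picture").grid(row=0, column=1, rowspan=2, sticky="nsew")
    tk.Button(root, text=BUTTON_LABELS[0]).grid(row=0, column=0)
    tk.Button(root, text=BUTTON_LABELS[1]).grid(row=1, column=0)
    tk.Button(root, text=BUTTON_LABELS[2]).grid(row=0, column=2)
    tk.Button(root, text=BUTTON_LABELS[3]).grid(row=1, column=2)

# On the frame itself: root = tk.Tk(); build_ui(root); root.mainloop()
```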


Deploying an application with resin.io requires some additional files, defining which actions to perform during deployment and which command to use to start the application. The full code and accompanying files for this project can be found on GitHub.


You can clone the repository for use in your application, reproducing the exact same project, or fork it and modify it as you desire!


git clone 
cd ConnectedFrame/


In the top right corner of your resin application dashboard, you should find a git command. Execute it in the cloned repository.


git remote add resin


Finally, push the files to your resin project:


git push resin master


If all went well, a unicorn should appear!



In case of problems, a clear error message will appear, telling you what exactly went wrong.




"IFTTT" stands for "If this, then that" and is an online platform that enables users to connect triggers and actions for a plethora of services.


For this particular project, the webhooks service is used to trigger notifications to the IFTTT app on a smartphone.



The trigger is part of the code and needs to remain as is, though the action could be modified to suit your own personal needs.




Enough with the theory, let's see the frame in action!



What do you think? Is this something you could see family members use? Let me know in the comments!

Originally posted on: Pi Zero Case One-Minute Mod – Frederick Vandenbosch


A 1-minute mod for the official Raspberry Pi Zero case, inspired by the ZeroView.



  • Raspberry Pi Zero (W) with official case
  • Raspberry Pi Camera module
  • Hobby knife
  • Two suction cups





I’ve had a Unicorn pHAT sitting in a box for a while, but I finally used it in a simple project. Because it has four rows of LEDs, I thought it would be ideal to make a binary clock from.



A script fetches the time, converts it to binary and lights up the matching pixels on the LED matrix. Because the matrix has RGB LEDs, any colour can be used.
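The conversion step can be sketched in pure Python; the mapping of time components to rows (hour, minute, second on three of the four rows) is an assumption:

```python
# Sketch of the time-to-binary conversion for a 4x8 LED matrix. The row
# assignment (hour, minute, second) is an assumption, not the actual script.
def to_bits(value, width=8):
    """Convert a value to a list of 0/1 bits, most significant bit first."""
    return [(value >> (width - 1 - i)) & 1 for i in range(width)]

def time_to_rows(hour, minute, second):
    """One row of bits per time component; a 1 means the pixel is lit."""
    return [to_bits(hour), to_bits(minute), to_bits(second)]

# On the Unicorn pHAT, each 1 bit would light the matching pixel in any
# RGB colour via the unicornhat library (set_pixel(x, y, r, g, b), then
# show()) -- that part requires the hardware and library.
```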


You can find the full post and code on my blog: Binary Clock – Frederick Vandenbosch

Need to know the distance to the sun in centimeters, check tomorrow's weather, or turn on the lights using your voice? Or perhaps you just need someone to talk to at night, when working on your projects? With Amazon's Alexa Voice Service on the Raspberry Pi Zero, this is now a reality, at a very affordable price!


Using a Raspberry Pi Zero, a USB sound card, speaker, microphone and a huge button, I created my personal assistant!



For instructions on how to reproduce this build, have a look at the blog post on my personal website: Running Amazon Echo (Alexa) on Raspberry Pi Zero – Frederick Vandenbosch

Here's the latest project I've been working on: controlling the TV using gestures instead of a remote control. Mainly inspired by my kids being able to handle a tablet but not a remote control, it could perhaps also help to Make Life Accessible.


The project consists of a Raspberry Pi A+, Skywriter HAT, and an IR LED and receiver. Using LIRC, IR signals were recorded from the original remote and can then be reproduced by the Pi.




As always, schematics, code, etc ... are available in a full blog post on my website: Gesture Based TV Remote Control – Frederick Vandenbosch


Let me know what you think in the comments!

Looking for a new project to build around the Raspberry Pi Zero, I came across the pHAT DAC from Pimoroni. This little add-on board adds audio playback capabilities to the Pi Zero. Because the pHAT uses the GPIO pins, the USB OTG port remains available for a wifi dongle. Perfect for a small wireless speaker project!

Using a combination of wood and 3D printing, I created a custom enclosure for the speaker. Hope you like it!


A full guide on how to reproduce this project can be found on my website:

The Raspberry Pi Zero's form factor makes it perfect for use in smaller projects. Combined with internet connectivity, a display and some kind of input, it could be used to visualise virtually anything.


Using a Pi Zero, an I2C OLED display from Adafruit, a miniature wifi dongle, two push buttons and a custom 3D printed enclosure, I attempted to create a small device which can sit on my desk and report various things, such as:

  • time and date
  • network settings
  • social media stats


This could easily be expanded to display the weather, latest email received, tweets you are mentioned in, or even the latest discussions on element14. The choice is yours!


One button cycles through the different screens, the other triggers actions depending on the active screen. For example, on the network settings screen, the button forces the Pi to reconnect to the network.
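The two-button behaviour can be sketched as a small state machine; the screen names and actions below are illustrative, not the actual script:

```python
# Sketch of the two-button behaviour: one button cycles screens, the other
# runs the active screen's action. Names and actions are assumptions.
class InfoDisplay:
    def __init__(self, screens):
        self.screens = screens      # list of (name, action) pairs
        self.index = 0

    def next_screen(self):
        """First button: cycle to the next screen, wrapping around."""
        self.index = (self.index + 1) % len(self.screens)
        return self.screens[self.index][0]

    def trigger(self):
        """Second button: run the action bound to the active screen."""
        name, action = self.screens[self.index]
        return action()

display = InfoDisplay([
    ("clock", lambda: "show time"),
    ("network", lambda: "reconnect wifi"),  # e.g. force a network reconnect
    ("social", lambda: "refresh stats"),
])
```

On the real device, GPIO button callbacks would call `next_screen()` and `trigger()`.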


The wiring diagram, code, 3D files ... are available in a complete blog post on my website should you want to know more about this build: Raspberry Pi Zero Internet Connected Information Display – Frederick Vandenbosch


Check it out and let me know what you think!

I received my Raspberry Pi Zero earlier this week.

While holding it, I noticed one of the USB hubs I had shared the same form factor as this new Pi. So I combined the two into as small a package as possible.

The Pi is powered from the USB hub and the USB OTG port is connected to the hub, providing 4 ports for connectivity, allowing wifi, keyboard, mouse, bluetooth, etc. to be connected.



All connection details for the wiring etc ... are available in a complete blog post on my website should you want to recreate this build: Raspberry Pi Zero – USB Hub Mod | Frederick Vandenbosch






I had a thermal printer for a while now, but never used it as part of a project. Recently, I purchased the new Raspberry Pi Touch Screen and decided to make a kind of photo booth. The touch screen would be used for the user input, instead of using (mechanical) buttons. If the user is satisfied with the picture, it can be printed on the spot by the small printer.


It's certainly not a new idea, but I thought it would be a fun little project to try out.


The main components used in this project are:


Main components
Raspberry Pi 7" Touch Screen Display with 10 Finger Capacitive Touch
Mini Thermal Receipt Printer
Raspberry Pi Camera Wide-Angle Lens


Raspberry Pi


For this project, I ended up using a Pi 2. Originally, I tried with the A+, but some software components failed to install (more on that in the "Kivy" paragraph).


For the OS, the latest version of Raspbian was used (2015-09-24 Jessie). It can be downloaded from the official Raspberry Pi website:

Getting the OS image on a microSD card can be done in several ways depending on your own operating system. In my case, on OS X, I used "dd" to write the image to the microSD card.

Fredericks-Mac-mini:~ frederickvandenbosch$ sudo diskutil list
Fredericks-Mac-mini:~ frederickvandenbosch$ sudo diskutil unmountDisk /dev/diskX
Fredericks-Mac-mini:~ frederickvandenbosch$ sudo dd if=Downloads/2015-09-24-raspbian-jessie.img of=/dev/diskX bs=1m
Fredericks-Mac-mini:~ frederickvandenbosch$ sudo diskutil unmountDisk /dev/diskX


Once the image has been written to the microSD card and the card has been unmounted, it can be removed from the PC and inserted in the Raspberry Pi.


Touch Screen


Connecting and getting the touch screen to work with the Raspberry Pi was super easy using the instructions found right here on element14:

Using the latest Raspbian image (2015-09-24 Jessie), the touch screen was plug & play. I did install the additional virtual keyboard by executing the following command:


pi@photobooth ~ $ sudo apt-get install matchbox-keyboard




Getting wifi to work on the Pi is another one of those plug & play things. Just connect the wifi dongle, select the access point you wish to connect to in the desktop environment and enter the password. That's all there is to it.


Pi Camera


No photo booth without a camera, right? Let's see how to connect and enable the camera.


Connecting the camera


To connect the camera to the Pi, open the CSI slot located near the ethernet port and ensure the camera's flex cable is inserted with the exposed contacts facing away from the ethernet port.


Enabling camera support


By default, the camera support is disabled. To get the camera to work, support needs to be enabled using the "raspi-config" tool.


Open a terminal and enter the following command:


pi@photobooth ~ $ sudo raspi-config


A menu will appear. Select option 5: "Enable Camera", and in the following step, select "Enable". Reboot the Pi.



Thermal Printer


To set up the printer, a complete guide is available over at Adafruit; only a few steps are relevant for this project though, and I will highlight them in the next paragraphs.


Connecting the printer


There are two parts to connect the printer:

  • power, using an external 5V power supply (at least 1.5A for the printer only)
  • data, using the Pi's GPIO serial interface (including GND)


To easily connect an external power supply, I cut off one end of the provided power cable and screwed on a female DC barrel jack connector. The data cable, even though not ideal, can be connected to the Raspberry Pi's GPIO. Careful though: the printer's TX pin (RX on the Pi's GPIO) should either be disconnected or have a 10k resistor added to compensate for the level difference (5.0V vs 3.3V).



You'll notice I moved the GND jumper wire from the touch screen to another GND pin, in order to accommodate the printer's data cable.


Controlling the printer


Start by installing the necessary software components.


pi@photobooth ~ $ sudo apt-get install python-serial python-imaging python-unidecode


In the cmdline.txt file, remove references to ttyAMA0 to avoid conflicts with the printer on the serial interface.


pi@photobooth ~ $ sudo nano /boot/cmdline.txt

#dwc_otg.lpm_enable=0 console=ttyAMA0,115200 console=tty1 root=/dev/mmcblk0p2 rootfstype=ext4 elevator=deadline rootwait
dwc_otg.lpm_enable=0 console=tty1 root=/dev/mmcblk0p2 rootfstype=ext4 elevator=deadline rootwait


Download the Adafruit python library for the printer, containing some example code.


pi@photobooth ~ $ sudo apt-get install git
pi@photobooth ~ $ git clone
pi@photobooth ~ $ sudo reboot


After the Pi has rebooted, it should be possible to make a test print.


pi@raspberrypi ~ $ cd Python-Thermal-Printer
pi@raspberrypi ~/Python-Thermal-Printer $ python


The printer should then output something like this:





Kivy is an open source Python library for developing applications that make use of innovative user interfaces, such as multi-touch apps. Kivy's official website can be found here:


The installation steps and some example code are provided via Matt Richardson's tutorial, in which he used Kivy to control the Pi's GPIO using the touch screen:


Some notes on my experience, performing the installation:

  • I originally used the Raspberry Pi A+. However during the Cython installation step, it runs out of memory and starts swapping. The installation never finishes as the kswapd0 process takes 100% CPU. Using the Raspberry Pi 2, no problems were encountered.
  • Originally, when trying to edit Kivy's config.ini (~/.kivy/config.ini) in order to add touch support, the file didn't exist. After running an example (from ~/kivy/examples/demo/pictures/), the file was there and could be edited.




With all individual components working, it's time to move on to the project specific topics.




The code is based on Matt Richardson's example application, which was then adapted to suit my needs. In addition, Adafruit's thermal printing python library was added to have printing support as well.


I've added comments in the code to make it easier to understand.





For the frame, I picked something simple: a wooden board holding all the components in place. The result would be a flat and portable photo "booth".

I started by using some tape and a pencil to sketch the layout and see how the result would look. Everything looked good, so I started cutting and drilling. A bit of sanding was required to make everything fit.


{gallery} Build


Board: the piece of wood before the cutting and drilling


Layout: using tape and a pencil to decide where I'll put the different components


Cutting: happy with the layout, I cut out the parts using an oscillating multitool


Drilling: some drilling was required for the camera and the handle


Edges: removed the top corners to make some rounded edges


Fitting: test-fitting the parts


Feet: making some "feet"


Bandsaw: using the bandsaw, the "feet" can easily be cut to the desired shape


Cleanup: with everything in place, some tidying up was required


Testing: my assistants testing the new gadget






Hope you like the project!




This post is meant as a first how-to for Rapiro, the Raspberry Pi robot. The goal will be to start off with some simple topics and gradually move to more complex things.

It will be a good opportunity for me to use my robot and try to discover its full potential, while getting feedback and/or requests from you.


Do you think that's a good idea? If yes, what topics would you like to see covered? Leave a comment!





For a video version of this how-to, watch the embedded video below; otherwise, keep on reading.







The topic I'd like to cover in this how-to: modifying and uploading the Arduino sketch to Rapiro. More specifically to solve a particular issue.


It's very likely that, after assembling Rapiro and powering it for the first time, some servos are not properly aligned.


If you look at the picture below, you'll notice a few mistakes:

  • the arms are pointing to the back instead of being straight down
  • the head and waist are not aligned with the feet
  • the feet are not sitting flat on the table


The last one is hard to see in the picture because the robot's weight is pushing them flat. This is however causing a lot of noise, as the servo is trying to maintain its position even though it is being forced into another.





There are certain things you'll need for this how-to:

  • an assembled Rapiro (obviously ...)
  • a computer
  • a micro USB cable to upload the sketch
  • a power supply or batteries to power the robot


Arduino IDE


The first thing you'll need is the Arduino IDE. This is required to modify and upload the sketch to Rapiro.

If you don't have it installed yet, go to the Arduino website and download the latest version.



Rapiro Sketch


The second thing needed is the sketch used to program Rapiro, which we will be editing.

The sketch can be found on the Rapiro website, in the download section.



Correcting Servo Position


Open the Rapiro sketch by double-clicking on the Rapiro *.ino file. This should launch the Arduino IDE and visualise the code.



You'll notice a bit of code for the fine angle adjustments; that's the part needed to fix the initial servo positions without having to take the robot apart and redo the assembly, unless the correction exceeds 20 degrees, in which case reassembling that part is recommended.


It requires a bit of trial and error to find the right values. After corrections, you could end up with something similar:


// Fine angle adjustments (degrees)
int trim[MAXSN] = { -5,  // Head yaw
                    -10,  // Waist yaw
                    20,  // R Sholder roll
                    0,  // R Sholder pitch
                    0,  // R Hand grip
                    -20,  // L Sholder roll
                    0,  // L Sholder pitch
                    0,  // L Hand grip
                    0,  // R Foot yaw
                    8,  // R Foot pitch
                    0,  // L Foot yaw
                    3}; // L Foot pitch


Once the sketch is modified, it needs to be uploaded to Rapiro:

  • power off Rapiro
  • connect the micro USB cable to Rapiro and the computer, Rapiro's eyes should light up
  • in the Arduino IDE
    • select "Arduino UNO" as board type (Tools > Board)
    • select the correct serial device (Tools > Port)
    • press the "Upload" button


Rapiro's eyes will turn off while the upload is ongoing and light up again once finished.


After having uploaded the modified sketch and powering on Rapiro, it should stand straight and be properly aligned as demonstrated below:



If not, repeat the process of modifying and uploading the sketch until the position is as desired.

Previous entries in this blog series:






The dimensions of the enclosure were based on some rough estimations, now I had to make everything fit inside.

I puzzled a bit, figuring out how I would expose the controls, camera and LCD.

The solution I came up with can be seen in the pictures below. It's a bit big, but fully functional.

I even printed some buttons, as the keypad pushbuttons were no longer accessible because of the enclosure.








RF433 I2C Board


As mentioned in a previous blog post, I made a prototype of an I2C RF433 transmitter using an ATtiny85.

With the prototype working, I designed a small PCB using Fritzing and had it made.


The result (prototype left, custom PCB right)



A short video of the board being controlled by the Pi via I2C to turn a light on and off:




Power and other pins


I wanted to power every component involved in the most simple way possible, using a single power supply.

For that purpose, I made use of the power provided to the Pi Rack and distributed it to the different components using jumper wires where they weren't directly connected to the Pi Rack.

The Pi is also powered by the Pi Rack, by providing power via the GPIO pins. Not ideal, but it works and was simple to achieve: on one of the jumper selections for power, I shorted all three pins in order to have the external power supply reach the Pi's 5V GPIO pin while keeping the attached module powered as well.




Because the Arduino and the Adafruit LCD/Keypad cannot be connected directly to the Pi Rack, I used jumper wires to connect the necessary pins:

  • Adafruit LCD/Keypad: Power and Ground + I2C pins
  • Arduino with GSM Shield: Power and Ground + Serial pins




I'll have to find a better way to fasten the jumper wires though, as they tended to come loose easily.





With all components enclosed, some tests were required to verify all connections were still functioning properly.


Using the "sudo i2cdetect -y 1" command, I verified both I2C modules (Adafruit LCD/Keypad and custom pcb) were detected properly.




The camera was tested using the "raspistill -o test.jpg" command. The picture and its orientation were correct.

Motion also provides a live stream of the camera feed, which worked properly.


A quick test of the LCD/Keypad:



With the enclosure done and the components working, it was time for a "field" test ...





Because the wireless sensors could detect our little burglar even before she reached the living room (and because there was not much for you to see), we allowed her to start from within the living room.


The system would still detect her, but at least we'd have her reaction on film.




Thanks to the pictures and videos, the burglar was identified and apprehended.





There are still quite some improvements to be done before this project becomes more than a game to play with my daughter:

  • As Mark suggested in a previous post: the wireless sensors should include a mechanism to check in periodically in order to know the batteries are not dead
  • The control unit with camera and controls should have some backup power: if the burglar turns off the electricity, the system should keep working for a (little) while
  • The solution is very bulky, mainly because all components were used as-is. A custom PCB with the different elements required could be more compact (and cheaper).
  • ...


Still, this was a great learning experience on different types of communication between the modules, and on the modules themselves.





I'll be posting the full code on GitHub and provide the link when available.


The code consists of following parts:

  • Arduino sketch for RF433 sensor reception and GSM shield using UNO
  • Arduino sketch for the RF433 I2C transmitter using ATtiny85
  • Raspberry Pi Python scripts to drive the LCD with keypad, the PiFace Digital, the lights and the sending of SMS

Previous entries in this blog series:





In part 2 of this project, I prototyped two types of wireless sensors to be used with the alarm system.


Using a leftover prototyping PCB, I moved the circuit from the breadboard, trying to keep it as small as possible.

I then continued by printing some custom enclosures for my different sensors. Below you can see an example for the motion detector sensor.


It contains:

  • PIR sensor
  • RF433 transmitter
  • battery holder
  • ATtiny85 circuit



The PIR sensor and the RF433 transmitter are not soldered directly to the PCB, instead I used some female headers for the sensor and transmitter to plug into.

This way, the components are replaceable in case of failure, etc ...


Control Unit


I have used a lot of different, mostly off the shelf, components for my control unit.


Below is a simple block diagram of how the different components are interconnected:


It might not be the most efficient way to interconnect everything, however, I find it interesting to play with and learn about these different interfaces.




The Pi NoIR is there for motion detection and recording of possible evidence. Using "motion", the camera can be used as a motion detector, triggering a recording of the event.

Instructions on setting up "motion" on the Raspberry Pi with a Pi Camera (or Pi NoIR) are described in detail on


Using simple one-liners, it is possible to enable or disable the recordings generated by the "motion" application:


# Recordings ON
sudo sed -i -e 's/output_pictures.*/output_pictures best/g' -e 's/ffmpeg_output_movies.*/ffmpeg_output_movies on/g' /etc/motion.conf 

# Recordings OFF
sudo sed -i -e 's/output_pictures.*/output_pictures off/g' -e 's/ffmpeg_output_movies.*/ffmpeg_output_movies off/g' /etc/motion.conf


Adafruit LCD and Keypad


I've already elaborated on the assembly and getting the LCD/keypad up and running in my previous post.

You can find it here: Pi Alarm System - Part 3: Control unit


PiFace Digital


On top of what was explained in Pi Alarm System - Part 3: Control unit , I have connected a 12V rotating light on one of the relays.

The relay is activated when the alarm is triggered, causing the light to turn on.


To spare my family's hearing, I have not connected any siren to the system, but this would be set up in the same way as the light was.


Arduino UNO with GSM Shield


I played with the Arduino GSM shield and had some success in adapting the sample sketches to send and receive SMS messages.


However, after another test, I forgot to put the correct PIN code in the new sketch and managed to get my SIM locked ...

I have contacted the phone company and I either have to pay 10EUR to unlock the SIM or order a free new one, with a new phone number.

Since I was using a test SIM anyway, I'll go for option two. It may take some time before I get the new SIM though ...


The GSM shield will come in handy to receive notifications from the alarm system, or even send specific commands via SMS.

Until I have a new SIM, this is unfortunately on hold.


Prototype board: ATtiny85 with RF433 Transmitter


This prototype covers the possibility of sending on/off commands to power sockets via an RF transmitter at 433MHz.

This functionality could have been merged with the Arduino, but using the ATtiny85, it is possible to keep the solution generic and to use it in combination with anything else that supports I2C such as a Raspberry Pi, an Arduino, etc ...




I used two libraries to cover the needed functionality:


  • RemoteSwitch provides a generic class for simulation of common RF remote controls, like the 'Klik aan Klik uit'-system, used to remotely switch lights etc: RemoteSwitch v2.0.0 on GitHub
  • The ATtiny85 does not have I2C (or SPI) "built in". Instead it has a Universal Serial Interface (USI) that can be used to facilitate I2C and SPI: I2C (master and slave) on the ATtiny85


Using those libraries, I programmed my ATtiny85 with the following code:


#include "TinyWireS.h"                  // wrapper class for I2C slave routines
#include "RemoteSwitch.h"

#define I2C_SLAVE_ADDR  0x26            // i2c slave address (38)
KaKuSwitch kaKuSwitch(1);               // RF433 transmitter on pin 1

void setup(){
  TinyWireS.begin(I2C_SLAVE_ADDR);      // init I2C slave mode
}

void loop(){
  if (TinyWireS.available()){           // got I2C input: expect 3 bytes
    char house = TinyWireS.receive();
    int unit = TinyWireS.receive();
    int on = TinyWireS.receive();
    onOff(house, unit, on);
  }
}

void onOff(char house, int unit, int on){
  kaKuSwitch.sendSignal(house, unit, on);  // transmit the matching RF433 code
}

A small Python script on the Raspberry Pi is then used to send commands to the ATtiny85 which will then send the correct codes to the sockets:


import smbus
import sys

bus = smbus.SMBus(0)
address = 38

house = int(sys.argv[1])
unit = int(sys.argv[2])
on = int(sys.argv[3])

bus.write_byte(address, house)
bus.write_byte(address, unit)
bus.write_byte(address, on)


The script is called as follows to turn unit 1 with house code A (decimal 65) off and on:

# Turn OFF
sudo python 65 1 0

# Turn ON
sudo python 65 1 1


Using a cron job, the script is called with a variable delay, to produce the following example behaviour:

  • at 19:00 + [0-60] minutes turn the lights on
  • at 22:00 + [0-60] minutes turn the lights off
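The variable delay could be implemented with a small wrapper that cron invokes at the fixed times; the script filename and arguments below are assumptions, not the actual setup:

```python
# Sketch of the variable delay: cron runs this at 19:00 and 22:00, and the
# wrapper sleeps 0-60 minutes before switching. Filename/args are assumed.
import random
import subprocess
import sys
import time

def random_delay_seconds(max_minutes=60):
    """Pick a random delay between 0 and max_minutes minutes, in seconds."""
    return random.randint(0, max_minutes * 60)

def switch_lights(house, unit, on):
    time.sleep(random_delay_seconds())
    # reuse the I2C script shown earlier (hypothetical filename)
    subprocess.call(["python", "", str(house), str(unit), str(on)])
```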



What's next ?


In my next and final post on this project, I will be covering the build (my project box arrived today!) and have my little assistant test the system!


photo 2.JPG photo+3+(1).JPG

Previous entries in this blog series:



I've finally been able to spend some time on the Pi Alarm project this weekend, so here's a progress update!


For the Pi Alarm System, I'm using various Raspberry Pi accessories/extension boards, such as: PiFace Digital, Adafruit LCD and Keypad kit, Pi Rack, etc ...

In this post, I'll be describing how I connected different parts and got them up and running.




For the installation and use of the camera, I will refer to my Pi NoIR RoadTest. The camera will be used in the same way as for the Santa Catcher; you can find more information here: Raspberry Pi Santa Catcher with Pi NoIR and PiFace CAD



Adafruit RGB LCD and Keypad Kit




The LCD and Keypad come in the form of a kit.

photo 3.JPG


I got my soldering iron hot and started soldering. Fifteen minutes later the kit was assembled and ready to use.

photo 5.JPG


I2C Support


The LCD and Keypad kit uses I2C for communication with the Pi.


There are some configuration changes required for this to function, luckily there is a nice tutorial on Adafruit: Configuring I2C | Adafruit's Raspberry Pi Lesson 4. GPIO Setup | Adafruit Learning System


After setting up I2C on the Pi, I connected the LCD and keypad, and powered on the Pi. I verified the LCD and keypad were detected properly:


pi@raspberrypi ~ $ sudo i2cdetect -y 1

     0  1  2  3  4  5  6  7  8  9  a  b  c  d  e  f
00:          -- -- -- -- -- -- -- -- -- -- -- -- --
10: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
20: 20 -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
30: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
40: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
50: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
60: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
70: -- -- -- -- -- -- -- --


The LCD and keypad were detected at address 0x20. Time for some testing.




There is a tutorial on how to use the LCD and keypad, with sample code, on the Adafruit website:


I downloaded and executed the example code:


pi@raspberrypi ~ $ git clone
Cloning into 'Adafruit-Raspberry-Pi-Python-Code'...
remote: Reusing existing pack: 461, done.
remote: Total 461 (delta 0), reused 0 (delta 0)
Receiving objects: 100% (461/461), 155.96 KiB, done.
Resolving deltas: 100% (196/196), done.


pi@raspberrypi ~ $ cd Adafruit-Raspberry-Pi-Python-Code


pi@raspberrypi ~/Adafruit-Raspberry-Pi-Python-Code $ cd Adafruit_CharLCDPlate


pi@raspberrypi ~/Adafruit-Raspberry-Pi-Python-Code/Adafruit_CharLCDPlate $ sudo python


This cycled through some different background colors and detected button presses.


The output of the script looked like this:


Cycle thru backlight colors
Try buttons on plate


And the result as follows:



Programming the menus


I programmed a simple menu structure to enable/disable the alarm system. This can later be extended to use some kind of pin code for verification before allowing any changes.


The menu is programmed in Python and behaves as follows:

  • At startup a message is displayed for 5 seconds. The LCD is then cleared and turned off, waiting for user input.
  • When the SELECT button is pressed, the menu is displayed
  • Using the UP and DOWN buttons, a menu entry can be selected
  • Pressing SELECT again confirms the selection and enables/disables the alarm system
  • When no button is pressed for 5 seconds, the LCD is cleared and turned off, waiting for new user input
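To make that behaviour concrete, here's a minimal sketch of the menu's state logic, kept separate from the Adafruit LCD library so it can run anywhere. The Menu class and entry names are my own illustration, not the actual project code:

```python
# Minimal sketch of the menu state machine; button names mirror the
# Adafruit keypad, but the Menu class itself is illustrative.
class Menu:
    def __init__(self, entries):
        self.entries = entries      # e.g. ["Enable alarm", "Disable alarm"]
        self.index = 0              # currently highlighted entry
        self.armed = False          # alarm system state

    def press(self, button):
        if button == "UP":
            self.index = (self.index - 1) % len(self.entries)
        elif button == "DOWN":
            self.index = (self.index + 1) % len(self.entries)
        elif button == "SELECT":
            # confirm the highlighted entry: enable/disable the alarm
            self.armed = (self.entries[self.index] == "Enable alarm")
        return self.entries[self.index]

menu = Menu(["Enable alarm", "Disable alarm"])
menu.press("DOWN")       # highlight "Disable alarm"
menu.press("UP")         # back to "Enable alarm"
menu.press("SELECT")     # confirm: arm the system
print(menu.armed)        # True
```

On the Pi, the button constants read from the Adafruit_CharLCDPlate library would feed press(), and the highlighted entry would be written to the LCD.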


A video demonstrating the menus:




PiFace Digital


Unlike the Adafruit LCD and Keypad kit, the PiFace Digital comes pre-assembled and ready to use.

photo (9).JPG




There is a step-by-step guide on how to install the necessary python modules and on how to enable SPI on the pi, available on the PiFace website:


I followed the instructions and could then start using the PiFace Digital.




A first test I executed, was to connect some LEDs to the outputs of the PiFace and see if I could properly control them.

The goal is to light up a green LED when the alarm system is disabled, a red one when enabled.


Ultimately, the LEDs should light up as a result of the actions on the LCD and keypad.
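As a sketch of that idea, the state-to-LED mapping can be captured in a tiny helper. The pin numbers are my assumption, and the commented pifacedigitalio calls only indicate how it could drive the real outputs:

```python
GREEN_PIN = 0   # assumed output: green LED, lit when the alarm is disabled
RED_PIN = 1     # assumed output: red LED, lit when the alarm is enabled

def led_states(armed):
    """Return (green, red) output values for the given alarm state."""
    return (0, 1) if armed else (1, 0)

# On the Pi, this could drive the PiFace outputs, for example:
#   import pifacedigitalio
#   pfd = pifacedigitalio.PiFaceDigital()
#   green, red = led_states(armed)
#   pfd.output_pins[GREEN_PIN].value = green
#   pfd.output_pins[RED_PIN].value = red
print(led_states(True))   # (0, 1)
print(led_states(False))  # (1, 0)
```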



There are also two usable relays on the PiFace Digital.

Even though the relays are rated for 250V-10A, the PiFace documentation states they can only be used for 20V-5A max.

This is still enough to connect a rotating beacon light and a high power siren, both running on 12V.


Pi Rack


To be able to connect more than one board at the time to the Raspberry Pi, there is an accessory called the "Pi Rack".

This allows us to connect up to four extension boards on the Raspberry Pi's GPIO pins.


I connected both the PiFace Digital and the Adafruit LCD and Keypad to the Pi Rack on top of the Raspberry Pi.

photo (8).JPG


The Adafruit LCD and Keypad worked immediately. The PiFace Digital, however, didn't.


pifacedigitalio.core.NoPiFaceDigitalDetectedError: No PiFace Digital board detected (hardware_addr=0, bus=0, chip_select=0).


I had to swap the SPI jumpers for the PiFace Digital from "default" to "swapped" CE lines. I was then able to use the PiFace Digital again.

Screen Shot 2014-03-01 at 20.59.00.png


Combining LCD/Keypad with PiFace Digital


With the different elements working, it was time to combine some functionality.


I've extended the code to control the Adafruit LCD and Keypad to also perform some actions on the PiFace Digital.

The PiFace Digital controls two LEDs to show the state of the alarm system, as configured using the LCD and Keypad.



The code:





There is still some work to be done before the system is complete:

  • Listen for the wireless sensors to know when to trigger the alarm
  • Notification of the owner using an Arduino GSM Shield
  • Testing, testing, testing ...


Thanks to element14 and the Pi NoIR and Catch Santa Challenge, my daughter and I won a Rapiro kit with Raspberry Pi and Pi Camera.


In this post, I will describe my experiences in building and getting Rapiro up and running.




The kit came in a large white box, accompanied by a Raspberry Pi and Pi Camera.



The Rapiro kit comes in a big box which can be used later on to carry the assembled Rapiro.


photo 4.JPG

All parts are neatly arranged in individual compartments.


Unfortunately, no instructions are provided with the kit.


After some searches on Rapiro's Kickstarter page and website, I was able to gather different pieces of information:


Since Kickstarter rewards only started shipping about two to three weeks ago, there is not a lot of user feedback yet. Also, as I'm not a backer on Kickstarter, I am unable to post comments in order to ask questions etc ...


I've read that a discussion forum will be open soon, but until then, I'm on my own.






Using the photo gallery, assembling Rapiro was easy. The photo instructions are very clear and make it easy to compare your build at every step.


IMG_3291 copy.JPGphoto 9.JPG

Labeling the servos and connecting them to the control board to set the initial position of the servo.


photo 10.JPGphoto 12.JPG

Passing the servo cables through the foreseen slots to the control board.


IMG_3306 copy.JPG photo 16.JPG

Connecting the RGB LEDs and attaching the final piece of the kit.


The build took approximately 2 hours and did not require any soldering, only a screwdriver and some common sense.


Raspberry Pi


It's possible to expand Rapiro's capabilities by adding different components, such as:

  • Raspberry Pi and Pi Camera
  • Speakers
  • PSD distance sensor


These components do not come with the Rapiro Kit by default and Rapiro is able to function without them.


I installed the Raspberry Pi and Pi Camera as those were kindly provided by element14.


It's important to prepare your SD card in advance, as once the head is closed, there is no access to the SD card slot or HDMI port.

I installed the latest Raspbian, enabled camera support, wifi and ssh, in order to be able to control Rapiro remotely.



photo 15.JPG photo 19.JPG

Pi and camera are installed in Rapiro's head. Pi is connected and powered via GPIO pins.


photo 21.JPG

Wi-Pi connected in the back of Rapiro's head.


Arduino Sketch

The Arduino sketch for Rapiro can be modified and uploaded using the Arduino IDE and the micro USB connection.

It's very useful to update the sketch to:

  • modify the initial position of the servos to properly align feet, arms, etc ... in case they weren't during installation
  • program new moves
  • ...

In my case, for example, both feet weren't sitting exactly flat on the table and the waist was not properly aligned. So instead of taking everything apart to correct this, I modified the starting position of the foot and waist servos and uploaded the updated sketch.

// Fine angle adjustments (degrees)
int trim[MAXSN] = { 0,  // Head yaw
                    -5,  // Waist yaw
                    0,  // R Shoulder roll
                    0,  // R Shoulder pitch
                    0,  // R Hand grip
                    0,  // L Shoulder roll
                    0,  // L Shoulder pitch
                    0,  // L Hand grip
                    0,  // R Foot yaw
                    10,  // R Foot pitch
                    0,  // L Foot yaw
                    5}; // L Foot pitch

Commands via Pi

The Arduino sketch comes with some predefined movement sequences and positions. They can be called via the serial interface of the Pi.

I installed minicom:

pi@rapiro ~ $ sudo apt-get install minicom


I couldn't find an actual list of sequences documented anywhere, so I derived some from the Arduino code.

To execute them, you pass the sequence's number via the serial interface:

pi@rapiro ~ $ echo "#M6" | sudo minicom -b 57600 -o -D /dev/ttyAMA0

Here's a little demo:

Thank you

Our cats do not necessarily like our new robot friend, but we love him!


Thank you again for this awesome prize, and stay tuned for more adventures with Rapiro.


You can find part 1 here: Pi Alarm System - Part 1: Project and components description

Wireless Sensors

This post is focusing on the wireless sensors spread around the house in order to detect possible intrusion. In a later post, I will collect the data from these sensors with the Raspberry Pi.

RF Modules

For the communication between the remote wireless sensors and the central control unit, I used RF433MHz transmitters and a receiver.

The sensors will be equipped with a transmitter each, and the control unit with a receiver.

photo (1).JPG

RF433MHz transmitter (left) and receiver (right)

The remote sensors need to be small, so I opted for the ATtiny85, which can easily be programmed with the Arduino IDE after installing the necessary files to recognise the hardware.

I downloaded the "arduino-tiny" ATtiny cores, unzipped the contents into the Arduino/hardware folder, and after restarting the Arduino IDE, the ATtiny "boards" could be selected for programming:

Screen Shot 2014-02-10 at 21.08.14.png

ATtiny cores available in Arduino IDE

Prototype test

I performed a quick test by hooking up an ATtiny85 to a transmitter and an Arduino Uno to a receiver, and wrote small sketches for basic one-way communication for both Tx and Rx units.


ATtiny85 transmitter (top) and Arduino Uno receiver (bottom)

The transmitter code sends an integer representing the sensor's id. The receiver code waits for data and displays it on the serial interface when available.

The goal is to have every sensor send their unique id to the receiver once they are triggered. This would be done periodically, let's say every 5 seconds.
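With each triggered sensor resending its ID every 5 seconds, the control unit will hear the same ID repeatedly while a sensor stays triggered. Here's a minimal sketch of how those periodic resends could be collapsed into single events; the function and names are my own illustration, not the actual project code:

```python
# Hypothetical receiver-side bookkeeping: suppress IDs heard within the
# resend window, so one activation yields one event.
RESEND_WINDOW = 5.0   # seconds between retransmissions of a triggered sensor

def is_new_trigger(sensor_id, now, last_seen, window=RESEND_WINDOW):
    """True if this ID hasn't been heard within the resend window."""
    previous = last_seen.get(sensor_id)
    last_seen[sensor_id] = now
    return previous is None or now - previous > window

last_seen = {}
print(is_new_trigger(7, 0.0, last_seen))   # True  (first time heard)
print(is_new_trigger(7, 5.0, last_seen))   # False (periodic resend)
print(is_new_trigger(7, 30.0, last_seen))  # True  (new activation)
```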

Transmitter (ATtiny85) code:

Receiver (Arduino Uno) code:


As mentioned in part 1, there will be two types of triggers for the remote sensors: switch or motion sensor.


The switch type of sensor can be used for doors and windows. In idle state, the switch interrupts the circuit, powering everything off.

Once the door/window is opened, the switch powers the circuit resulting in the ATtiny85 sending its sensor ID to the control unit.

photo 2.JPG

Circuit is powered when the switch is triggered

Note: the switch's functionality should be reversed (the circuit should be powered when the switch is opened).


For the second type of sensor, the motion detector should always be on. But in order to avoid having to power all components all the time, I power the ATtiny85 and RF433 transmitter using the PIR sensor's output pin. The PIR sensor's output pin goes HIGH when motion is detected. The duration of the pin's state can be changed by using one of the PIR sensor's potentiometers.

Here's a sketch of the circuit; I hope it makes sense (feel free to point out any corrections and/or improvements):

photo 1.JPG

When the PIR detects motion, the ATtiny85 and RF433 transmitter are powered



By adding antennas to receivers and transmitters, I was able to get reliable coverage in the entire house, even through big concrete walls.

Both sensor types work on breadboard and can now be turned into more compact PCBs.
