With the limited time that remains, things need to be kicked into a higher gear. For this post, I worked on creating camera feeds using the camera module provided in the kit, combined with the low-cost Pi Zero. I built two of these: one for my shed, and another for my lab.


Here's how I did it.




Nothing super exciting on the hardware side of things, as it's merely a Pi Zero with a wifi dongle and camera module, but I can share two interesting gadgets I've used.


The first one is the ZeroView, which I already mentioned in my [Pi IoT] Alarm Clock #02: Unboxing The Kit. It's useful for sticking a Pi Zero and camera onto any window in a very compact format. Even if you don't attach it to a window, the spacers can be used to attach a string or similar, letting you mount it somewhere else while keeping everything as one unit.




The second one is this micro USB converter shim, which helps keep things compact as well!




Moving on to the software side of things ...






The first thing to do is to enable camera support using the "raspi-config" command. It doesn't matter which type or version of the Pi camera is used.


pi@zeroview:~ $ sudo raspi-config


Select the "Enable camera" menu, and when prompted, select the option to enable it.

[Screenshots: raspi-config "Enable camera" menu and the confirmation prompt]


Don't forget to reboot the Pi before trying to use the camera!
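If you'd rather script this step (handy when setting up more than one Pi Zero), raspi-config can also be driven non-interactively. This is a sketch, assuming a recent Raspbian where raspi-config's "nonint" mode and the "vcgencmd" tool are available; the output shown is what an enabled, detected camera should report:

```
pi@zeroview:~ $ sudo raspi-config nonint do_camera 0
pi@zeroview:~ $ sudo reboot
...
pi@zeroview:~ $ vcgencmd get_camera
supported=1 detected=1
```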




There are different options available to stream from the Pi camera. I've used "motion" before in the Pi NoIR and Catch Santa Challenge, but have come across interesting solutions by Calin Crisan while searching for a more up-to-date alternative.


On Calin's GitHub page, a bunch of different projects are available, including a prebuilt image with all tools included, called MotionEyeOS (currently featured on element14's homepage as well: Raspberry Pi Smart Surveillance Monitoring System). Because I'm integrating everything into a single interface, though, I've opted for the lightweight StreamEye program, which creates an easily embeddable MJPEG stream.


I followed the instructions described on the GitHub page:


pi@zeroview:~ $ git clone https://github.com/ccrisan/streameye.git
Cloning into 'streameye'...
remote: Counting objects: 133, done.
remote: Total 133 (delta 0), reused 0 (delta 0), pack-reused 133
Receiving objects: 100% (133/133), 52.15 KiB | 0 bytes/s, done.
Resolving deltas: 100% (75/75), done.
Checking connectivity... done.


pi@zeroview:~ $ cd streameye


pi@zeroview:~/streameye $ make
cc -Wall -pthread -O2 -D_GNU_SOURCE -c -o streameye.o streameye.c
cc -Wall -pthread -O2 -D_GNU_SOURCE -c -o client.o client.c
cc -Wall -pthread -O2 -D_GNU_SOURCE -c -o auth.o auth.c
cc -Wall -pthread -O2 -D_GNU_SOURCE -o streameye streameye.o client.o auth.o


pi@zeroview:~/streameye $ sudo make install
cp streameye /usr/local/bin


In the "extras" folder is a script for the Raspberry Pi, allowing the capture of a continuous stream of JPEG images. Launching the command with the "--help" options, gives a list of all other options available.


pi@zeroview:~/streameye $ cd extras/


pi@zeroview:~/streameye/extras $ ./raspimjpeg.py  --help
usage: raspimjpeg.py -w WIDTH -h HEIGHT -r FRAMERATE [options]

This program continuously captures JPEGs from the CSI camera and writes them
to standard output.

Available options:
  -w WIDTH, --width WIDTH
                        capture width, in pixels (64 to 1920, required)
  -h HEIGHT, --height HEIGHT
                        capture height, in pixels (64 to 1080, required)
  -r FRAMERATE, --framerate FRAMERATE
                        number of frames per second (1 to 30, required)
  -q QUALITY, --quality QUALITY
                        jpeg quality factor (1 to 100, defaults to 50)
  --vflip               flip image vertically
  --hflip               flip image horizontally
  --rotation {0,90,180,270}
                        rotate image
  --brightness BRIGHTNESS
                        image brightness (0 to 100, defaults to 50)
  --contrast CONTRAST   image contrast (-100 to 100, defaults to 0)
  --saturation SATURATION
                        image saturation (-100 to 100, defaults to 0)
  --sharpness SHARPNESS
                        image sharpness (-100 to 100, defaults to 0)
  --iso ISO             capture ISO (100 to 800)
  --ev EV               EV compensation (-25 to 25)
  --shutter SHUTTER     shutter speed, in microseconds (0 to 6000000)
  --exposure {off,auto,night,nightpreview,backlight,spotlight,sports,snow,beach,verylong,fixedfps,antishake,fireworks}
                        exposure mode
  --awb {off,auto,sunlight,cloudy,shade,tungsten,fluorescent,incandescent,flash,horizon}
                        set automatic white balance
  --metering {average,spot,backlit,matrix}
                        metering mode
  --drc {off,low,medium,high}
                        dynamic range compression
  --vstab               turn on video stabilization
  --imxfx {none,negative,solarize,sketch,denoise,emboss,oilpaint,hatch,gpen,pastel,watercolor,film,blur,saturation,colorswap,washedout,posterise,colorpoint,colorbalance,cartoon,deinterlace1,deinterlace2}
                        image effect
  --colfx COLFX         color effect (U:V format, 0 to 255, e.g. 128:128)
  -s, --stills          use stills mode instead of video mode (considerably
                        slower)
  -d, --debug           debug mode, increase verbosity
  --help                show this help message and exit
  -v, --version         show program's version number and exit
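To illustrate some of these options: if the camera ends up mounted upside down (which can easily happen with the ZeroView), the image can be flipped, and the JPEG quality can be lowered to save bandwidth. The values here are just an example:

```
pi@zeroview:~/streameye/extras $ ./raspimjpeg.py -w 640 -h 480 -r 15 --vflip --hflip -q 30 | streameye
```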


Finally, to begin streaming, launch the "raspimjpeg.py" script and pipe ("|") its output to "streameye". This starts a webserver streaming the images.


pi@zeroview:~/streameye/extras $ ./raspimjpeg.py -w 640 -h 480 -r 15 | streameye
2016-07-25 18:45:44: INFO : streamEye 0.7
2016-07-25 18:45:44: INFO : hello!
2016-07-25 18:45:44: INFO : listening on
2016-07-25 18:45:45:  INFO: raspimjpeg.py 0.5
2016-07-25 18:45:45:  INFO: hello!
2016-07-25 18:46:04: INFO : new client connection from
2016-07-25 18:46:04: INFO : new client connection from

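The stream can be checked directly in a browser before integrating it anywhere; the exact address is printed in the "listening on" log line above. As an assumption (streameye's default port at the time of writing, combined with the hostname used here), the address would look like this:

```
http://zeroview.local:8080/
```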

Depending on the selected resolution and frame rate, the result should look a little like this:

[Screenshot: the MJPEG stream viewed in a browser]

I was very impressed by the latency: it is very low (less than a second) compared to what I've used in the past.




For the final part of this post, I embedded the two image streams in my OpenHAB installation. It only requires a modification in the sitemap; no items need to be defined:


        Frame label="Video" {
                Image url=""
                Image url=""
        }
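Filled in with placeholder values, the frame could look like the fragment below. The hostnames and the port are assumptions for illustration (streameye defaults to port 8080); use whatever your streams actually listen on:

```
        Frame label="Video" {
                // hostnames and port below are examples only
                Image url="http://zeroview.local:8080/"
                Image url="http://zerocam2.local:8080/"
        }
```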


Refresh the OpenHAB interface, et voilà: both streams are embedded.


[Screenshot: both camera streams embedded in the OpenHAB interface]


I moved one camera to the shed, and one to my lab. Yep, that's me writing this very blog post in the lower image. (And yes, I still have a lot of cleaning up to do!)



