
Sci Fi Your Pi

243 posts

The Meditech project version 2 (in development), explained during a one-hour talk at QtCon in Berlin on September 4.


The recorded event

Follow the link to see a recorded version of the QtCon Meditech event: Relive: Meditech: A Qt-driven OSHW device – QtCon Streaming


QtCon in Berlin, September 2016. Images gallery



Hello everybody!


So happy to write this here =) It's been a long time, but the idea lives and I've finally come to the point where I'm ready and things are in the best state possible.


This year, I've been developing a lot. My main project was and still is pyLCI - a headless interface for the Raspberry Pi and other Linux boards. It's an external interface you can use on your Raspberry Pi computers to perform all kinds of tasks, such as managing network interfaces (wired & wireless), viewing system parameters and changing the most important ones, and controlling media functions. It's very easy to start using, it's very cheap to add and it solves your problems.

It's also been a huge amount of work for me - but the PipBoy wouldn't be possible to make without it, since it's exactly the interface I'll need. As a bonus, it's open source and has great documentation, so you can all install and use it in your own Raspberry Pi projects! BTW, it also works with PiFaceCAD boards - mainly because that's the board that was provided in the kit =) I think I'll announce it on the Element14 forums quite soon.


So, let's see what I've got so far. First of all, PipBoy is modular.


What are the main PipBoy modules?


  1. CPU module - a Raspberry Pi B+, an RTC and, soon, some additional components in an acrylic box. It's huge, since it's designed to accommodate an RPi without removing any ports - to improve the repeatability of the project. A 40-pin IDC cable comes out of this box - I've got enough IDE cables and corresponding sockets to have a big supply of those =) It's also got HDMI/audio/MicroUSB cutouts, as well as a camera cable slot. I'll still need to make a hole to let USB cables go through somehow...
  2. Display module - with some buttons attached. See, I've got a really cool LCD from a hackathon I was participating in. It's HD44780-compatible, so it works with the pyLCI I've spent so much time on; it's I2C, so it's very fast; it's also very small, so I can at least make the display module very thin. In short, it's awesome for my PipBoy =) I'll add a graphical display in the future - I can certainly make it output captures from the BitScope Micro, which has become my indispensable tool. The buttons are for pyLCI control, as well as for hexadecimal input, which I commonly use online tools for - but let's face it, that's not cool.
  3. Power&GPIO module - here's the thing: I'm adding a GPIO socket to the PipBoy, connected straight to the Pi. That way, I can use existing breakouts/shields/HATs for the Pi and just connect them to the PipBoy when necessary. It will come in handy when I'm handling motors with the GertBoard =) It also has an MFRC522 reader for RFID cards. The power part of the module is that it's got DC-DC converters from Li-ion voltage to 5V and, hopefully, some Li-ion voltage monitoring in the future. I'll also add a DTMF addon so that I can communicate with some simple robots I've built that receive commands through DTMF =)
  4. Communication module - has an FM receiver, WiFi/BT radios, a sound amplifier, a Pi Camera for photos/video, a USB hub and an Arduino for switching the power supply of all these modules when I'm not using them. I've also got some WS2812 LEDs I want to use as a flashlight, and the Arduino seems to be a nice thing for controlling them, given its realtime features and its 5V output, which is needed for those LEDs.
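Timing is the main reason an Arduino fits the WS2812 job: these LEDs use a single-wire protocol where each of the 24 color bits (sent in GRB order, most significant bit first) is a precisely timed pulse. As a rough illustration (not the project's actual driver code - names and structure here are my own), here's how a driver would expand one color into the bit sequence it has to clock out:

```cpp
#include <vector>
#include <cstdint>

// WS2812 LEDs expect 24 bits per LED in GRB order, most significant bit first.
// This helper expands an RGB color into the bit sequence a bit-banging or
// SPI-based driver would transmit: 'true' marks a long high pulse (~0.8 us),
// 'false' a short one (~0.4 us). A simplified sketch, not real driver code.
std::vector<bool> ws2812Bits(uint8_t r, uint8_t g, uint8_t b) {
    std::vector<bool> bits;
    const uint8_t grb[3] = {g, r, b};  // note: green channel goes out first
    for (uint8_t byte : grb)
        for (int i = 7; i >= 0; --i)
            bits.push_back((byte >> i) & 1);
    return bits;
}
```

On a real AVR the driver must emit these pulses with sub-microsecond accuracy, which is why a realtime 5V microcontroller is a better fit than the Pi's Linux userspace.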

Which external components are needed but not included?


  1. Some kind of input device for pyLCI. I'm thinking of making a glove, since it goes so well with "arm-mounted computer". I've actually got all the parts, they're just waiting for me =)
  2. Something to hold it on my arm. I've got some belts from laptop bags, so it's OK.
  3. Something to cover it. Even the acrylic casings can't be left uncovered, and there has to be a cutout for the display. That's going to be the complicated part, since I'm so bad at crafting things like that.
  4. A power source. I have some components for Li-ion battery holders, but that ain't enough - there's a lot of supporting circuitry to be added, including battery chargers and protection modules. I simply don't have most of those now; they break very easily, even without human intervention. I do have a power bank that works, though, but it's not wrist-mountable.


What's the current state?


I've been working for a little less than a week now. I've assembled the Power&GPIO module (without the DC-DCs though, I need to solder them on). The main module is ready. The display module needs an acrylic casing because it's fragile (the laser cutter I was using is broken and I can't cut new parts =( ), but the display works with pyLCI, using the 40-pin cable. The communication module is not assembled yet, since there's so much to assemble. I'll probably omit some parts just to make a working prototype, such as the power management Arduino and the sound amplifier (I need an audio switcher IC anyway because of all the possible audio sources).


What now?


Hacking it all together, of course =) I expect to report back today with some photos, I'll just pick a part of the project I feel like doing today and then show the results.


I'm also entering this as a Hackaday Prize project. I've actually already won one of the prizes with an unrelated project I started working on with a team I joined at a hackathon. You can read about it here: ICeeData - it's a really nice project making a difference for patients with Implanted Cardiac Defibrillators, and our nomination has got us $1000 from Hackaday, which we can use to finally buy ourselves equipment we badly need for moving the project forward. I'll mostly make separate project logs for Hackaday and Element14, since I don't mind writing, but I'll copy-paste if I feel it would impair the "documenting" part because of a simple lack of time, which may happen.


Regardless. It's 4AM - ain't that a perfect time to announce a project? ;-) I'll come back with photos and details!

I ended up having to leave my job without getting paid, and after working overtime for three weeks. How about that?


As I probably mentioned, it was about creating a system for so-called "escape rooms". This is an entertainment venue where you pay to go through a series of rooms full of riddles and tasks which you need to complete in order to get out of the room in a specified time - usually one hour. My job was to make a room fully automated, since all these quests were interconnected and the game operator (basically, a guy in front of a computer, monitoring participants through cameras and communicating with them over a sound system) needed to be able to open the doors remotely.

I'd developed the software side of things before the installation started, as ordered, and prototyped most of the sensors I was to install. Installation, however, took longer than planned due to many unforeseen circumstances and changes of plans, such as changes to the game plot, or materials turning out to be unsuitable for the changed task. It was also partly my own mistakes due to lack of experience - it was my first time installing this system on a real-life object, and thus a learning experience after all, with mistakes made and things taking more time than they would have if it was, say, my third escape room. It also turned out that I'm prone to underestimating the time taken by unexpected problems, as well as by work in general - I learned to count in time margins for hardware and software bugs, tiredness, unexpected changes of plans and innovations, and to try to communicate the current state of things to my boss. However, my boss didn't exactly understand the technical side of things, and trying to explain to him that software and hardware development problems do appear out of the blue and are hard to account for proved to be a task close to impossible in the end.
As my part of the project was coming to a conclusion, he decided that he didn't want me working there and fired me, without paying me, and hired a company to re-do the system from scratch. We had no written agreement between us, which turned out badly for me - I didn't receive half of the money. I had received half of the payment before I started the hardware work, but that doesn't cover me working three weeks of overtime - about 10-11 hours a day on average, with night shifts and so on. I had a plan for the case where I finished the system and didn't receive the payment, but not for the case where I was stopped before completing my work and told it wasn't needed. If I had had a contract, though, it turns out that any delay caused by me installing the system for too long would have had to be covered by me, or so he told me. It seems I had few ways of preventing this problem, except for requiring a contract before the work started and discussing it with my lawyer before signing - which seems like common sense now, but I just couldn't think of this particular problem, and all the other possible problems seemed to be covered by my precautions. Anyway, not getting paid for work I've done is not a thing I'll allow in the future. But this is not the important part.

It turns out I'm much closer to getting rid of depression than I thought. I had a depressive episode the day my dismissal was announced. Honestly, I was shocked. That night I had a lot of trouble sleeping, as well as thinking - I couldn't just accept the situation and search for ways to get out of it in the way most profitable for me. Not so the next day, when I managed to get my thoughts in order and started to formulate my opinion about the whole situation - I'm not a quick thinker when it comes to forming an opinion, since I'm cautious. The day after that, I started preparing to work on my next projects. Such a quick recovery would be impossible if I were depressed - I'm speaking from experience. This is big news for me. One of my reasons for continuing this project - supporting my mental state - is disappearing, together with my illness, which, I honestly thought, would take much, much longer to get through. This doesn't change how determined I am to finish this, it just means my goals change a little bit =)


Here's that "once I finish my job" moment. What am I gonna do now?

  1. I have to re-make many of my room's computers. My room is my workplace, with many services provided by different computers, from development and maintenance functions to commodity (i.e. printing/scanning/copying) and entertainment services. Therefore, if I don't have a workplace, it's gonna be quite complicated to take on any project. My estimate - one week (and that's with my improved estimation capabilities!)
  2. Then, I'm making a portable router using a Black Swift board - one of the AR9331 dev boards, this one being the smallest of any and IMO the fastest in terms of computing power. It's gonna be a fully custom router worn on your belt or so, able to work a day or two without being charged. My estimate - two weeks. The reason for taking this on first: network connectivity is gonna be absolutely crucial for my wearable setup.
  3. Once these are done, I'm continuing this project, and you'll know once I'm ready to start. I'll also post a timeline of when to expect it, as well as make a couple of announcements on other resources - I'm more and more sure this is gonna be a big thing. I'm also going to read through my previous posts, in case I made any (non-time-related) promises about the project, since those will have to happen first.


I'm going to post pictures of how things are going in the meantime, so look out! I'm happy I'm finally continuing this project, no matter how late I am - better late than never. See you, and expect updates soon!

Oh well, my keyboard somehow gets stuck, so I have to type by slamming the keys quite hard, and it still doesn't register certain keys sometimes =) I got my i5 laptop fixed - it was as easy as ordering the right part off eBay. Unfortunately, it has its quirks. But it's finally more comfortable to develop web interfaces, which is part of my job right now - see, the luakit developer console just doesn't work right =( It saves me some time searching for money with which to buy a used laptop to do my job right.


Had a hard day today. I was installing the first part of the sensor network I'm developing as my main project, noting down what I had forgotten to bring from home - and I took a lot of things, mind you. I had to take about 7 PSUs, a lot of wires and 4 computers - all that with a bike as my primary means of transportation =( I came home and felt very, very sleepy. I had a lot of things in mind to do, but I'd constantly find myself browsing FB on my phone, which my brain apparently thinks is a relaxing activity. However, I just had to eat something, take a shower, then fall asleep - only to get delayed by 2.5 hours as a result. I certainly need to monitor the time I'm wasting, as time wasted on 'relaxing' things is sometimes time stolen from sleep, or from the things you have to do when you wake up - as is the case now, since tomorrow I have to continue connecting the sensors.
So what can be done? With my wearable computer, I can imagine a daemon that you could send a command to, like 'I need to not get distracted now'. It'd then monitor your motions from the accelerometer, and maybe put some fast and energetic music on. If there was another daemon that collected and stored data about the average time you need to complete tasks, they could communicate to predict how much time it will take to do everything you need to do, and provide you with step-by-step instructions, as well as pick slower music as you get closer to going to bed. This is what I see, and this is what I know I need. I need a device that would ping me constantly to make me less prone to slacking, and I have the qualifications to build it.
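The time-prediction daemon above could be sketched very simply: keep per-task averages over recorded durations and sum them for a list of pending tasks. A minimal, purely illustrative sketch (all names and the 30-minute fallback for unseen tasks are my assumptions, not part of any existing tool):

```cpp
#include <map>
#include <string>
#include <vector>
#include <numeric>

// Hypothetical sketch of the "time prediction" daemon idea: store how long
// each task took in the past and estimate the total time a list of pending
// tasks will need, using per-task averages.
class TaskTimeEstimator {
    std::map<std::string, std::vector<double>> history;  // minutes per past run
public:
    void record(const std::string& task, double minutes) {
        history[task].push_back(minutes);
    }
    // Average of past runs; fall back to a default guess for unseen tasks.
    double estimate(const std::string& task, double fallback = 30.0) const {
        auto it = history.find(task);
        if (it == history.end() || it->second.empty()) return fallback;
        const auto& runs = it->second;
        return std::accumulate(runs.begin(), runs.end(), 0.0) / runs.size();
    }
    // Predicted total time for everything still on the list.
    double estimateAll(const std::vector<std::string>& tasks) const {
        double total = 0.0;
        for (const auto& t : tasks) total += estimate(t);
        return total;
    }
};
```

A real daemon would of course persist the history and feed the estimate into notifications or music selection, but the core bookkeeping is no more than this.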

Moreover, it'd work for depression, too. If the device can detect that the user is awake but still in bed, more often than not it means slacking, but under certain conditions it can be diagnosed as depression - and the device could do something about it, maybe at least give some advice. Not to mention that a device capable of not only tracking and reminding you to take your antidepressants, but also persuading you to take them, is something that could improve the healing process severely, as stopping antidepressants rarely means anything good - as with most pills, frankly.


Anyway, have some photos!

powerbank closed

powerbank open

Here's a powerbank that I've developed to power my portable Raspberry Pi on the go. It's 2 Li-ion cells from laptop batteries in series, with a step-down DC-DC to feed a stable voltage to the Pi efficiently, as well as a charger & protection module connected to each battery slot. It has a USB socket which can be kept inside the housing together with the USB plug connected to it, as you can see - that saves some broken wires! The casing is made from 3mm PMMA, and is too bulky to be wrist-worn. I'm already planning to make a waistbelt attachment, and I've made the wire long enough =) As for now, this gives 7 hours of uptime before the batteries are fully discharged, and that's with possibly the most power-hungry RPi board. Fitting an RPi B+ will double the rating, and I won't even mention the RPi A+ =) This is without many peripherals, though. It also needs to have its corners rounded, as well as to be painted black - as does this one:

first walls of main module - view from connectors
first walls for main module - view from GPIO pins


This is the base plus one wall of the Raspberry Pi casing, the main module of my creation. It's gonna have most of the sockets on the RPi accessible one way or another, as well as some space inside for USB peripherals and the Ethernet jack - and you can see the laser cutter I'm using =) I'll talk about minimizing it in the next posts, as well as show how you can connect many peripherals to it in a creative way that applies to any RPi.
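The seven-hour uptime figure for the powerbank above is plausible on the back of an envelope. A rough sketch, assuming illustrative numbers (2.2 Ah laptop cells, a 7.4 V nominal two-cell pack, 90% converter efficiency, a ~2 W load) rather than measurements from the actual build:

```cpp
// Rough runtime estimate for a powerbank like the one described above: two
// Li-ion cells in series feeding a step-down converter. All figures passed in
// (capacity, efficiency, load) are illustrative assumptions, not measurements.
double runtimeHours(double cellCapacityAh, double packVoltage,
                    double converterEfficiency, double loadWatts) {
    // Energy stored in the pack (Wh), scaled by converter efficiency,
    // divided by the load power, gives hours of uptime.
    double energyWh = cellCapacityAh * packVoltage;
    return energyWh * converterEfficiency / loadWatts;
}
```

With those assumed numbers this lands at roughly 7.3 hours, consistent with the observed runtime; a lighter board like the A+ draws far less, which is why the rating improves so much.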


My job supposedly ends at the end of October, but I can imagine it stretching a week further. Also, if any of the thoughts I express spark your interest, I'd be glad to hear about it in the comments!

During these weeks I experienced a lot of problems with the Emlid board and the servo controls.

The servos that move the control surfaces were not moving smoothly, and they made a lot of noise and vibrations. I initially thought that the problem was in the mechanics (linkages not moving smoothly or getting stuck during the movement).

After one of the servos started emitting some magic smoke, I started thinking the problem was actually in the PWM signal...

So I connected the BitScope to the PWM output, and I saw something really strange...


PWM output.png


The frequency of the PWM signal was incredibly high: 250 Hz, whereas it should have been 50 Hz.

It took me a lot of time to find out that there was one parameter (MOT_SV_SPEED) that was set by default to the wrong value!


PWM parameter.png


After I changed the value from 250 to 50, the PWM output was perfect.


PWM output 2.png


Now the servos (I mean, the surviving ones) run smoothly and without vibrations.
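For reference, the arithmetic explains the failure: hobby servos expect a 1-2 ms pulse repeated every 20 ms (50 Hz). At 250 Hz the period shrinks to 4 ms, so the same pulse occupies up to half of each period and many analog servos overheat - exactly the magic-smoke failure above. A small sketch of the numbers (not autopilot code):

```cpp
// Sanity-check helpers for servo PWM settings. Standard hobby servos expect
// a roughly 1.0-2.0 ms pulse repeated every 20 ms (50 Hz).

// Period of one PWM cycle in milliseconds for a given update frequency.
double periodMs(double frequencyHz) { return 1000.0 / frequencyHz; }

// Fraction of the period occupied by the control pulse, in percent.
// At 250 Hz a 2 ms pulse fills 50% of the period instead of 10%.
double dutyCyclePercent(double pulseMs, double frequencyHz) {
    return 100.0 * pulseMs / periodMs(frequencyHz);
}
```

Running the numbers: at 50 Hz a 2 ms pulse is a 10% duty cycle; at 250 Hz the same pulse is 50%, five times the energy the servo electronics were designed for.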

Hi again!

The Pip-Boy desktop prototype
As you may remember from my previous post, I was planning to start my work on this project at the beginning of October. Unfortunately, I cannot make it - my current job, which requires me to build a control system for Modbus devices, has had its deadline extended, and I am currently doing what I should have been doing 2 weeks ago: helping to plan all the wiring for the whole setup, making some new devices, searching for parts that are missing, and, most importantly, not yet receiving any money, as I'll get paid as soon as I install the system on the site, which isn't ready yet. I'll make a personal blog entry about the system some time later - I know it's gonna be interesting to many =)

Depression is a funny illness, in a way. In my case, there are still too many factors influencing me that don't let me get rid of it. But then, I don't feel any of it when I work on things I love working on, and that's coding and electronics. They distract me so well that I don't feel any of the bad things I usually feel. Instead, there's a sense of accomplishment. Well, the explanation is easy - one thing that depresses me is feeling unproductive. Those reasons are something I should be getting rid of, but as long as it helps me along the way, why not? Hope it helps me fight my procrastination. And with that I shall start my train of thought.


Have you ever actually played Fallout? I highly recommend it - though not if you have an important project coming up, mind you ;-) I started playing Fallout 3 two years ago, and I got stuck in it for two freaking months. I had a boring shift job, completely unrelated to what I love doing, but at least, hey, I received money, even if it was for sorting shitloads of dirty laundry every day! But that was boring. I didn't feel any achievement at all, and the pay wasn't that good compared to the amount of effort involved. So I went home every day, tired and needing some relaxation. As usual, it was videogames or similar stuff - I was installing lots of games to try out, but Fallout 3 was the one that caught my attention within the first hours of gameplay. I know I like open-world games, first-person shooters even more so, and I never liked RPGs much, but there was and still is something beautiful about the whole combination of these that Fallout is. Long story short, it consumed all my free time, and even a little more, given how boring the job was ;-) What was fascinating about the whole thing?
The sense of achievement. The whole thing we call gamification wasn't a new idea to me, but it was a hella good motivator, and every second of gameplay felt worth it. What's the matter? Well, I'd say we people generally like achieving, and we like to be reminded about those achievements. Or maybe it's about how people like quantifying their success... Crap, I'm not a psychologist and I don't know the explanation by heart =) But I'm sure you've felt that way, at least once, if not regularly, and can understand what I mean. Basically, we need achievements, reminders about our successes and victories, and similar motivational things to keep us going. Or is it just me? =D


Gamification was the reason I started to think the Pip-Boy is worth bringing to the real world. The Pip-Boy, which had a task manager I've come to love so much. For the non-enlightened: it was all about the in-game quests, and it happened to be so well built into the gameplay that it became an integral part of it. I remember missing it in Vault 112 (wasn't it the one with the Tranquility Lane simulation?), always being amazed by how well-built the interface was while using it, and discovering new functions when I thought I knew everything. Say your life consists of some quest lines, each having its own tasks, sub-tasks and maybe some optional things to do that might influence the outcome. You can see what you need to do at the moment, what you can switch to doing, and what's needed for what purpose. That's not exactly a new thing - that's why people keep all those task books, notebooks, whatever those paper calendar books with blank spaces for describing TODOs for every day are called :-) But then, those can hardly be gamified! It's just a plain boring notebook. Moreover, it's not electronic, and therefore it doesn't have notifications, achievements and task trees I call quests - not without some scissors-and-glue magic ;-) There are phone apps for task management, but 1) I still don't have a smartphone, 2) I don't want to keep my data in the cloud, and 3) I don't want to rely on a third-party application to provide the exact functions I need when I know I can make my own, better one. There are more reasons, and I'm yet to list them all for you - but one thing's for sure: I need my own task manager, Fallout style.


Enough about the reasons. While I still have time, let me describe what I've got.
As of now, it's a rechargeable battery-powered Raspberry Pi 2B with a simple yet sophisticated framework for managing simple input devices. The framework is a crucial part of the system and will be something I'll base all my software on. You can read more about it here; I'd planned to put a short description here on Element14, but right now I've got no time for writing, so a link will do =) I wrote the whole description there because I needed something to push me to work, and a Hackaday Prize deadline looked worth it. No way I could have gone further than the application form, but at least I'll likely get a T-shirt with the Hackaday logo, and if that's not worth it, then I don't even know ;-) I still plan on describing it here, just not today.
Another Pip-Boy photo

Right now, as I've told you, it's all cardboard outside, meant purely for protection from occasional wind and rain. See, it travels with me every day in the rear pocket of my backpack - for now, as a music player =) Music playback is one of the functions I've already implemented because I needed it desperately. I don't actually have anything capable of music playback all day long that fits in my pocket and is comfortable to use, and the real problem is I get depressed without music. Two months ago, I'd come home only to feel completely powerless and unable to do anything, then fall asleep - only to regret my lack of productivity the next day. Music in the background changes everything, though. I feel better and I'm able to do more things, and if all that stands between me and well-being is making a Raspberry Pi battery-powered and writing a music playback control system, I'll do it.


It's gonna change this week, however. As you can deduce from the first pic with the Vault Boy badge, I've got access to a laser cutter, and it's completely free for me. I've got a part-time job maintaining said cutter, and I get to play with it too =) I've ordered a sheet of acrylic that I'll use to make a case, and I plan on making the first wrist-worn version by Monday, possibly with minor app updates. I wish there were more to tell you, but I already spend too much time on my main project. For now, keeping you updated =)

In this post we will look at some image processing techniques, like blob analysis.

I will use the cvBlob library to detect blobs in the images captured by the RaspiCam.


Step 1: cvBlob installation

1. Download the cvBlob library from this link


          and save it to


2. Because I downloaded the zip file, I need to unzip it

               tar xvf

3. We need to do some bug fixing. So move to the source code directory

cd cvblob

cd cvBlob

and make the following changes:

a. cvblob.h: at line 84, replace

const char cvChainCodeMoves[8][2] = { {0, -1},

with

const signed char cvChainCodeMoves[8][2] = { {0, -1},

b. cvlabel.cpp: at line 34, replace

const char movesSE[4][3][4] = <line continues...>

with

const signed char movesSE[4][3][4] = <line continues...>

c. cvlabel.cpp: at line 40, replace

const char movesI[4][3][4] = <line continues...>

with

const signed char movesI[4][3][4] = <line continues...>

4. Build the cvBlob library

cd ..
cmake .
make


5. And finally, let's install the cvBlob library

make install

Step 2: develop an application that can detect blobs in images

  1. I first created some utility functions to capture images from the RaspiCam. You can find these functions in raspistill.h and raspistill.c (everything is under development, so some code cleanup may be required..). Thanks to the raspistill functions, the main application is very easy to implement:
raspistill_shot(&state, camera_callback);     

raspistill_shot takes a photo and invokes the camera_callback function. The only parameter passed to this callback is an IplImage object that contains the captured image.

2. The next step is to implement functions for detecting blobs in the image. These functions are implemented in blobs.c and blobs.h. There are two functions currently implemented:

detectBlobs(IplImage* image, struct ImageParams* imageParams)           


detectBlobsWithImage(IplImage* image, struct ImageParams* imageParams)               

The second one shows the results of the image processing in a set of windows, so I'm going to use the latter for this demo, because it's more impressive than some text output on a console. This function basically takes the image captured by the RaspiCam and creates an image (named segmentated) where the pixels whose color is inside the given RGB range are white and the other pixels are black:

IplImage* segmentated;
cvInRangeS(image, CV_RGB(155, 0, 0), CV_RGB(255, 130, 130), segmentated);

With the segmentated image, we can detect blobs:

IplImage* labelImg;
CvBlobs blobs;

labelImg = cvCreateImage(cvGetSize(image), IPL_DEPTH_LABEL, 1);
result = cvLabel(segmentated, labelImg, blobs);

3. Let's bring everything together and make a loop that periodically invokes the raspistill_shot function, so that we can experiment with the camera and its behavior in different light conditions.

4. Let's build the application with the usual sequence of commands

cd build
cmake ..
make



Sci Fi Your Pi - Prince Dakkar's patent log taking chart compass

Sci Fi Your Pi - Prince Dakkar's patent log taking chart compass - Functional Design

Sci Fi Your Pi - Prince Dakkar's patent log taking chart compass - Route selection and indication

Sci Fi Your Pi - Prince Dakkar's patent log taking chart compass - Direction of Travel Indicator 1

Sci Fi your Pi - Prince Dakkar's patent log taking chart compass - Current Position

Sci Fi your Pi - Prince Dakkar's patent log taking chart compass - GPS Test

Sci Fi your Pi - Prince Dakkar's patent log taking chart compass - Direction of Travel Indicator 2

Sci Fi Your Pi - Prince Dakkar's patent log taking chart compass - The end.......for now


So the winner was announced this week - Sci Fi Your Pi: Winner Announced! - and it was a well-deserved win by Meditech by balearicdynamics. Even if I had had more time, I am not sure I would have been able to compete with this project or the other finalists; there were some really amazing entries.

Seeing the announcement reminded me that I said I would continue with the project despite the deadline having passed, so here it is. I have been working on this a little and have managed to get the full direction-of-travel arrow circuit working. The system now takes the current position and lights the appropriate arrow(s) based on the direction to be traveled to reach the destination. In the picture below, the arrows showing are South and East, as I am north-west of the destination.




Currently the destination is fixed, so the next stage is to work on adding the destination selection circuits. I have added a test circuit to the breadboards to test the idea, and added the map travel LEDs as well, so I can test that functionality too.




So far I have set up the circuit, and it uses a pair of jumper wires rather than the planned wheels to select the destination and start locations. The LEDs that represent the correct route light up, indicating the route to be taken. The next step is to also use this input to set the destination for the direction-of-travel arrow.
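The arrow-selection logic described above can be sketched like this - a simplified flat-map comparison of current position and destination (the project's actual code may differ):

```cpp
#include <string>
#include <vector>

// Compare the current position with the destination and return the arrows
// to light for the direction of travel. Being north-west of the destination
// means travelling south and east, as in the example above. A simplified
// sketch: latitude/longitude are compared directly, with no great-circle math.
std::vector<std::string> travelArrows(double curLat, double curLon,
                                      double dstLat, double dstLon) {
    std::vector<std::string> arrows;
    if (dstLat > curLat) arrows.push_back("North");
    if (dstLat < curLat) arrows.push_back("South");
    if (dstLon > curLon) arrows.push_back("East");
    if (dstLon < curLon) arrows.push_back("West");
    return arrows;
}
```

In hardware each returned direction maps to one GPIO pin driving the corresponding arrow LED; up to two arrows can be lit at once (e.g. South + East).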


I may have made a little more progress, but it was my birthday, so I was a little distracted by a new arrival - a Raspberry Pi touchscreen (and a Flotilla case) from Pimoroni.




I was also a little distracted by my AgoBo robot project. A little while back I added a seven-segment display mouth and decided it used too many pins. I spent a while soldering in a shift register but hadn't got around to getting it running again. This week AgoBo got his smile back.



Previous posts for this project:


I have been updating my code for the QuadCop on GitHub. I am new to GitHub, and many of my changes were not being committed.


So for anyone looking at the code, it may make more sense now. ControlSwitch_32 was very outdated and was missing the head control logic.


On another note, I have some parts ordered! I need a new camera and am going with the Pi NoIR.


I'm busy this week, but next week it's back to the project! I'm going to make a video of the sensor array in action, something I didn't get to before the challenge ended.


The head code is here. It basically moves the head left and right, to a random point, at random times. It only updates every few seconds. It needs to be a bit more complex, but I plan to use a 360-degree servo.


Keep in mind this code is for the ChipKit Pi.


//A function that randomly moves the head around
void MoveHead()
{
    if (millis() - headLastUpdate > headDelay)
    {
        // Move one step in the current direction
        if (headDirection == 0)
            headCurrentPWM -= headSpeedFactor;
        else
            headCurrentPWM += headSpeedFactor;

        // Occasionally pick a new direction, speed and delay at random
        if (random(1, 10000) < 30)
        {
            headDirection = (headDirection + 1) % 2;  // toggle direction
            headDelay = random(1, 3) * 1000;
            headSpeedFactor = random(1, 6);
        }
        else
        {
            headSpeedFactor = 10;
            headDelay = 10;
        }

        // Reverse direction when an end stop is reached
        if (headCurrentPWM >= MAXPWM || headCurrentPWM <= MINPWM)
        {
            if (headDirection == 0)
            {
                headDirection = 1;
                headCurrentPWM = MINPWM + 10;
            }
            else
            {
                headDirection = 0;
                headCurrentPWM = MAXPWM - 10;
            }
            headSpeedFactor = random(1, 3) * 10;
            //headCurrentPWM = headStopPointPWM;
            //Wait up to 3 seconds before moving again
            headDelay = random(1, 3);
            headDelay *= 1000;
        }
        headLastUpdate = millis();
    }
}



Here is the ControlSwitch code I was using before the challenge ended.  Still messy.

#include <Wire.h>
#include <SoftPWMServo.h>

#define MINPWM 1200
#define MAXPWM 2000
//PWM output to Flight controller (PWM write)
// 3 - Rx Channel 1 (Left and Right)
// 5 - Rx Channel 2 (forward and reverse)
// 6 - Rx Channel 3 (climb and dive)
// 9 - Rx Channel 4 (rotate left and right)
//PWM output from ChipKit to Flight Controller
#define FLIGHTCONTROL_X 3
#define FLIGHTCONTROL_Y 5
#define FLIGHTCONTROL_Z 6
#define FLIGHTCONTROL_R 9
//PWM output to Head Servo
#define HEADSERVO  A1

//PWM input from Rx to Chipkit.
#define RXCHANNEL1 8
#define RXCHANNEL2 9
#define RXCHANNEL3 10
#define RXCHANNEL4 11
#define RXCHANNEL6 13
#define RXCHANNEL5 12
//Output pins to RPi for auto and macro mode switching
#define RPIAUTOMODE   2
//Move MACROMODE to RPI Read
// The SPEED definition is the amount to increment or decrement
// the PWM neutral point in order to move in that direction.
// In the case it is not enough to cause movement due to wind
// or other issues, the code will increment slowly by FASTER
// or decrement by SLOWER, which is a small value.  This will
// allow the quad to make small adjustments in order to
// compensate without being over controlled.
#define SPEED  100 
#define STOP            1500
#define STOPHOVER       1800

//an integer to indicate directions

//Climbing and diving is done slower

//Adjustment variables for block and controlbyte
#define  LEFTADJUST    3
#define  RIGHTADJUST   4
#define  CLIMBADJUST   5
#define  DIVEADJUST    6
#define  RRIGHTADJUST  7
#define  RLEFTADJUST   8
#define  ADJUSTFASTER  10
#define  ADJUSTSLOWER  20
//Global Variables

//Speed Vars
int xSpeed = 0;
int ySpeed = 0;
int zSpeed = 0;
int rSpeed = 0;

//Servo pins for PWM output to the flight controller
bool autoMode = false;
bool autoModeInProgress = false;
bool manualModeInProgress = false;
bool forceManual = false;
bool serialOut = true;
bool macroMode = false;
bool macroModeInProgress = false;
bool forward = false;
bool reverse = false;
bool left = false;
bool right = false;
bool climb = false;
bool dive = false;
bool rLeft = false;
bool rRight = false;
bool controlByteChanged = false;
bool controlByteBad = false;
bool ledOn = false;
bool heartBeatChecked = false;
unsigned char controlByte;
unsigned char lastControlByte;
unsigned char temp;
char *r;

//NEW block method of sending data
#define STARTBLOCK 204
#define STOPBLOCK 190
#define RESETBLOCK 195
bool blockStarted = false;
int block[10];
int blockCounter = 0;
bool blockCompleted = false;
bool blockBlock = false;
int blockSkipped = 0 ;

//Head control variables
long headCurrentPWM = STOP;
int headSpeedFactor = 100;
int headDirection = 1;
long headLastUpdate = millis();
long headStopPointPWM = MAXPWM;
int headStopPoint = 0;
int headDelay = 5000;
int headDivFactor = 1;
//A function that randomly moves the head around
void MoveHead()
{
    if(millis() - headLastUpdate > headDelay)
    {
        if(headDirection == 0)
            headCurrentPWM -= headSpeedFactor;
        else
            headCurrentPWM += headSpeedFactor;

        if(random(1,10000) < 30)
        {
            headDirection %= 2;
            headDelay = random(1,3) * 1000;
            headSpeedFactor = random(1,6);
        }
        else
        {
            headSpeedFactor = 10;
            headDelay = 10;
        }

        if(headCurrentPWM >= MAXPWM || headCurrentPWM <= MINPWM)
        {
            if(headDirection == 0)
            {
                headDirection = 1;
                headCurrentPWM = MINPWM + 10;
            }
            else
            {
                headDirection = 0;
                headCurrentPWM = MAXPWM - 10;
            }
            headSpeedFactor = random(1,3) * 10;
            //headCurrentPWM = headStopPointPWM;
            //Wait up to 3 seconds before moving again
            headDelay = random(1,3);
            headDelay *= 1000;
        }
        headLastUpdate = millis();
    }
}

bool ControlByteCheck(unsigned char cb, unsigned char cbc)
{
  bool valid = true;
  ////////Serial1.print("CB: ");
  ////////Serial1.print("CBC: ");
  if(cb % 17 != cbc)
    valid = false;
  //Check for opposing motions, which may mean there is an issue
  //forward and reverse requested
  if(cb % 2 && (cb>>1) % 2)
    valid = false;
  //left and right requested
  if((cb>>2) % 2 && (cb>>3) % 2)
    valid = false;
  //rotate left and rotate right requested
  if((cb>>4) % 2 && (cb>>5) % 2)
    valid = false;
  //climb and dive requested
  if((cb>>6) % 2 && (cb>>7) % 2)
    valid = false;
  return valid;
}
inline int ParseControlByte()
{
  int cb = controlByte;
  rRight = false;
  rLeft = false;
  forward = false;
  reverse = false;
  left = false;
  right = false;
  climb = false;
  dive = false;

  if(cb & 1)
  {
    rRight = true;
    rSpeed = ROTATERIGHT;
  }
  cb >>= 1;
  if(cb & 1)
  {
    rLeft = true;
    rSpeed = ROTATELEFT;
  }
  cb >>= 1;
  if(cb & 1)
  {
    dive = true;
    zSpeed = MOVEDIVE;
  }
  cb >>= 1;
  if(cb & 1)
  {
    climb = true;
    zSpeed = MOVECLIMB;
  }
  cb >>= 1;
  if(cb & 1)
  {
    right = true;
    xSpeed = MOVERIGHT;
  }
  cb >>= 1;
  if(cb & 1)
  {
    left = true;
    xSpeed = MOVELEFT;
  }
  cb >>= 1;
  if(cb & 1)
  {
    reverse = true;
    ySpeed = MOVEREVERSE;
  }
  cb >>= 1;
  if(cb & 1)
  {
    forward = true;
    ySpeed = MOVEFORWARD;
  }
  //Set default speeds here
  if(!forward && !reverse)
    ySpeed = STOP;
  if(!left && !right)
    xSpeed = STOP;
  if(!climb && !dive)
    zSpeed = STOPHOVER;
  return 0;
}

void PrintDirections()
{
}


void I2CReceiveEventBlock(int numBytes)
{
  //Every 2 bytes are our data pairs, writes come in groups of 3
  unsigned char cb, cbc, reg;
  cb = Wire.receive();
  //Block Start
  if(cb == 204)
  {
    blockStarted = true;
    blockCounter = 0;
  }
  //Block End
  else if(cb == 190)
  {
    blockStarted = false;
    blockCompleted = true;
    blockBlock = true;
  }
  //Block Reset
  else if(cb == 195)
  {
    blockStarted = false;
    blockCompleted = false;
    blockCounter = 0;
  }
  else
  {
    block[blockCounter++] = cb;
    blockCounter %= 10;
  }
}

void ProcessBlock()
{
  int reg = block[0];
  if(reg == 22)
  {
    //heartbeat check
    heartBeatChecked = true;
  }
  else if(reg == 60)
  {
    lastControlByte = controlByte;
    controlByte = block[1];
    controlByteChanged = true;
    controlByteBad = false;
  }
  else if(reg == 90)
  {
    //Speed Change
    //block[1] is the direction and block[2] is to indicate speedup or slowdown
    int direction = block[1];
    int adjust = 0;
    if(block[2] == ADJUSTFASTER)
      adjust = FASTER;
    if(block[2] == ADJUSTSLOWER)
      adjust = SLOWER;
    if(direction == FORWARDADJUST || direction == REVERSEADJUST)
      ySpeed += adjust;
    if(direction == LEFTADJUST || direction == RIGHTADJUST)
      xSpeed += adjust;
    if(direction == RRIGHTADJUST || direction == RLEFTADJUST)
      rSpeed += adjust;
    if(direction == CLIMBADJUST || direction == DIVEADJUST)
      zSpeed += adjust;
    controlByteChanged = true;
    controlByteBad = false;
  }
  else
  {
    controlByteBad = true;
  }
  blockBlock = false;
}

void setup()
{
  r = new char[20];
  //Random numbers for Head movement.
  //Setup RPi input pins
  //Setup PWM pins going to flight controller

  //Setup PWM input pins from RX
  //Center all the servos
  SoftPWMServoServoWrite(FLIGHTCONTROL_X, STOP);
  SoftPWMServoServoWrite(FLIGHTCONTROL_Z, MINPWM);
  SoftPWMServoServoWrite(FLIGHTCONTROL_R, STOP);
}

inline int ReadPWM2(int pin)
{
  unsigned long m1, m2;
  int d = 0;
  unsigned long functionStart = millis();
  //for(int i=0;i<2 && millis()-functionStart <= READPWMMAXDELAY;i++)
  //wait for the pin to go low
  while(digitalRead(pin) == HIGH && millis()-functionStart <= READPWMMAXDELAYLOW);
  while(digitalRead(pin) == LOW && millis()-functionStart <= READPWMMAXDELAYLOW);
  m1 = micros();
  while(digitalRead(pin) == HIGH && millis()-functionStart <= READPWMMAXDELAYLOW);
  m2 = micros();
  d = (m2-m1);
  //d /= 2;

  ////////Serial1.print("pin: ");
  //if(d < MINPWM || d > MAXPWM)
  //  d = 0;
  if(millis() - functionStart <= READPWMMAXDELAYLOW)
    return d;
  else
    return 0;
}

char * CToS(unsigned char c)
{
  char temp[20];
  char t;
  int i = 0;
  while(c > 0)
  {
    t = c % 10;
    c = c / 10;
    t = t + '0';
    temp[i] = t;
    i++;
  }
  int q = 0;
  for(int j=i-1;j>=0;j--)
    r[q++] = temp[j];
  r[i] = 0;
  return r;
}
//More Global Vars for loop
int channel1;
int prevChannel1 = 0;
int channel2;
int prevChannel2 = 0;
int channel3;
int prevChannel3 = 0;
int channel4;
int prevChannel4 = 0;
int channel6;
int channel5;

unsigned int channel1Errors = 0;
unsigned int channel2Errors = 0;
unsigned int channel3Errors = 0;
unsigned int channel4Errors = 0;
unsigned int channel5Errors = 0;
unsigned int channel6Errors = 0;
bool servo1Removed = false;
char controlChar = 0;

void loop()
{
  unsigned int a = 0;
  int channel6LastChecked = 0;
  // autoMode = false;
  //Move the Head.

  if(!autoMode || forceManual)
  {
    //////Serial1.println("Entering Manual Mode");
    manualModeInProgress = true;
    autoModeInProgress = false;
    //////Serial1.println("Heartbeat Checked");
    heartBeatChecked = false;
    //////Serial1.print("Bytes REceived: ");
    controlByteChanged = false;

    //Main manual mode logic
    if(channel1Errors < 50)
      channel1 = ReadPWM2(RXCHANNEL1);
    if(channel1 == 0)
      channel1Errors = 0;

    if(channel1Errors < 50 && channel1 != 0 && abs(prevChannel1 - channel1) > SERVODEADBAND)
    {
      //Serial1.println(abs(prevChannel1 - channel1));
      prevChannel1 = channel1;
      SoftPWMServoServoWrite(FLIGHTCONTROL_X, channel1);
      servo1Removed = false;
    }
    if(channel1Errors > 50)
    {
      //Serial1.println("SERVO 1 REMOVED");
      servo1Removed = true;
      channel1Errors %= 5000;  //After 500 cycles see if the servo comes back up.
    }

    if(channel2Errors < 50)
      channel2 = ReadPWM2(RXCHANNEL2);
    if(channel2 == 0)
      channel2Errors = 0;
    if(channel2 != 0 && abs(prevChannel2 - channel2) > SERVODEADBAND)
    {
      SoftPWMServoServoWrite(FLIGHTCONTROL_Y, channel2);
      prevChannel2 = channel2;
    }

    if(channel3Errors < 50)
      channel3 = ReadPWM2(RXCHANNEL3);
    if(channel3 == 0)
      channel3Errors = 0;
    if(channel3 != 0 && abs(prevChannel3 - channel3) > SERVODEADBAND)
    {
      SoftPWMServoServoWrite(FLIGHTCONTROL_Z, channel3);
      prevChannel3 = channel3;
    }

    if(channel4Errors < 50)
      channel4 = ReadPWM2(RXCHANNEL4);
    if(channel4 == 0)
      channel4Errors = 0;
    if(channel4 != 0 && abs(prevChannel4 - channel4) > SERVODEADBAND)
    {
      SoftPWMServoServoWrite(FLIGHTCONTROL_R, channel4);
      prevChannel4 = channel4;
    }

    channel6 = ReadPWM2(RXCHANNEL6);
    if(channel6 == 0 || channel6 >= STOP)
    {
      autoMode = true;
      if(channel5Errors < 50)
        channel5 = ReadPWM2(RXCHANNEL5);
      if(channel5 == 0)
        channel5Errors = 0;
      if(channel5 > STOP)
      {
        macroMode = true;
        //////Serial1.println("Entering Macro Record Mode.");
        macroModeInProgress = true;
      }
      else
      {
        macroModeInProgress = false;
        macroMode = false;
        //////Serial1.println("Leaving Macro Record Mode.");
      }
    }
  }
  else
  {
    //Main automode logic here
    //////Serial1.println("Entering AUTOMODE");
    autoModeInProgress = true;
    manualModeInProgress = false;
    macroMode = false;
    //automode get directions from Rpi
    //Do Testing
    if(blockBlock && blockCompleted)
    {
      ////////Serial1.print("Skipped: ");
      controlByteChanged = false;

      SoftPWMServoServoWrite(FLIGHTCONTROL_X, xSpeed);
      SoftPWMServoServoWrite(FLIGHTCONTROL_Y, ySpeed);
      SoftPWMServoServoWrite(FLIGHTCONTROL_Z, zSpeed);
      SoftPWMServoServoWrite(FLIGHTCONTROL_R, rSpeed);
    }

    if(millis() - channel6LastChecked > 2000)
    {
      channel6 = ReadPWM2(RXCHANNEL6);
      if(channel6 != 0 && channel6 < STOP)
        autoMode = false;
      channel6LastChecked = millis();
    }
  }
}


After some busy times, I finally have a weekend to work on this project.

One of the ancillary parts of this project is to run some blob analysis on the images captured by the RaspiCam to detect laser beams. So I need to install the OpenCV libraries and try to get images from the RaspiCam inside a C program.

The idea is that an external application will command the autopilot subsystem (by means of MAVLink messages) and will detect reflected laser beams to trigger some special effects.

So here are the steps to install OpenCV.


1. Install the RaspiCam. The installation procedure is very well described on the official Raspberry Pi site. Once installed, test it with this command:

raspistill -t 10000

2. Download the MMAL library and RaspiVid/RaspiStill source code (the userland repository).

3. Unzip the file and copy the directory userland-master into /opt/vc. I also renamed the directory to "userland".

4. Go to /opt/vc/userland and type

     sed -i 's/if (DEFINED CMAKE_TOOLCHAIN_FILE)/if (NOT DEFINED CMAKE_TOOLCHAIN_FILE)/g' makefiles/cmake/arm-linux.cmake
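That sed command inverts the toolchain-file guard in arm-linux.cmake so the makefiles can be used for a native (on-Pi) build. A quick demonstration of the substitution on a scratch copy (temporary path, not the real file):

```shell
# Reproduce the edit on a scratch copy of the guard line.
printf 'if (DEFINED CMAKE_TOOLCHAIN_FILE)\n' > /tmp/arm-linux.cmake
sed -i 's/if (DEFINED CMAKE_TOOLCHAIN_FILE)/if (NOT DEFINED CMAKE_TOOLCHAIN_FILE)/g' /tmp/arm-linux.cmake
cat /tmp/arm-linux.cmake   # now reads: if (NOT DEFINED CMAKE_TOOLCHAIN_FILE)
```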

5. create a build directory...

     sudo mkdir build

     cd build

6. ... and compile

     sudo cmake -DCMAKE_BUILD_TYPE=Release ..
sudo make install

7. Go to /opt/vc/bin and test one of the binaries by typing: ./raspistill -t 3000

OK... now I'm ready to create a new project


1. create a new folder (/home/pi/cv) in your home directory

     mkdir cv
     cd cv

2. copy all raspicam apps source code

     cp /opt/vc/userland/host_applications/linux/apps/raspicam/*  .

     mv RaspiStill.c camcv.c

     sudo chmod 777 camcv.c

3. remove the content of CMakeLists.txt and replace it with:
     cmake_minimum_required(VERSION 2.8)
     project( camcv )


     add_executable(camcv RaspiCamControl.c RaspiCLI.c RaspiPreview.c camcv.c RaspiText.c RaspiTexUtil.c gl_scenes/teapot.c gl_scenes/models.c gl_scenes/square.c gl_scenes/mirror.c gl_scenes/yuv.c gl_scenes/sobel.c tga.c)
     target_link_libraries(camcv /opt/vc/lib/ /opt/vc/lib/ /opt/vc/lib/ /opt/vc/lib/ /opt/vc/lib/ /opt/vc/lib/ /opt/vc/lib/ )

4. delete the CMakeFiles directory if it exists
5. compile & test
     cmake .
     make
     ./camcv -t 1000

We are now ready to install OpenCV

1. install both the dev lib and the Python lib. Even if I'm going to use pure C for development, Python is still useful for small scripts, so I recommend installing it.

     sudo apt-get update

     sudo apt-get install libopencv-dev
     sudo apt-get install python-opencv

2. to test whether the OpenCV library is installed correctly, write this test program. It just displays a picture using the imread and imshow functions. You will need to provide a sample .jpg file. To compile using the OpenCV lib, create a CMakeLists.txt file with the following lines:

     cmake_minimum_required(VERSION 2.8)
     project( displayimage )
     find_package( OpenCV REQUIRED )
     add_executable( displayimage display_image.cpp )
     target_link_libraries( displayimage ${OpenCV_LIBS} )
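The display_image.cpp that goes with this CMakeLists.txt can be very small; a minimal sketch (the window title and error messages are my own choices) might look like:

```cpp
#include <cstdio>
#include <opencv2/core/core.hpp>
#include <opencv2/highgui/highgui.hpp>

int main(int argc, char** argv)
{
    if (argc != 2) {
        std::printf("usage: displayimage <image.jpg>\n");
        return 1;
    }
    // Load the sample picture; flag 1 = force a 3-channel colour image.
    cv::Mat image = cv::imread(argv[1], 1);
    if (image.empty()) {
        std::printf("could not open or find %s\n", argv[1]);
        return 1;
    }
    cv::namedWindow("Display Image", cv::WINDOW_AUTOSIZE);
    cv::imshow("Display Image", image);
    cv::waitKey(0);  // wait for a key press before closing the window
    return 0;
}
```

Note that imshow needs a running X session on the Pi (it opens a window), so run it from the desktop or over VNC rather than a bare SSH console.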

3. compile and execute

     cmake .
     make
     ./displayimage sample.jpg

4. Modify your CMakeLists.txt to include the OpenCV library

     project( camcv )


     find_package( OpenCV REQUIRED )




     add_executable(camcv RaspiCamControl.c RaspiCLI.c RaspiPreview.c camcv.c RaspiText.c RaspiTexUtil.c gl_scenes/teapot.c gl_scenes/models.c gl_scenes/square.c gl_scenes/mirror.c gl_scenes/yuv.c gl_scenes/sobel.c tga.c)
     target_link_libraries(camcv /opt/vc/lib/ /opt/vc/lib/ /opt/vc/lib/ /opt/vc/lib/ /opt/vc/lib/ /opt/vc/lib/ /opt/vc/lib/ ${OpenCV_LIBS})

5. Recompile. It should be ok, because we didn't change anything in the source code



In the next post I will make a basic application that captures an image from the RaspiCam and does some video processing using the power of the OpenCV library.

My QuadCOP is fully repaired. However, there appears to be one casualty: the Pi Cam.


I am getting the infamous Pi Cam error, and all the normal workarounds are failing for me. When the QuadCOP "landed", the little eye/bubble on the head filled with water, since it landed in a puddle. Apparently sound sensors don't detect water real well. I am not sure if that killed the cam or if the jolt from the ribbon being pulled out did. I confirmed the ribbon is OK. If I unplug the little brown cable from the black camera (you can see it in the pic), I get the same error, so I think the camera itself is shot, but the breakout board is functioning.


I plan to get more flights in and continue, but I want to order another Pi Cam to get onboard video. So stay tuned! I also need to update my GitHub repo with some more mods I have been making to try to stabilize it.





The Meditech project has moved on to phase 1. This phase will produce the first testing version of the device, meaning a version ready for testing with volunteers. To bring the project and the prototype to the end of this step, a series of essential upgrades to the current version has been identified. While Meditech phase zero aimed to reach a fully working prototype regardless of its appearance, the phase 1 testing version will be the first complete working prototype.


This introductory post points out the first series of changes that will be applied to the current running prototype to make it work reliably in a test environment. Further improvements will be made to the internal architecture and the software; the release date for the phase 1 prototype is expected around mid-to-late October.



The current prototype display was 15 inches, so an external support was needed. The main issues with this display are its weight and portability: the device can easily be put in place, but an extra transport bag is needed to carry the screen, and this reduces the usability of a system that should be set up in normal working conditions in a very short time.

IMG_20150909_113052.jpg IMG_20150909_113126.jpg

How the display currently fits, and the back support.


The obvious solution, mentioned in the posts related to the screen, is a smaller 10" or 7" screen that will fit well in the middle of the Meditech device when it is closed. The following image shows the free space for the display, which is about 40x20 cm. A custom support will have to be created, with the mechanics to slide the screen into its usage position.



The newly announced 7" touch screen for Raspberry Pi may be a good solution, also thanks to the good price at which it is offered on the market.


Control panel cover

In the first version of the prototype, the control panel cover was made from two halves of soft 2 mm plastic. The following image shows how it looks now; this material greatly simplified the placement of the components on the surface (the GPS and a couple of jack plugs are still missing in the image), as it is very simple to drill and cut.



The next version will be precisely cut and milled from a 1 mm white aluminium plate; the material is shown in the image below. This also solves the problem of labelling the plugs and signals: a full adhesive overlay will be made and applied, using the same material used to put advertising images on car surfaces. Given the size of the surface, this solution will be effective at a very low cost (about 3-5 Euro).




As shown in the following image, the current version uses common commercial cables that are much longer than needed. A series of custom cables will be wired, saving a lot of space. The power distribution will also be improved with a better single-board circuit, and wherever possible the round cables will be replaced by flat ones.



Internal devices and components

The internal devices will also undergo a thorough revision.


  • Make more reliable supports for the Raspberry PI devices
  • A single board including all the control panel components (fan control, IR controller, LCD, LEDs)
  • Rationalise the power supply, powering the Raspberry PI boards via the GPIO connector instead of the micro USB plugs
  • Use flat cables wherever possible
  • Optimise the network cabling
  • Improve the door-open switch


I'm one of those people who was selected to participate in this challenge. Unfortunately, my involvement in it was close to zero. I had a hard time fulfilling my obligations to the Element14 community and the participants of this contest. I had external circumstances that made it quite problematic for me to participate, the most important being clinical depression, which I still feel I'm yet to get rid of even though it's been a year since I started my medication. Unfortunately, a relapse happened due to my girlfriend's severe health issues, and it complicated nearly every aspect of my life, nearly leading to me dropping out of my university - thankfully, I got a last chance to reinstate and managed to do so. It crippled my ability to manage my work as well as I thought I could when I was applying for this challenge. I apologise for letting you all down; it's solely my fault.
That doesn't mean I haven't done any work on my project, though. I have managed to put the interfacing part together and even exceed my own expectations of what it could become. What I have now is a framework that exposes a display and keypad combo to an arbitrary number of applications which can then make use of them. This makes it a simple yet effective control system for any Linux-driven device, from Raspberry Pi boards to desktops, routers and other Linux-running home appliances, and it's astonishingly cheap - I can add this system to anything for around $10. You can write various applications for it, for example, one that lets you control your network interfaces or connect to WiFi, one that lets you read your Twitter feed, one that controls your home automation setup and one for general system control such as shutting down or rebooting the whole system. And I'll make it so that applications will be really simple to write, possibly producing a couple of general-purpose tools in the process =) I just wish I could have completed and presented it earlier; then I think I could've seen it being used in some of the projects you guys have developed.
Right now, I have a LiIon-powered Raspberry Pi setup that I use both as my desktop computer (hey, I submitted this blog entry using it ;-) ) and a portable music player, and I have written an application that enables me to switch tracks and control the volume of my music player. The next application I'll write will probably be a task manager, the kind we use for keeping schedules and so on - or maybe a camera app, because as for now I don't have any appliance with a camera, except for my 5-year-old tablet with a 2MP one =) The system already has quite a lot of planning put into it, and I'll soon add lots of crucial elements facilitating application creation, as well as rebuild parts of the system before it's too late, since its functionality is already limited in some aspects compared to what it could achieve.
I plan on taking the next month off to work on it, and I'm sure I'll be posting my results here. See, I'm currently bound by a contract that requires me to build a control system for 30 Modbus-controlled devices, operating them by a given scenario - as well as build a lot more of those devices =) It sure is a fun thing to implement in Python, and it will allow me to live at least the next month without worrying much about money, which is what's necessary now for me to be able to fully dedicate my time to a project that requires as much work as this one. I'll be able to build a better enclosure for my PipBoy (cardboard just doesn't cut it, even if it's painted silver with spraypaint ;-) ), as well as finally let myself stuff it with all the electronics I've dreamt of putting in it. I know I'm already outside all the deadlines, but I know this project is worth it and I feel obliged to finish it, not only because it's fun and useful, but also because I promised all of you to do so by my participation. So during October, I'll be posting here in this competition blog, or elsewhere on the site if this blog gets closed.
Sorry, guys, no photos now, but I'll soon assemble a second system using a RPi and PiFaceCAD which will be able to take photos - using my system, of course =)

This past week finished up the Sci Fi Your Pi Design Challenge, and we saw a mass influx of final project updates being posted. Following the spirit of my Weekly Design Challenge Summary post, I wanted to make a post that would wrap up the Sci Fi Your Pi Design Challenge and showcase some of my favorite projects.


2015-08-31 20_55_39-Sci Fi Your Pi _ element14.jpg


To get started let’s take a quick look at each of the 25 Sci Fi Your Pi projects that captivated our imagination and inspired us over the last several months.


  • Project: Rover Pi Protector - Brenda Armour ( armour999 ) got off to a rough start with Rover Pi Protector after falling victim to a freak accident in her garage which resulted in some badly hurt arms. She did not let that stop her though, and in the end Rover Pi Protector was a success.
  • Project: Prince Dakkar's patent log taking chart compass - Neil Bizzell ( nbizzell ) cut it close with this project after life got in the way for most of the competition. However, Neil did manage to get a lot of work completed in the final week of the challenge, including getting the GPS portion of the project up and running, so that is a plus!
  • Project: Intelligent body Armor - Unfortunately this project by Joe Carender ( jlcarender ) was destined to become a non-starter, and we never saw any progress beyond two very short posts.
  • Project: Advanced Dog Trainer - This project by Vivien Chin ( vivienchin ), while great in concept, was a non-starter with not a single blog post being made.
  • Project: I Ching Hexagrams - Trevor Clarke’s ( taodude ) entry into the Sci Fi Your Pi challenge was a bit of a daunting one, but some great progress was made, and anyone who followed the updates definitely learned a thing or two. Unfortunately Trevor experienced lots of issues with the PiFaceCAD, which cost him lots of valuable time.
  • Project: Escape the Past - Eric Ellwanger ( frellwan ) set out to change the way the industrial world controls its equipment with this project. By the end, he had managed to successfully send and receive data from an old-school PLC onto his Raspberry Pi, making it a very practical and useful project.
  • Project: Training Sphere - This was another project that got off to a late start due to its builder being part of the Enchanted Objects Challenge. Ambrogio Galbusera ( amgalbu ) set out to create the ultimate training sphere that would hone Jedi skills.
  • Project: Cybernetic Computer Interface - Never before has a project with so few updates managed to crank the WOW-Factor to 11. This wearable cybernetic device by Sebastian Groza ( sebathorus ) is the stuff borgs dream about!
  • Project: Picorder - Michael Hahn ( saturnv ) took Star Trek prop replication to the next level, and built his interpretation of what a TriCorder would be if it were around today! This project is one of my favorites and I recommend everyone check it out!
  • Project: C3P1 - This project was yet another non-starter, with Augusto Lisbôa ( augusto.diniz.l ) only posting two short updates.
  • Project: Glove Computer & Control - Unfortunately this project never made it past the challenger announcement, and its builder deleted his user account.
  • Project: Empathy box - Despite being “really excited to begin this project,” Eric Lovejoy ( j0h ) never posted any project updates other than an intro where he talked about ordering some USB dongles and RGB LEDs for the project.
  • Project: Meditech - Enrico Miglino ( balearicdynamics ) knocked the ball out of the park with this medical-based project. With well over 20 updates, Meditech is one of the most thoroughly documented, well written, and complete projects of the entire challenge. I highly suggest everyone take a few hours and read through this entire project!
  • Project: Knight Rider - This project by Wilson Oberholzer ( scrpn17w ) only saw a few updates.
  • Project: PizzaPi - Margot Paez ( dmrobotix ) set out to change the way pizza is delivered, and the end result is nothing short of amazing. This was one of the more regularly updated projects, and Margot’s enthusiasm and dedication to the project really shone through.
  • Project: Visus Sancto - This project by Cecil Perks ( sirusmage ) was another non-starter.
  • Project: PipBoy Personal Helper - Arsenijs Picugins ( crimier ) got off to a good start with two very well written blog posts, but fell silent thereafter.
  • Project: PiBo - This project never made it past the first blog post by Rajesh C ( kcrajesh ).
  • Project: Real-Life Holographic Projector - This was one of the projects that I was really excited to see progress, but unfortunately Kenny Rasschaert ( kenny_r ) did not make it past the first few posts.
  • Project: Sci Fi Advanced Controls - Shrenik Shikhare  had a cool concept with this project, but unfortunately it never made it past the first post that showed off all of the hardware he received.
  • Project: Verbal & Physical Morality Monitor - This project by Harsahib Singh ( harsahib ) was a non-starter.
  • Project: VIRUS - Inderpreet Singh ( ipv1 ) suffered a delay during the build of this project, but despite the setback he still managed to work up a cool project. Bonus points from me for the use of 3D printing!
  • Project: QuadCOP - This was one of those projects that saw a ton of updates, and really delivered in each one! Props to Joey Thompson ( screamingtiger ) for writing a wonderful series of updates, and teaching me a few things along the way.
  • Project: PiDesk - Much like Meditech, PiCorder, and a few other projects, PiDesk really turned out way above what I expected. Frederick Vandenbosch ( fvan ) built something that I can only describe as beautiful, and PiDesk is in my top three favorite projects from this design challenge.
  • Project: RAED - Conceived by Jeremy Walker ( trenchleton ), this project got off to a good start, but quickly faltered when the update post stopped being posted.


With the projects covered I want to take a few moments to talk about my favorite top three projects. Please keep in mind that I am not a judge of this competition, and all of the projects that were submitted were amazing in their own right. These three projects are simply three of the ones that I enjoyed following the most. I truly enjoyed watching all of the projects progress, and I hope I see more from each contestant in the future.


Project: Meditech



If Enrico Miglino ( balearicdynamics ) set out to create one of the coolest projects that has ever been part of an Element14 Design Challenge, then he succeeded several times over. Enrico finished the core Meditech project weeks ago, and has continued posting updates ever since by building new accessories that augment the medical device. Great job Enrico!



Project: PiDesk



Frederick Vandenbosch ( fvan ) built one of the coolest desks I have ever seen, and as I said in my Weekly Design Challenge Summary, he inspired me to build my own desk over this winter season. From the use of NeoPixels to the design and integration of custom touch controls, this whole project was A+ quality!



Project: PiCorder


I would be lying if I said that I was not a bit of a closet Trekkie, and Michael Hahn’s ( saturnv ) PiCorder build really brought out the geek in me. I have loved every aspect of this build, and it has really taught me that I need to focus more on getting my projects finished rather than making them 100% aesthetically perfect. From the use of protoboard as the front and rear panels, to using component leads as point-to-point jumpers, this project is super cool.


Well, that is going to wrap up my coverage of the Sci Fi Your Pi Design Challenge until the judging process begins and winners are announced. I hope everyone enjoyed this challenge as much as I did, and I cannot wait to see how the Vertical Farming design challenge turns out.


Final writeup

Posted by armour999 Aug 28, 2015


Right to the wire to finish this project. This has been so much fun and a challenge. The software used included Java, C++, Python and Linux. I used several tools, including:

1. SD Formatter:

2. Win32diskimager:

3. FileZilla :

Access Point for extending WIFI range


One issue I had was the short WiFi range outside the house. I looked at several tutorials for building an access point, and the Arch WiPi image worked right out of the box. I could not make the uplink wireless as the signal strength was too weak, but the access point itself was stronger than our wireless in the house and I was impressed at the range it added to the WiFi.


Arch Linux Raspberry Pi WiFi AP Requirements for Raspberry Pi Model B Revision 2.0

  • Power adapter with at least 1500 mA (2 amp recommended)

  • Minimum 2GB SD Card Fat32 formatted

  • Wifi USB dongle

  • Network Cable with Internet access


    Image Installation and Access Point Setup

  • Download the Arch Linux Wireless Raspberry Pi image:
  • Extract it, e.g.: sudo tar zxf archwipi.img.tar.gz

    Optional – extend partition to use all of disk. You can use gparted. (I used Raspi-config)

  • Plug your internet cable and new Arch WiPi SD card into your Raspberry and power it on.

  • Everything is automated so after a minute or so scan for a new network SSID = archwipi

  • If you don’t see the archwipi SSID, then it means you need to manually install your Wi-Fi dongle drivers. I had no issue, and the SSID showed up on all the Raspberry Pi computers as a wireless network.

  • The wifi password is: 1010101010

  • If you need to login to the Pi the credentials are: root | archwipi

  • You can change the WiFi password to whatever you like by editing: /usr/lib/systemd/system/create_ap.service

  • Check CPU speed, temperature and more using ./

  • You can also view graphs of Pi stats by browsing to the Pi’s IP address on port 8080, e.g. http://<pi-ip>:8080/
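Changing the passphrase boils down to a single substitution in that unit file. A minimal Python sketch, assuming the default password 1010101010 appears verbatim in the unit (run as root, then `systemctl daemon-reload` and restart the service):

```python
# Sketch: swap the default AP passphrase inside create_ap.service.
# Assumes the stock password "1010101010" appears verbatim in the unit file.
from pathlib import Path

def change_ap_password(new_password,
                       unit="/usr/lib/systemd/system/create_ap.service",
                       old_password="1010101010"):
    path = Path(unit)
    text = path.read_text()
    if old_password not in text:
        raise ValueError(f"{old_password!r} not found in {unit}")
    # Rewrite the unit with the new passphrase in place of the old one.
    path.write_text(text.replace(old_password, new_password))
```

The same edit can of course be done by hand in nano; the script just makes it repeatable across re-flashed SD cards.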


Raspberry Pi Camera (Two ways)


I decided to use two ways to take pictures. One way was the Raspi-Pi cam, and the other was the Snap Camera with the PiFace Control and Display:


Raspi-Pi cam

Detailed documentation can be found at:


Once the software is installed, note that the raspistill command will no longer work for testing the camera, because the web interface keeps the camera in use. The basic installation is:

  • Install Raspbian on your RPi
  • Attach camera to RPi and enable camera support with sudo  raspi-config
  • Clone the code from GitHub and run the installer with the following commands:


          git clone


          cd RPi_Cam_Web_Interface


          chmod u+x


          ./ install


After the install you need to reboot the Raspberry Pi. Once that is completed you can view the GUI in a browser; I found either Chrome or Firefox worked well. This GUI has many features, including Timelapse and Motion Detection. The files are stored in a specific location, so I found a way to use Dropbox to store the pictures in real time on my laptop and Blackberry. I did investigate a couple of cloud apps, but this seemed to be the easiest.


Adding Dropbox to the Raspberry Pi

You need to set up a DropBox account and then set up an app to link to your Raspberry Pi. You can set up your app at:


I played with some choices but found the File Type version seemed to work well. As you can see it supplies an App Key and App Secret. You will be using this to link to your Pi.

Now we want to install Dropbox for the Raspberry Pi:


git clone


Once downloaded you can make the script executable by using the following command:


chmod +x




The first time you run the script you will be asked to enter the App Key and App Secret.



Screenshot (24).png



HINT: Copy the keys into a text editor first rather than pasting straight from Dropbox into PuTTY; otherwise it does not play nice and you may get errors. I used Word. Once your keys are accepted, it will ask you to open a URL to confirm the connection. Assuming you are using PuTTY, copy the contents to your clipboard, paste them into a text editor, and then copy the URL into a browser. You may receive a message from Dropbox that the connection is successful, but unless you perform the last step in PuTTY the token may still fail. Some OAuth tokens are corrupt, so you may have to try a couple of times.


RPi Cam Web Interface stores media files in /var/www/media, so I wanted a script to push the .jpg files to Dropbox and see the media on my Blackberry and laptop in real time. I tried a couple of test .jpg files and it worked like a charm.


I used this script to start the downloader:


pi@raspberrypi ~/Dropbox-Uploader $ ./ upload /var/www/media/ {*.jpg*} /Apps/PiRover


This was tricky. Most documentation did not include a target folder for the upload, and so it failed. I took several scripts and reduced the code to one line, adding the Dropbox target. The command tells the Raspberry Pi to upload all files ending in .jpg in /var/www/media (the location where RPi_Cam_Web_Interface stores the images) to my Dropbox app called PiRover.
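The one-liner above can also be expressed as a small Python wrapper that invokes the uploader once per file, which sidesteps the shell-glob quoting issues I ran into. This is a sketch: the dropbox_uploader.sh filename and its clone path are assumptions about where the Dropbox-Uploader script lives, not something from my setup.

```python
# Sketch: upload every .jpg from the camera's media directory with the
# Dropbox-Uploader shell script, one file at a time.
# The uploader path and script name are assumptions.
import glob
import subprocess

def upload_jpgs(media_dir="/var/www/media", target="/Apps/PiRover",
                uploader="/home/pi/Dropbox-Uploader/dropbox_uploader.sh"):
    uploaded = []
    for path in sorted(glob.glob(f"{media_dir}/*.jpg")):
        # One call per file avoids surprises with shell glob expansion.
        subprocess.run([uploader, "upload", path, target], check=True)
        uploaded.append(path)
    return uploaded
```

Calling `upload_jpgs()` from cron gives the same behavior as the one-line shell command, with a clear error (via check=True) if any single upload fails.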


I set up a full Dropbox app for final testing and called it PiRover. When I ran the script, the images stored in /var/www/media uploaded to Dropbox at a fairly good speed and are now accessible on my Blackberry and laptop within minutes.


A cron job is added to run the script every minute and I’m done! I will add a cleanup cron job so the SD card does not fill up too fast. I’ll have some videos posted soon. Please do not rain.
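For the cleanup job, the simplest approach is to delete images older than a cutoff. A sketch of that idea (the one-day cutoff is my choice, not from the post), which a nightly cron entry could run:

```python
# Sketch: delete .jpg files older than max_age_days from the media directory,
# so the SD card does not fill up. The cutoff and path are assumptions.
import glob
import os
import time

def cleanup_old_jpgs(media_dir="/var/www/media", max_age_days=1):
    cutoff = time.time() - max_age_days * 86400
    removed = []
    for path in glob.glob(f"{media_dir}/*.jpg"):
        # Compare each file's modification time against the cutoff.
        if os.path.getmtime(path) < cutoff:
            os.remove(path)
            removed.append(path)
    return removed
```

Since the uploader already mirrors everything to Dropbox within a minute, deleting day-old local copies loses nothing.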






Snap Camera

A complete guide can be found here:


Basic install instructions are:

Install snap camera with the command
sudo apt-get install python3-snap-camera


Start SnapCamera by running












GPS Real Time Tracking


Most of my effort went into programming the Raspberry Pi and Microstack GPS to provide real-time tracking. The Microstack GPS unit was installed as per the instructions:


sudo apt-get update

sudo apt-get install

sudo apt-get install gpsd gpsd-clients python-gps


Disable the serial port from raspi-config:


● From the menu choose Advanced Options.


● Then choose serial.


● When asked “would you like a login shell to be accessible over serial?” choose “No”.


● A message saying “serial is now disabled” will appear.


● Exit raspi-config and reboot the Raspberry Pi.


Then:  sudo dpkg-reconfigure gpsd


● Choose “Yes” when asked if you want to start gpsd automatically.


● Choose “Yes” when asked “should gpsd handle attached USB GPS receivers automatically”.


● When asked which “Device the GPS receiver is attached to”, enter /dev/ttyAMA0.


● Accept the defaults for other options.




Now test the GPS with: cgps -s





Creating a Real-Time GPS Tracker


  • Sharing using a Google Earth KMZ file
  • Live data provided by Microstack GPS
  • Connected to the Raspberry Pi via /dev/ttyAMA0
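The live data from /dev/ttyAMA0 ultimately arrives as NMEA sentences, and a KMZ is just a zipped KML file. A sketch of the two pieces the bullets describe, converting a $GPGGA fix into decimal degrees and wrapping it in a minimal Google Earth placemark (the sentence in the test is a textbook NMEA example, not a reading from my unit):

```python
# Sketch: parse a $GPGGA NMEA sentence into decimal-degree lat/lon, then emit
# a minimal KML placemark for Google Earth. Zip the KML to get a KMZ.

def parse_gpgga(sentence):
    f = sentence.split(",")
    if not f[0].endswith("GGA"):
        raise ValueError("not a GGA sentence")
    # Latitude is ddmm.mmmm, longitude is dddmm.mmmm.
    lat = float(f[2][:2]) + float(f[2][2:]) / 60.0
    if f[3] == "S":
        lat = -lat
    lon = float(f[4][:3]) + float(f[4][3:]) / 60.0
    if f[5] == "W":
        lon = -lon
    return lat, lon

def to_kml(lat, lon, name="PiRover"):
    # KML coordinates are written in lon,lat order.
    return (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<kml xmlns="http://www.opengis.net/kml/2.2"><Placemark>'
        f"<name>{name}</name><Point><coordinates>{lon:.6f},{lat:.6f}</coordinates>"
        "</Point></Placemark></kml>"
    )
```

Feeding the latest fix through these two functions and re-zipping the result gives Google Earth a file it can refresh for live tracking.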



Robot Chassis (My first Robot)


Well, I went with the Half-Pint Runt Rover, which came with no instructions. I'm not sure if it's because I'm left-handed or Canadian, but I found this a challenge to assemble. I did find a 3D diagram and a short video that helped me put the kit together.




Attaching the motors to the Gertbot

A comprehensive datasheet can be found here:



The schematic was helpful in connecting the motor coils and power supply to the Gertbot.