
Sci Fi Your Pi


So the parts list for the kit online included the base board for the Microstack GPS module and accelerometer. However, the kit that arrived at our doors does not include the base board, and it seems unlikely to arrive anytime soon. My GPS module arrived today and I really wanted to get it working, so I had to get creative.

I took a good look at the user guide for the Xtrinsic Sense Board (and the board itself) and noticed that there are some pins sticking out on the top and bottom sides of the board. The three pins on the top are for serial sending and receiving of data, and the three pins on the bottom side bring in ground and power. The power/ground pins are actually for connecting to the Freescale FRDM-KL25Z board.


I also looked at Microstack's documentation for the GPS module's pins and found that all I needed were connections for serial communication and power. There are a couple of other pins on the GPS module that can connect to GPIO pins but they are not really necessary for functionality.

I was a little nervous about the possibility of destroying one of the new RPi's I got in the kit, so I pulled out a spare 2011/12 RPi Model B and tested it on there. Well, you can see from the picture on the left that power is definitely flowing through the wires and into the GPS module, and the LED on the board means it is communicating with the satellites. Super exciting.

It was time to test whether it was communicating through the pins on the Xtrinsic board. I followed the directions in the Microstack documentation and installed the relevant software. After rebooting I ran their test code and, lo and behold, coordinates!
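For those curious what that test output contains: the module emits standard NMEA 0183 sentences over the serial line. As a rough illustration (my own sketch, not Microstack's code; the device path in the comment is an assumption), the coordinates can be pulled out of a $GPGGA sentence like this:

```python
def parse_gpgga(sentence):
    """Extract (latitude, longitude) in decimal degrees from a $GPGGA sentence."""
    fields = sentence.split(",")
    if not fields[0].endswith("GGA") or not fields[2]:
        return None  # no fix yet
    # Latitude is ddmm.mmmm, longitude is dddmm.mmmm
    lat = float(fields[2][:2]) + float(fields[2][2:]) / 60.0
    if fields[3] == "S":
        lat = -lat
    lon = float(fields[4][:3]) + float(fields[4][3:]) / 60.0
    if fields[5] == "W":
        lon = -lon
    return lat, lon

# Reading live data would look something like this (device path is an
# assumption; the Microstack GPS appears on the Pi's UART):
#   import serial
#   port = serial.Serial("/dev/ttyAMA0", 9600, timeout=1)
#   line = port.readline().decode("ascii", errors="ignore")

print(parse_gpgga("$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"))
```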


I also tested the sensors on the Xtrinsic board and they are still working correctly. I'll be running this off the B+ model in my smart pizza box. I'm going to have to make a base board of my own at some point because the breadboards are very clunky and will take up too much room in the pizza box.

Looks like I'm one step closer to getting this crazy box working!

Previous posts for this project:



Project Update


As you may start noticing, I'm trying a new blogging format for this challenge. The idea is to produce content that can stand on its own, even though created as part of this challenge, but tied together and referenced using project updates such as this one. That way, people interested in the project have access to all the relevant content and others only interested in a specific guide/review are not bothered by content not relevant to them.


I'm thinking there will be three types of posts:


  • guides, in which a certain topic is covered and explained in a generic way
  • reviews, in which thoughts and feedback are shared about a certain product used in this challenge
  • project updates, in which the project specific progress is explained and/or demonstrated


Let me know what you think of this approach!


Back on topic ...


I've managed to spend some time on my project this weekend, and have set up the Raspberry Pi 2 with wifi and camera for use as a desktop computer.

There is one thing I'm curious about though. The Pi 2 contained in the kit was a NOOBS Edition, meaning it came with a preloaded SD card. Having used NOOBS before, I was expecting to see an OS selection menu upon first boot, but that wasn't the case. Instead, it seems Raspbian was pre-installed. Was this the same for you as well?


The guide on setting up the Raspberry Pi 2, configuring wifi, updating the software and installing the camera can be found here: Sci Fi Your Pi: PiDesk - Guide: Setting Up the Raspberry Pi 2


I'm off to experiment with the Touch Board, which I plan to use for capacitive touch input on the Raspberry Pi.


Until next update!




The Raspberry Pi 2 Model B is the latest member of the Raspberry Pi family. Even though it has the same form factor as the Raspberry Pi B+, the Raspberry Pi 2 has a quad core ARMv7 processor running at 900MHz. The RAM has also received an upgrade, increased from 512MB to 1GB. Besides these changes, the Raspberry Pi 2 remains similar to the previous B+ model, with:

  • 40-pin GPIO header
  • 4 USB ports
  • 10/100 Ethernet port
  • HDMI port
  • microUSB input for power
  • CSI port for camera board
  • DSI port for display/touch screen
  • combined 4-pole stereo and composite video output




I've made a video of the entire configuration process. If you prefer, you can follow the written instructions instead, starting below the video.





The Raspberry Pi 2 Model B - NOOBS Edition from element14 comes with the following:

  • Raspberry Pi 2 Model B
  • preloaded 8GB microSD card with SD adapter
  • Raspberry Pi quick start guide (in 16 languages!)


The included microSD card is a 8GB SanDisk MicroSDHC Ultra Class 10.


First Boot


After having connected the necessary peripherals such as keyboard, mouse and monitor and inserted the preloaded microSD card in the designated slot, the Pi is powered on using a 5V/2A microUSB power supply. No distribution installation selection menu is offered on first boot; instead, it appears that Raspbian is already installed. This is different from what I have experienced with a Model B+ NOOBS Edition. During the boot process, four raspberries are displayed in the top left corner, indicating four CPU cores have been detected. Once the boot process completes, the Raspberry Pi configuration tool is automatically launched. The configuration tool can be used to configure basic things such as the hostname or whether to boot to the console or directly to the desktop, but also more advanced things such as I2C, SPI, camera support, etc.





Instructions on how to configure wifi from the graphical environment are provided with the element14 WiPi dongle, but work just as well for any other wifi dongle. First, connect the wifi dongle while the Pi is powered off, then power on the Pi. Using the menu in the top left corner, go to "Preferences" > "WiFi Configuration". Press the Scan button at the bottom of the wpa_gui application and double-click the SSID of the wireless network you would like to connect to.



Enter the password of the selected wireless network and click "Add". If all went well, the Pi should be connected to the wireless network and an IP address should be displayed.
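If you prefer to configure the network without the GUI, the same result can be achieved by adding a network block to wpa_supplicant.conf (Raspbian's default path; the SSID and password below are placeholders):

```
# /etc/wpa_supplicant/wpa_supplicant.conf
network={
    ssid="YourNetworkName"
    psk="YourPassword"
}
```

After saving the file, reboot (or restart the networking service) and the Pi should join the network automatically.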






With network connectivity taken care of, the next step was to update the software. In a terminal, use the following commands:

  • sudo apt-get update
  • sudo apt-get upgrade


The process could take more time depending on how old the preloaded version is; in my case, it took about 15 minutes in total to upgrade the software. Once the software is upgraded, a reboot is necessary to apply the changes:

  • sudo reboot




Connecting the camera module to the Raspberry Pi 2 is identical to previous models. Ensure the camera's flex cable is inserted in the CSI port near the Ethernet port, with the exposed contacts facing away from the Ethernet port. Once connected, the Pi can be booted. Before the camera can be used, support for it needs to be enabled. This can be done by running the following command in a terminal:

  • sudo raspi-config


This launches the Raspberry Pi configuration tool, which was also presented during the first boot. Menu option 5 allows you to enable camera support, and requires a reboot to apply.




After having enabled support and rebooted the Pi, the camera can be used. In a terminal, use the following command to take a picture:

  • sudo raspistill -o test.jpg


This will create a file called "test.jpg" after a five-second preview on screen. If the image's orientation is not correct, it can be changed by adding the horizontal and/or vertical flip options to the command:

  • sudo raspistill -o test.jpg --hflip --vflip





The Pi is now set up and up-to-date, have fun!

Hello everyone! Well, it's been a busy couple of days for me. I've been struggling with the Xtrinsic MEMS sensor board because I cannot seem to get it to work with the Raspberry Pi 2. I believe the problem boils down to a sensor driver that is not compatible with the Broadcom chip: the Raspberry Pi 2 has the Broadcom BCM2836, whereas all other models run the BCM2835. Has anyone else had this trouble? It works perfectly on the B+, which uses the BCM2835. I even tried the image that is on the website for the board, but it is totally foreign to the parts on the Raspberry Pi 2 and does not boot (it does boot on my old Raspberry Pi Model B, circa 2011). I guess it is possible to write a new driver (I've never done that before, so it sounds like a lot of work)...


Well, all that trouble has really slowed down progress for me as I was planning to run the sensor board on the Raspberry Pi 2 along with the web server. I'll have to run the sensor board off the B+ for now.


Aside from troubleshooting I did some planning in my lab notebook to figure out how I want to organize the system between the two Pis. I think I have a good plan of attack and I'll write more about that later on in the week (perhaps over the weekend as my schedule is piling up with work!).


I heard from the people in charge recently and they said they were shipping the GPS module to me from the UK warehouse. Apparently the US warehouse was low! I guess it's in high demand. Thanks team element14 for making all of this possible!!

Well, I received my kit on Friday and was very excited to begin. The competitors have impressive projects and very creative ideas. When I applied for this design contest it was a challenge to review all the components in the kit and come up with an idea. So I have been building projects with the Raspberry Pi, sensors and the camera. I also built a project with the EnOcean Pi and wireless sensors. It just came to me that there were enough components to build the Pi Rover Defender. I plan to add a chassis and a proximity sensor to the project. The general consensus of my friend's children is it must have Spider Legs.






My strengths in this contest are programming skills and troubleshooting software errors. I am proficient in Visual Basic, Java, Python, Linux and C. Should I give away my Achilles' heel to the competitors? Okay, yes! This will be my first time developing robot movement. I always wanted to do this and will be successful. Oh, and I have to learn to solder.




I have decided to start with what I am familiar with and build from there. My first step is to set up the navigation system. My second step is to build a stop motion camera that will take pictures or videos when movement is detected. I will blog in more detail on these first two steps. Each additional step will be to add more working components and additional functionality. The challenge is to build a nice electronic sandwich to log the data from all these components and to present the data in a meaningful way.




Navigation System




The kit included the Microstack GPS and XTRINSIC-Sense Board.






XTRINSIC – Sense Board




An inertial navigation system uses a computer (i.e. the Raspberry Pi) and motion sensors (accelerometer and magnetometer) to calculate position, orientation and velocity without the need for GPS or other external devices. I have already tested the XTRINSIC Sense Board and made this short video. I logged the output using PuTTY, but as you can see there was some drift in calculating the next latitude and longitude. In other words, if the projection is off when the first error is made, then the errors accumulate. I walked a straight line, not the diagonal line shown in the video. BUT the distance and time were spot on. I will share my algorithms in future blogs and the detailed write-up.

Perhaps the most exciting moment was calculating distance without GPS. I measured my stride and was able to detect that a step had occurred when the acceleration curve crossed below a dynamic threshold; taking the average of 50 samples of data from the accelerometer Z axis provided that threshold. Lots of trigonometry involved, and I promise to go into more detail. To project the next latitude and longitude I needed the starting position, time, distance and bearing. The bearing came from the magnetometer. I will blog more about the calculation of yaw, pitch and roll from these sensors. The video was from a previous idea but serves as my idea for navigation without GPS.
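The projection step described above — computing the next latitude/longitude from a start position, a travelled distance and a compass bearing — is standard spherical trigonometry. A minimal sketch of that calculation (my own illustration of the technique, not the author's exact algorithm):

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius

def project_position(lat_deg, lon_deg, bearing_deg, distance_m):
    """Project a new (lat, lon) from a start point, a compass bearing
    (e.g. from the magnetometer) and a travelled distance (e.g. stride
    length times the number of detected steps)."""
    lat1 = math.radians(lat_deg)
    lon1 = math.radians(lon_deg)
    theta = math.radians(bearing_deg)
    delta = distance_m / EARTH_RADIUS_M  # angular distance

    lat2 = math.asin(math.sin(lat1) * math.cos(delta) +
                     math.cos(lat1) * math.sin(delta) * math.cos(theta))
    lon2 = lon1 + math.atan2(math.sin(theta) * math.sin(delta) * math.cos(lat1),
                             math.cos(delta) - math.sin(lat1) * math.sin(lat2))
    return math.degrees(lat2), math.degrees(lon2)

# Walking ~111.2 km due north from the equator covers one degree of latitude:
print(project_position(0.0, 0.0, 0.0, 111195.0))
```

Any small error in the bearing or stride estimate feeds into the next projection, which is exactly the accumulating drift described above.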





The Microstack GPS




I will have this up and running for my next blog. Of course, GPS is the standard for tracking. If you use GPS and walk inside a building, the device will use dead reckoning to project the next position. This is what I will be testing first. The inertial navigation system serves as a backup when GPS cannot obtain a position.




Raspberry Pi Camera




I will probably go with the Raspberry Pi Cam for stop motion camera. I can't wait to test it on the new Raspberry Pi 2. I will provide more tests in future blogs.


I Ching Code Port

Posted by taodude Apr 28, 2015

I have not been idle.  Most of my work over the last few days has been planning and sorting the data models for the hexagrams.  However, I have been experimenting with bits of Python to produce the values that define the quantum states of each Hexagram line, and I have also been thinking about how best to display the results on the PiFace CAD display and make good use of the navigation toggle and pushbuttons.  I might even use the IR sensor to read inputs from a remote control keypad.
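The classic three-coin method gives each hexagram line one of four values — 6 (old yin), 7 (young yang), 8 (young yin) or 9 (old yang). A sketch of how those per-line values might be produced in Python (my own guess at the approach, not taodude's actual code):

```python
import random

def cast_line(rng=random):
    """Cast one line with three coins: heads counts 3, tails counts 2.
    The sum is 6, 7, 8 or 9."""
    return sum(rng.choice((2, 3)) for _ in range(3))

def cast_hexagram(rng=random):
    """Cast six lines, bottom line first."""
    return [cast_line(rng) for _ in range(6)]

lines = cast_hexagram()
print(lines)  # e.g. [7, 8, 9, 6, 7, 8]
# Lines valued 6 or 9 are the changing lines that produce the 'New' hexagram:
changing = [i for i, v in enumerate(lines) if v in (6, 9)]
```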


I had intended to port the code from my Psion Organiser II LZ64 as the Hexagram generation engine, but its screen, at 20x4 characters, is only a bit larger than the PiFaceCAD 2's 16x2 characters, and so is a nightmare to transcribe from.  Also, I lost the serial port connector ages ago, so I can't export the OPL source code files, and the 128K DataPaks have a proprietary interface.  To add to the mix, I am waiting for a couple of 32GB microSD cards so that I can try out a net installer without messing up the Raspbian NOOBS SD card that came with my RPi 2.


Here are some pix to show how the program output first looked when I wrote it in the late 1980s:

First, a picture of the LZ64 for those who were in nappies when it came out.


Next the power-on screen (note the vintage dot matrix LCD display).


The I Ching program is written in OPL, an interpreted language like Python, and is invoked from the menu screen.


Here is a sample hexagram output that I ran to show it still works:


The I Ching was conceived in ancient China as an Oracle, providing guidance in times of uncertainty.  Augury was widespread in the ancient world and the interpretation below should be taken in that context.  The Illuminatus books are a conspiracy theory comedy, written around the time the world was on the brink of Armageddon.  Much as I would like to, I am not going to weigh down this blog with loads of Chinese philosophy, but a few snippets are unavoidable in order to understand the project.  Expect an eclectic mix of ancient and modern throughout this project.


The hexagram is split into Lower and Upper Trigrams to fit on the screen, and both the 'Old' and 'New' Hexagrams are displayed.  The cast or 'Old' hexagram has a Lower Trigram Tui, 'The Lake', which represents the Joyous, and an Upper Trigram Chen, 'Thunder', which represents the Arousing.  Together, they form Hexagram 54 - Kuei Mei, 'The Marrying Maiden'.  Chen also represents the Eldest Son and Tui the Youngest Daughter.  The hexagram cautions against supplanting the established order.

Changing lines in the second and fourth place advise now is not the time to act and counsel patience and holding true to the original idea you had.

The 'New' Hexagram is Hexagram 24.  The rock band Pink Floyd wrote a song about this Hexagram and called it Chapter 24.  The lyrics say: Change, Return, Success.  Going and coming without error.  Action brings good fortune.  So it looks to be worthwhile waiting a bit rather than pushing ahead too quickly now.


That is my excuse and I am sticking to it.  This week I am busy, but next week I am at home on leave and so will make more progress.

By the end of this project I want to be able to start a call on my laptop or my smartphone and start a conversation with whoever is standing next to R2D2, who will holographically project my likeness. That R2D2 model has yet to be built and the projector hasn't arrived yet, so I spent my first week trying to get video conferencing working on the Raspberry Pi 2. You know, baby steps.



When I say video conferencing you might immediately think of Skype, which is probably the most widely used video conferencing software. While Skype does have a Linux client, they don't distribute one that runs on ARM processors. On the Skype support forums someone mentions that the new web client does work on the Raspberry Pi. Unfortunately, at the time of this writing, the Skype web client is still in closed beta and I don't have an invitation. Whatever, I'm not particularly fond of Skype anyway.


There are a couple of open source alternatives and the ones I like best are Ekiga (formerly known as GnomeMeeting) and Linphone (part of the now-discontinued Linspire distribution). Both are SIP clients and offer free SIP accounts. Linphone wins major bonus points for having apps for both iOS and Android and for having python wrappers for the Raspberry Pi. These wrappers are featured on their home page, which links to a wiki page with documentation and an example script that is pretty close to what I want to achieve. Fantastic!



The kit we received for this challenge includes the Raspberry Pi camera board. The documentation mentions methods for snapping stills and recording video with this camera. What I need for this project, however, is something that I can use as a webcam. I spent a lot of time fiddling with something called v4l2loopback. The plan was to capture stills or a video stream with the Python scripts, send those to the v4l2loopback device using gstreamer, and then access that video device with the conferencing software. It was much later that I found out that all this wasn't necessary. There already is a V4L2 kernel module for the Raspberry Pi camera, called bcm2835-v4l2. Linphone's Raspberry Pi wiki page I mentioned earlier also mentions this kernel module. I wish I had found it earlier... Oh well, lesson learned.


Loading the module is really easy:

sudo modprobe bcm2835-v4l2

echo "bcm2835-v4l2" | sudo tee -a /etc/modules


If you see "ERROR: could not insert 'bcm2835_v4l2': Operation not permitted" when trying to load the module, make sure the ribbon cable is connected correctly with the blue end facing the Ethernet port.


I propped up the camera by stuffing the ribbon cable in a bank card holder


The kit also includes the Cirrus Logic Audio Card, but mine hasn't arrived yet. I was hoping to make do with a cheap USB sound adapter I already had lying around. Linux recognizes it as the Tenx Technology, Inc. TP6911 sound card. Even after configuring modprobe to load the snd_usb_audio driver for it, it doesn't appear to be fully supported on the Raspberry Pi and I can't get the audio input to work. If anyone knows a way of making both the input and output of this thing work on the Pi, please let me know in the comments.


The other two USB devices you see in the picture above are the Wi-Pi dongle that came in the kit and a wireless receiver for keyboard and mouse.



Closeup of the cheap Tenx Technology, Inc. TP6911 USB audio adapter



I installed iceweasel (~= Firefox, it's a long story) because the default Epiphany browser on Raspbian doesn't support WebRTC, and visited vLine to do a quick video conferencing test without involving any SIP software like Linphone. vLine generated a room and when I visited the link with my laptop, it worked!


The world's loneliest conference call.


This screenshot was taken on the Pi so the actual Pi Camera footage is in the lower right. As you can see in the top right corner, this is very demanding on the CPU. Hopefully this won't become a problem later. I'm not particularly worried about that because I have a more urgent problem right now: I can't get the camera to work anymore.


Shortly after taking the above screenshot I wiped my SD card to start from scratch, just to make sure I had all the necessary steps documented. I reinstalled Raspbian, enabled the camera in raspi-config and loaded the bcm2835-v4l2 module without errors, but the camera just won't work. I spent some time panicking and troubleshooting, trying to figure out what essential step I forgot. Then I got this clue: the raspistill and raspivid tools used to show me a full-screen camera preview. Now they show me this error:

mmal: Received unexpected camera control callback event, 0x4f525245


Judging by this thread on the Raspberry Pi forums, I'm dealing with a hardware failure. Maybe stuffing that ribbon cable in that bank card holder wasn't such a bright idea after all.


To be continued...


Visus Sancto Day One

Posted by sirusmage Apr 27, 2015

Well I got my kit today. Now is when the work starts. Wish me luck.



This weekend I dug into my parts supply and found several sensors which I expect to incorporate in the project. They should work out well, for they are simple in design and I have quite a few to choose from. I have some other sensors coming in the mail shortly that I intend to use as well, and that may prove very interesting in the end.


I've included pictures of the sensors. I also assembled a Boarduino module that may prove useful going forward. Some rough sketches of the Tricorder design to follow later in the week. Trying the Image Gallery.



{gallery} SciFi Pi Gallery


Sensors 1


Sensors 2


Sensors 3


Sensors 4


Boarduino Assembly




LED Test


Halfway done


Power up


Test reset button

One more announcement - I have successfully booted my Raspberry Pi 2 and checked its performance. What can I say?


  1. It can easily perform as a desktop PC for developing this project, for example. Still nothing like 50 browser tabs without lag, but it's pretty much awesome - certainly better than the RPi 1 =) I feel like I could make the entire project using it.
  2. With the supplied MicroSD card, you don't get raspi-config fully working (no proper memory split dialog or camera option) until you do apt-get update && upgrade. But then it's much faster than it used to be, so that little detail doesn't matter =)
  3. Minecraft is already on the card =) As well as Wolfram and some other things. See...

I'm building something that'd be a convenient & fast personal helper, not something that'd lag all the time. Do I need Wolfram? I doubt it for now. Minecraft? Nah, I feel more like adding something more useful first.

IMHO the default environment is pretty bloated, and I think you should spend time installing things, not deleting them. Not only does that speed up the debugging for me, it also helps me be sure that I can customize the hell out of it.

As you can understand, clean install it is... How does one cleanly install Raspbian?

There's a net installer. Basically, you write it to the card, boot from it and watch it install the base system automagically. After that, you're left with a system that takes hardly any space on the SD card (I might want to check how much it took the last time I used it), is fast, non-bloated and shows a nice text console that lets you install anything you want.

So - you want a clean system? Just use this tool. One thing - I haven't yet tested it with Raspberry Pi 2. Will do soon and make a quick tutorial.


Another thing I'll do is upgrade from 'stable' to 'testing'. I do that every time, and it helps a lot. See, it basically installs newer versions of packages that are less tested but more advanced, and that's often a killer feature of this release.

Pros? You get newer versions of packages. That means bugfixes, features for you to use and often speed improvements. For example, two years ago upgrading to 'testing' solved USB connectivity problems I was having a lot - it seems the patches were still on their way to 'stable'.

Cons? People hardly ever do this, it seems, so there might be less support, but then - it was never an issue for me. Also, you should always back up your card before you do 'apt-get dist-upgrade' or 'upgrade'. If power gets switched off due to e.g. a faulty cable, the system can get broken easily.

For example, 'chromium' version I have with 'stable' now doesn't even support HTML5. No YouTube for me, huh? At least 'epiphany' lets me listen to YT music videos while I do something... Pretty neat, I have to say! (it crashes when I switch from one video to another though)


Update it is, right? Well, let's see! Now, I'll show you how to update from 'stable' to 'testing'.


  1. Back up your SD card (I didn't LOL) - you'd better do it with dd from another Linux PC (link); for Windows there's Win32DiskImager.
  2. Boot your Pi from it. Launch the terminal and type:

sudo nano /etc/apt/sources.list

        There's a single uncommented line - our only repository. That simple. Right at the end of repository URL, there's 'wheezy'. Replace that with 'jessie' and close the file saving changes.

  3. Then, type

sudo apt-get update

        There'll be a bunch of lines that mention 'jessie' now, some of them start with 'Get' and indicate that your package index has been updated and system is ready to receive new packages.

  4. Then, it's the actual upgrade. I think you'd better close all open applications except Terminal, just in case. They're ALL going to get updated =)

sudo apt-get dist-upgrade

        It calculates the upgrade for a while, then... 698 packages to upgrade, 369 new to install. The most surprising part is "After this operation, 14.0 MB of additional disk space will be used." Come on, how come it's that small? For fun, I'll save the df output to check it later:

df > df_before.txt

Okay, let's press "yes"... And set a stopwatch! For me, that's a simple "; date" added to the 'dist-upgrade' command - as soon as it finishes, it'll print the time to terminal which I can later check.


Seems that it has to get a lot of packages. LEDs on Ethernet blinking like crazy =)

After it has downloaded all of them, it will begin unpacking and installing. It'll ask questions. Answers are:

  • You can disable SSH root login if you care that much about security
  • You definitely should allow it to restart services while upgrading
  • Replace all the config files by 'package maintainer version' unless it was you who modified them. One thing I don't know about yet is that dphys-swapfile configuration file... Okay, you can replace it too =)


...time passed from when I started the installation. I managed to get some sleep =) When I woke up, install was stuck at "Should I rewrite that conffile, master?" dialog. Oh, so my time measurements most certainly will not mean anything =D


diff df_before.txt df_after.txt

< /dev/root  6520152 2886768  3279132  47% /
> /dev/root  6520152 4668636  1497264  76% /


See, that's not exactly 14.0MB. And it's after I deleted Chromium and Epiphany. Shame on you, apt-get, you filthy liar.


What will fundamentally change?

  • You'll get systemd. From my experience, it's actually awesome. I use it a lot and it hardly ever lets me down, autorun scripts have never been this easy to write, and boot times have decreased thanks to parallel starting of processes (now you can finally put your 4 cores to work while booting!)
  • As for now, you'll get some bugs connected to how systemd works with old init scripts, if I understood the situation correctly. I managed to fix them, but it felt more like magic and I don't know exactly what I did, aside from disabling some init scripts I felt were not needed.
  • You'll get newer versions of packages of course - that means improvements and bugfixes. For example, my Microsoft keyboard now works properly with the RPi (before that I had some weird key mappings, such as @ replaced by " and vice-versa.)
  • You'll get broken Epiphany and Chromium, at least at this moment =D I just switched to Iceweasel after some hours of reading gdb backtraces, and removed both offenders from the system. Sorry, folks, I didn't bother reporting the bug in Epiphany. My bad, I am really busy now and this reporting stuff is quite new for me =(
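The autorun convenience mentioned above is real: under systemd, a service that used to need a full init script can be a unit file of a few lines. A minimal sketch (the unit name and paths are made up for illustration):

```
# /etc/systemd/system/wcs.service
[Unit]
Description=Wearable Control System
After=network.target

[Service]
ExecStart=/usr/bin/python /home/pi/wcs/main.py
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Enable it with 'sudo systemctl enable wcs.service' and it starts on every boot.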

About Chromium - its version in the Raspbian repos is frozen at 22.something, while recent versions are 44 and so on. My opinion? Forget about Chromium. Here's a link that has some explanation.

Personally, I found Iceweasel quite good and already managed to code some things for my project using it to google things, while watching YouTube videos =) BTW, HTML5 is there and properly working, yaaay!


So - the system works for me and is clearly newer than the 'stable' release. All the features I wanted are there and working. But - you should use this method at your own risk.


Shucks. I forgot about my project description again. Sorry =( Will do that ASAP. I will need to learn SketchUp before that, though - to give you all an idea of how it will all look!

But I did integrate the PiFace Control and Display screen into my Wearable Control System. The corresponding WCS repository branch is here; it's quite a dirty job for now (heck, I basically threw out a lot of helper functions I made earlier for future expansion) and only shows that it's working, but I'll certainly integrate it properly =) The keys on the PiFaceCAD are not integrated yet either. Just a quick hardware test. Here, watch this video!

It's the essence of my WCS framework. You can see keypad-controlled prototype menu system with submenus, all callback-driven. See that "Third function selected" in the end? It's a function I activated by using this menu, and it could be anything you'd want it to be, for example, SMS sending function =) There's a lot to be done to make it work, such as enabling it to work with some form of RPC (you don't want to 'import' every program that uses screen and keypad in the control module executable).
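A callback-driven menu of the kind shown in the video can be sketched in a few lines (the names here are my own, not the WCS repository's):

```python
class Menu:
    """Minimal callback-driven menu: items are (label, action) pairs,
    where action is either a callable or a nested Menu (a submenu)."""
    def __init__(self, items):
        self.items = items
        self.index = 0

    def move(self, step):
        """Toggle left/right on the keypad; returns the label to draw
        on the 16x2 LCD."""
        self.index = (self.index + step) % len(self.items)
        return self.items[self.index][0]

    def select(self):
        """Pushbutton press: descend into a submenu or fire the callback."""
        label, action = self.items[self.index]
        return action if isinstance(action, Menu) else action()

menu = Menu([("First function", lambda: "First function selected"),
             ("Third function", lambda: "Third function selected")])
menu.move(1)
print(menu.select())  # prints: Third function selected
```

The point of the callback design is that the "Third function" could be anything — an SMS sender, for instance — without the menu code knowing about it.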


That's all to it. I'll keep you updated. This post was written from my Raspberry Pi 2 =) ...Okay, not quite. JavaScript performance still sucks, even with 4 cores. Also, it keeps throwing those undervoltage notifications and I feel like things will not work properly until I replace the power supply. So I used an i5-based Windows laptop just to finish publishing the article - I do want to sleep now, and waiting for letters to actually appear after I have typed them is not the best way of publishing an article. See you all!

UPD: Just realised I had my Pi slightly underclocked. I think I'll retract that statement for a while =)

Oh, and it does crash when you shine a camera flash at it! I was wearing headphones when I decided to check if it does, triggered the flash and was deafened by a sudden buzz in headphones (was listening to music, probably PWM had continued with the same frequency as it had at the moment of crash). 0/10 wouldn't flash again.

I received my kit yesterday. Everything was there except for the 4 items missing from everyone's kits:


  • Raspberry Pi Model A+
  • Cirrus Logic Audio Card
  • MicroStack Base Board
  • Shim RTC

Now to get back to the design process. I will be using the Raspberry Pi 2 as the main unit for the body armor sensors (both ambient and body temperature) and the heads-up display. The GPS and accelerometer will be controlled by the Raspberry Pi B+. The weapon-mounted camera will be controlled by the Raspberry Pi A+, provided the missing components arrive, and all components will be connected wirelessly.

                                                 More to follow soon

I am in the phase of assigning roles to the electronic components, and the planning should close today with the first timeline for the sub-project definitions. I plan a series of posts explaining the reasons for the choice of every sub-architecture. These considerations are open to any contribution, suggestion or criticism.


After an in-depth check of the features of the BitScope oscilloscope, the facts confirmed what I had supposed before reading the documentation.


General considerations

The Meditech acquisition unit should be seen as a series of different modules. There is a set of "standard" probes retrieving basic health information using commercial sensors (e.g. blood pressure, temperature, heart rate) and a set of more uncommon probes that can be used depending on the needs. These probes (e.g. ECG terminals, to cite the simplest) should be acquired with high precision.

As a matter of fact, the hardware that fits this task best (almost perfectly) is the BitScope, not least for its reasonable price, essential electronic design and very small size, with a more than acceptable sampling quality.


Bitscope flexibility

The flexibility of the tool - that is, the ability to acquire a wide range of different kinds of data, digital and analog - can be considered the key factor in the choice.

But the most important aspect, beyond the "natural" role of the hardware, is that it is a general-purpose data acquisition and analysis tool: for a certain number of probes it is the perfect electronics!


The next problem, controlling the acquired signals, is already solved: first, there is compatibility with the Raspberry Pi devices, and second, there is a full set of APIs for taking control of the device and programming it as each measurement requires.


To guarantee the best timing and data resolution, at least during the initial design and prototyping, the less powerful Raspberry Pi will be dedicated to controlling and managing acquisition from the BitScope as an independent subsystem of the architecture.


Today I managed to get the Raspberry Pi B+ up and running. I had some initial trouble with the SD card I was using - the adapter was buggy, but luckily I had a spare. I don't know if anyone else has run into this problem, but when I ran the latest version of NOOBS on the B+, I got a warning saying that Raspbian was not compatible with it. The installation was flawless otherwise and I have not had any issues since.


I decided to start with the Xtrinsic MEMS board. It was a little difficult to get it to fit onto the GPIO pins, as I was afraid of breaking it. Not only that, the little pins that stick out on top hurt! After some effort, it locked in securely. It hangs over the HDMI port, so I had to run the Pi through SSH (my preferred method anyway).


The setup was super easy. I didn't follow the directions on the pamphlet that comes in the box, because I don't think it's really the best way to do things. They want you to download a special install image onto your SD card that includes the modifications that will run the board, but the steps to do it manually are incredibly easy. Plus, I had already spent a few hours getting NOOBS onto my SD card (erasing a card is slow business - try doing it twice). The directions I followed can be found at Raspberry Pi Spy.

There are several example scripts that demonstrate the capabilities of each of the sensors on the board. All of the scripts worked perfectly right out of the box. I scanned the code for the accelerometer and it looks really easy to understand and work with. Modifying it should be a really straightforward (dare I say... enjoyable?) task. There is an option to send sensor readings to the browser; I have not set this up yet as I need to set up a web server first, but it ties in extremely well with my design idea.
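As an example of the kind of straightforward modification mentioned above: assuming the example scripts expose x/y/z accelerometer readings in g (an assumption; this is a sketch, not the actual example code), a tilt-angle helper is only a few lines:

```python
import math

def tilt_degrees(x, y, z):
    """Angle between the measured acceleration vector and the z axis.
    0 degrees = board lying flat, gravity entirely on z; 90 = on edge.
    Inputs are accelerometer readings in g (hypothetical interface)."""
    return math.degrees(math.atan2(math.hypot(x, y), z))

# Flat on the table: all of gravity on the z axis.
print(tilt_degrees(0.0, 0.0, 1.0))  # 0.0
# Tipped fully onto its side.
print(tilt_degrees(1.0, 0.0, 0.0))
```

Feeding the live sensor values into something like this is the kind of change that turns the demo script into the "is the box upright?" check a project actually needs.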

So far, I am really impressed at how easy the board is to use right out of the box. I have a side quadcopter project and the sensor board I am using is not this "plug and play" friendly. You really have to get your hands dirty to get that going. I really like that kind of work, but for rapid prototyping, a board like this one definitely has its advantages. I'm guessing that the rest of the boards in the kit are going to be similar. I should say, I've never worked with development boards before on the Raspberry Pi, so this is a whole new world.


PizzaPi: The Technology

Posted by dmrobotix Apr 24, 2015


I received my kit sometime in the afternoon yesterday. The box came with a printout of the inventory that was to be found in the box. This totaled 12 items but my box only contained 11. It seems I am missing the Microstack GPS module. I've sent a few emails so hopefully that will get sorted out.


There are four other pieces from the kit listed online and I'm not sure if those parts will be arriving or not. I know a lot of people planned to include the audio card and that was not included in the box that was sent to any of us (as far as I can tell). Also, the baseboard for the Microstack modules was not included, and it seems like the GPS and accelerometer modules are supposed to run off the baseboard. The other two missing pieces are the RPi A+ and Shim RTC.


As promised, this post highlights the technology that I proposed to use to make PizzaPi possible; the following is taken from my original design proposal.


PizzaPi will include the following from the design kit:

  • Raspberry Pi Model A+
  • Raspberry Pi Model B+
  • PiFace Control & Display 2
  • Shim RTC
  • Microstack Baseboard
  • Microstack GPS
  • Microstack Accelerometer


  • PizzaPi will also use a temperature sensor that is not included in the kit.
  • It will use WiFi dongles to tether to the smart phone's Internet connection.
  • The two Pis will share information with each other and the smart phone via WiFi.
  • The box will incorporate a battery pack to power the devices.
  • The box will be custom designed and 3D printed to hold the cardboard box, the two Pis and accessories in place.



Clearly, some things are changing based on what is physically in my possession. The Microstack set is incomplete, so I am replacing it (for now) with the Xtrinsic MEMS board. It has everything I need, minus GPS. The other thing I am replacing is the RPi A+ with the B+. Nothing major, but I have to figure out what to do about getting GPS into the mix. The next post will discuss my first attempts at getting the Xtrinsic MEMS board working with the RPi B+. Thanks for following! This is a lot of fun!


I found out earlier in the week that I had been selected to take part in this design challenge. When I was trying to think of an idea I wanted to do something different, but I was struggling to think of anything much different from the examples (and couldn't work out how to make an actual lightsabre). Eventually, in this search for difference, I thought about the way technology is shown to evolve in the steampunk genre of sci-fi.


In the genre, steam power, airships and Victorian costumes are the order of the day. New technology is mixed in with the old and everything is very stylised. The video below is a quick crash course in modern steampunk films.




Some of the more popular recent films have good examples of steampunk technology. The League of Extraordinary Gentlemen features Captain Nemo's submarine, the 'Nautilus', and there are some interesting ideas in Sky Captain and the World of Tomorrow (trailers below).




The work of Jules Verne and H.G. Wells is very influential (hence the predominance of Victoriana), and this would be an incomplete introduction to the genre without mention of the 60s film 'The Time Machine'. The time machine itself is a really interesting design (and featured in an episode of 'The Big Bang Theory') which in some ways centred my thoughts on what I was aiming for.


These films and books have the gentleman adventurer at their heart, and that is what gave me the inspiration for my project. Any adventurer worth his salt needs to know where he is going, and there are very few 'dash and élan' points to be gained from using a simple map. A vastly over-complicated machine with suitably dramatic operation will be much more the sort of thing.


So the device I am creating is Prince Dakkar's patent log-taking chart compass, a device that will revolutionise the way an adventurer finds his way around the world. There will be levers and wheels to turn, the display will be like an olde-worlde map (think 'here be dragons'), and the route indication will be like the scenes in films where the characters 'travel by map'. The operation will be suitably dramatic, and the finished product will have lots of gears, wheels and levers as part of its integral operation.


I have added a few pictures below to show some of the stylistic influences for this project.


compass-and-monocular.jpgscreenshot_10.jpgimages (3).jpg tb799a2_airship.jpgrain_white_cars_buddhism_street_the_league_of_extraordinary_gentlemen_desktop_1400x933_wallpaper-287718-1.jpgNautilus-Jules-Verne.jpg



I will attempt to document fully each of the sections of the project here and will make the code available for anyone mad enough to want to make their own version. I now have a big box of parts to investigate and some extra parts to order.


This project is broad and involves several disciplines and different approaches, so first of all it has been divided into sub-projects in accordance with the general description of what should be achieved.

Some parts should follow a chronological order, but other steps can overlap and will be joined together as the development requires.


For all the interested followers: I am defining a more detailed and organised project architecture on a trac system that will host notes, documentation and related material following the development timeline. These blog posts will be the more concise reports of the steps reached. So, if you are interested in seeing the project more in depth, you can access the following link and log in with the email pass element14 to gain access as an observer of the project (no interaction is provided with this user). If any of you want to cooperate with suggestions, criticism and questions you are welcome - send me a private email for a personal user account.

These blog posts are definitely the official content for the prototype development (changed on May, 17, 2015)


The next blog posts will refer to the mentioned planning and will reflect the state-of-the-art of this project.


Adopted development and documentation tools and packages

The following is a list of the tools that will be adopted during the development lifecycle, favouring open-source products and platforms as much as possible.


Firmware and software development

  • If needed, the Linux environment on the Raspberry Pi will be Qt-based, using the QtComplex framework that I developed some time ago with the Qt Nokia crew and have maintained ever since.
  • Eclipse + ADT plugins for the Android development, Android version 4.4+
  • Eclipse with the C++ environment for the Linux side on the RPi devices
  • Specific development platforms where the hardware requires them


Hardware design and prototyping

  • PCB prototypes, as well as all the supports and mechanical components, will be created with a milling machine controlled by Mach3 on a dedicated Windows platform
  • Circuit schematics, design and PCB layout will be created with Eagle 7.x



To speed up and simplify the documentation work - excluding the designs and the hardware implementation and testing, which are documented with graphics, exported images, videos and photos - all the documentation is included in the software sources, and the final documents are generated with the Doxygen + LaTeX tools. The documents will be attached to blog posts and included in the tasks, and the project follow-up will also be available on the Balearic Dynamics reference site.


Software sources

All the software source packages are versioned under Git, but as of today I am not yet sure where the open-source components will be hosted. Now that Gitorious has closed, I am leaning toward GitHub.

Previous posts for this project:



Project Update


It's been only three days since the announcement of the finalists for the Sci Fi Your Pi Design Challenge and I already received most of the kit! On Tuesday I received a notification for a package from Farnell being sent my way, and on Wednesday it arrived. That's some quick despatching!




This package already contains many of the components we are receiving for this challenge:

  • Raspberry Pi 2 Model B
  • Raspberry Pi Model B+
  • PiFace Control and Display 2
  • PiFace Digital 2
  • Raspberry Pi Camera Module
  • Xtrinsic Sensor Board
  • MicroStack GPS
  • MicroStack Accelerometer
  • WiPi dongle
  • BitScope Micro Oscilloscope
  • GertBot
  • ChipKit Pi


Items which were not received yet based on Sci Fi Your Pi Design Challenge - Kit List, are:

  • Raspberry Pi Model A+
  • Cirrus Logic Audio Card
  • MicroStack Base Board
  • Shim RTC


Nothing blocking, as there is already plenty to get started with. I'll be spending time exploring some of the parts this weekend, so I should have more content to share with you early next week!

I won't waste much time posting about the kit I received; I will just say that I am missing the GPS module, which it appears others did get. I really need it for this project! I've sent a couple of emails and I am sure it will get resolved one way or another.

The other thing I will say is that I took the SD card out of my RPi Model B and plugged it into the RPi 2. It booted right up and all is well. I did a sudo apt-get upgrade and it installed the SMP kernel and other stuff specific to the RPi 2.

So that's all I will say about the Kit.


Here is the body I ordered; it is made of EPP foam. EPP foam is very tough and rubbery, but requires some structure to keep its shape. You can crash it all day long and it won't bang up too badly. Other types of foam will turn into packing peanuts!


Once I have this prototype working I will get a better quad body but in the interim I felt this is a good way to start as I expect some crashes.

Actually, it's not that big overall. The motors are about 400mm apart, which is a normal size for a quad. However, from end to end it is 800mm across due to the foam wings! If you want to get started in quads, this is a good way to go.

I did look into making my own using wreath rings and carbon fiber arrow shafts, but they are way too heavy. For the cost of this frame it's tough to justify building one from scratch unless you are very proficient, which I am not!


I picked some hardware such as motors and props, using my best guess on the total weight. I plan to use a 4S LiPo instead of a 3S to get more power to lift the gear I will be putting into it.

All in all the quad cost about $250 all set up. I see there are now ready-to-fly quads for $300 that come with a radio, have GPS installed and can fly waypoints as well as return to home. Very cool stuff, but the waypoints are way too general to be used for security purposes. Personally, I find flying quads to be boring. However, trying to get one to fly by itself and actually do work by itself - now THAT is fun!


Well, I received my Sci-Fi-your-Pi kit today. I unpacked it and the inventory is mostly complete; about four items identified in the kit list are not there. Now down to business. The first order of business is to determine a basic structure and layout design for the project, as I want the finished product to appear. The dimensions of the enclosure are going to determine the placement of the specific sensor modules. Sensor placement is of paramount importance to me, both for aesthetics and functionality. In the original ST series, I don't recall ever being able to actually "see" any of the sensors on the science tricorders or the medical tricorder, with the exception of the small handheld accessory Dr. McCoy occasionally used to check a patient's vital signs and general health. While the sensors I intend to use are relatively small in size, internal placement will partially determine their effectiveness.

A note on the key concepts

The key concepts of the Meditech project revolve around the creation of a multifunctional measuring system for health and disease support, especially in cases where immediate intervention is difficult. Probably the closest fictional representation of this concept is the tricorder used by Dr. Leonard McCoy to diagnose almost anything (of any species the Enterprise meets during its travels). See more here: Medical tricorder - Wikipedia, the free encyclopedia

This inspiring idea has already moved a considerable number of average and big players, but - as far as I have seen until now - proposals and ideas are oriented in directions that are not very aligned with my personal vision of what science and technology will be.

Anyway, setting the philosophical considerations aside, I see that very few things have already been developed in this direction, and few of these are products today. The other aspect - investigating this apparently low-interest niche of biomedical device applications - is that the possible features seem limited to a very generic range of diagnostic devices: heart rate, temperature, blood pressure and a few more.

The core of Meditech is to make a dedicated, small and integrated application able to acquire data from more sophisticated probes, reaching a more complete diagnostic system that can be applied in conditions where a traditional diagnosis is slower, extremely difficult and in many cases impossible to make in a reasonable time.

[As some parts for the project diagnostic probes are under evaluation, this part is subject to future modifications and updates]


The second essential feature of Meditech

A portable diagnostic first-aid tool will be used, not rarely, in extreme conditions - e.g. accidents, earthquakes, wild areas, etc. - where the injured can hardly count on a perfectly prepared medical crew. Here we are thinking of totally different places than a hospital. So the ability to connect with potentially any Android device to manage data and exchange information in real time with medical support located remotely can be a decisive solution: the on-site group can count on remote support, exchange live data, and be helped and guided to operate in the best possible conditions despite the objective difficulties induced by hostile factors impossible to control.
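As a rough sketch of that real-time exchange, each acquisition could be serialised to JSON before being sent to the remote support station. The field names and units below are my own assumptions for illustration, not a Meditech protocol:

```python
import json
import time

def vitals_packet(heart_rate, temperature, blood_pressure):
    """Encode one acquisition as a JSON string for transmission to the
    remote medical support. All field names are hypothetical."""
    return json.dumps({
        "timestamp": int(time.time()),          # when the sample was taken
        "heart_rate_bpm": heart_rate,
        "temperature_c": temperature,
        "blood_pressure_mmhg": blood_pressure,  # (systolic, diastolic)
    })

packet = vitals_packet(72, 36.8, (120, 80))
print(packet)
```

A text format like this keeps the on-site unit decoupled from whatever Android client or transport (WiFi, mobile data) is available at the scene.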



To conclude this short premise: what I mean is that the idea aims to be something more than a mere exercise in integration, and I hope it will be possible to go many steps further. In the next few lines I explain the reasons that I consider more than sufficient to make this project a reality.


I have already explored the possibility of having some local structures (I live on the island of Ibiza, Spain) available to test the prototype, producing results to publish and promote it. I suppose - from my experience with past projects - that when the prototype reaches the testing phase it will not be so difficult to find further sponsorship to make a first production run of 10 units. Then, last but not least, every unit sold to a client (e.g. hospitals, environmental companies, etc.) will correspond to the free giveaway of another unit to a non-profit organisation (at the moment I plan to support Emergency, Save the Children and Médecins Sans Frontières).


PizzaPi: The Concept

Posted by dmrobotix Apr 23, 2015


The following is taken from my challenge proposal. I'll post more information about the technology I plan to use from the element14 kit in a future post. So far, I've been researching 3D printing design because I really have very little experience with that, so I'll be learning as I go! Still waiting for my kit to arrive. I live in California, so I suspect my kit still has some miles to travel before it does!

- Margot.



The pizza box in Snow Crash had the ability to communicate client information with a car and keep track of the amount of time elapsed since the reservation was first made.


PizzaPi will:

  • Have the ability to communicate with the driver's smart phone.
  • Serve information to a website that the customer can access for updates.
  • Measure internal temperature.
  • Know when it is not upright.
  • Have a screen on the side to read out sensor data and user data.
  • Store customer information such as telephone number, address, and the order.
  • Have a GPS device that can send position to the customer so they know when the pizza is on its way.
  • Be a durable container that encloses and protects the standard cardboard box.
  • Keep track and store elapsed time from order to delivery and other sensor information.
  • Revolutionize pizza delivery.
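One of the simpler items on the list above, keeping track of elapsed time from order to delivery, could be sketched like this (a hypothetical helper I wrote for illustration, not taken from the actual PizzaPi design):

```python
import time

class DeliveryTimer:
    """Track the time elapsed from order to delivery, as the smart box
    should. Times are Unix timestamps in seconds."""

    def __init__(self, order_time=None):
        # Default to "now" when the order record is created.
        self.order_time = order_time if order_time is not None else time.time()
        self.delivered_time = None

    def mark_delivered(self, delivered_time=None):
        self.delivered_time = (delivered_time if delivered_time is not None
                               else time.time())

    def elapsed_minutes(self):
        # While in transit, measure against the current time.
        end = self.delivered_time if self.delivered_time is not None else time.time()
        return (end - self.order_time) / 60.0

timer = DeliveryTimer(order_time=0)
timer.mark_delivered(delivered_time=1800)  # 30 minutes later
print(timer.elapsed_minutes())             # 30.0
```

The same record would naturally also carry the customer data and sensor log entries the list mentions.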


Although not the original inspiration for my design, I'm pretty sure this pizza guy could have benefited from the PizzaPi!


Just this morning I wrote in a post comment that I was waiting for the UPS tracking number for my kit... and the magic happened! About one hour ago: DRRRRRIIIIIIN!

- Hola?-


... mumble ...

It was the kit!!! For now, enjoy this image - then you'll see something fictional (hopefully)

RPI Kit.jpg

For you smart-minded guys: what's in there that can't be part of the kit?


Kit Question

Posted by trenchleton Apr 22, 2015

Just wondering: I've not received any information regarding receiving my kit. Was there an email I should have received, or was everything taken care of from the info on my application? Just curious.


Project RAED - Week 0

Posted by trenchleton Apr 22, 2015

Hello all! Here's a description of my project, copied and pasted from my application for my convenience! Eagerly awaiting my kit. I intend to pack as much of it as possible into this little fellow.

RAED – Robotic Assistant for Everyday Delegations:

There is a stereotypical piece of sci-fi technology that always seems to be present – the personal robot. The steward bot. The robotic assistant, or maid, or butler. It’s not the only common piece of fantastic tech. There seems to always be a means of Faster-Than-Light travel. There’s usually a teleportation device. These are exciting, but not realistic. Additionally, there is almost always some sort of communicator (often built into a wrist-mounted computer), and some sort of “replicator”. These are not only realistic, but exist! Smart-phones enable us to communicate in more fantastic ways than some sci-fi predictions, and offer incredible computing power. The rise of 3D printing is the precursor to more fantastic replication machines - yet we still don’t have a general purpose household robot. Roombas are a great start, but where is our version of the Jetsons’ Rosie? Our C3-P0? Our CL4P-TP? The duties commonly attributed to this role are within our reach, and with your help, I intend to achieve them through RAED (Robotic Assistant for Everyday Delegation).


To maintain a reachable goal and realistic scope, the robot assistant project must have a defined set of objectives, which are outlined forthwith:

  1. The ability of the robot to navigate its surroundings, and follow the user as needed.
  2. Receiving, interpreting, and responding to voice commands.
  3. The potential to take advantage of the “internet of things” and embedded systems to further automate and integrate with the home.
  4. The intake of basic information about its surroundings.
  5. Basic manipulation.
  6. Recharging capability.


A suite of bump sensors, rangefinders, and pressure sensors in concert with the camera will allow the robot to navigate its surroundings with little danger to itself. It will utilize a rocker-bogie suspension (the same type found in the mars rovers) for increased stability. In order to follow the user, it will utilize RFID. This technology has already been applied to the navigation of mobile robots, and a wearable, dedicated RFID tag will allow the robot to sense and follow the user. RFID navigation is accomplished by using several readers and comparing their relative strengths – a robotic game of “hot and cold”.
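The "hot and cold" comparison described above reduces to a small decision routine. The reader layout, RSSI values and deadband below are illustrative assumptions, not measurements from actual hardware:

```python
def steer(left_rssi, right_rssi, deadband=2.0):
    """Compare signal strengths from two RFID readers (dBm; less
    negative = stronger) and decide a turn direction toward the
    user's tag -- the robotic game of 'hot and cold'."""
    if left_rssi - right_rssi > deadband:
        return "turn_left"    # tag reads stronger on the left
    if right_rssi - left_rssi > deadband:
        return "turn_right"   # tag reads stronger on the right
    return "forward"          # roughly balanced: keep going straight

print(steer(-40, -55))  # turn_left
print(steer(-50, -49))  # forward
```

The deadband keeps the robot from oscillating left and right when the two readings are close; a real implementation would also average several samples to smooth out RSSI noise.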


In order to provide assistance through automation, voice commands will need to be recognized. These will range from simple commands to questions, and the robot should be able to respond to these accordingly, pulling information from the internet or its sensors as needed. Many of these responses will likely pull from available sources, utilizing systems like “Google Now” to provide detailed answers to given inquiries. Using RFID to locate the user and then aim a directional microphone should allow for a small reduction in background noise, and easier interpretation of given commands.


This robot should be able to connect via wifi or Bluetooth to similarly enabled objects. This means that the robot would be able to, through voice commands, access and activate things from doorlocks to automated coffeepots. This interconnectivity of distributed automation allows the robot to act as a “hub” through which other appliances may be easily accessed. This would allow the robot to respond to even more inquiries about the state of the household.


The robot should be able to sense its surroundings so as to provide relevant information to the user such as temperature, pressure, and location. This would be accomplished via the robot’s onboard suite of sensors.


High-level manipulation would place the robot outside of the scope of this challenge, but simple grasping should be easy to automate. This way, the robot would be able to pick up small objects at ground level. This would allow it to pick up litter or something that has been dropped, thereby offering convenience to the user. This would be done in response to a voice command from the user, using a simple, small manipulator. In the case of litter, this could be stored in a small compartment for later disposal.


The robot should be able to recharge itself. This allows for the highest level of automation. In order to do so, the robot should be able to detect a low-battery status and inform the user. An additional RFID tag can be placed into a charging station, and the robot can use this to navigate toward it. Once at the station, orientation and distance sensors can be used to position the robot correctly so as to connect to the charger. Such systems are already in place in other automated systems.
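The recharge behaviour outlined above boils down to a small decision routine. The thresholds and action names here are my own assumptions for illustration:

```python
def charging_action(battery_pct, at_station, low=20, full=95):
    """Decide the recharge behaviour: head for the charging station's
    RFID tag when low, dock when there, resume duties when full.
    Thresholds (percent) are hypothetical tuning values."""
    if battery_pct >= full:
        return "resume_duties"    # charged: go back to work
    if at_station:
        return "dock_and_charge"  # use distance sensors to align & connect
    if battery_pct <= low:
        return "seek_station"     # navigate toward the station's RFID tag
    return "continue"             # enough charge: carry on as normal

print(charging_action(15, False))  # seek_station
print(charging_action(15, True))   # dock_and_charge
```

In practice "seek_station" would reuse the same RFID hot-and-cold navigation used to follow the user, just with the station's tag as the target.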


The speed of operation and low power consumption make the Pi a perfect system to control the robot, and the many available options and documentation allow for a wide range of applications, even in a single system. The goals outlined for RAED are within the grasp of technology which is currently available, and would be an excellent way to utilize the Raspberry Pi system to bring science fiction into reality.


Items arrived: unboxing!

Posted by crimier Apr 22, 2015


Hello! I'm happy to announce that I've received a part of my kit already. Here's a list of what I've got:

- Raspberry Pi 2 Model B and 1 Model B+

- BitScope Micro

- PiFace Control & PiFace Digital

- GertBot board

- Raspberry Pi Camera

- Xtrinsic sensor board

- chipKIT board

- MicroStack accelerometer and GPS boards (without baseboard)

- WiPi dongle


I wonder if the other parts will be sent as well. I know that by the rules the organizers are free to change the kit as they wish; it's mainly that I'd like to know whether to expect some more UPS deliveries... And parts, of course! =) The MicroStack baseboard is something you'd definitely want to have alongside the corresponding modules - for the sake of checking whether they work.

I'll still be waiting for the Wolfson audio card, RPi A+ and RTC. As I have some components I won't be using, I think I'll pair the RPi A+ with the GertBot board and make a robot. I've always wanted to do something like that, and it'd be an excellent side project to couple with my PipBoy! The Wolfson (okay, now Cirrus) card will be quite necessary, as audio features are important to me. There's a better DAC, line-in, a stereo microphone setup, S/PDIF... My PipBoy will certainly have advanced audio playback & recording capabilities =)


So, here's a small overview. First of all... You get an 8GB microSD card with RPi2! Didn't quite expect that. It's pre-loaded with NOOBS. You don't get a card with RPi B+, though, but that's not a problem - cards are quite cheap =) I haven't received an A+ yet. They say it's so tiny and nice, can't wait =)

All the components are packed nicely in a box, with anti-static packaging and retail boxes for each of them. All the safety instructions and stuff like that are there, just as much as you'd need.


I won't be overviewing RPi boards, as people have done it before a lot =) Just will say that it'll be excellent to have 3. All the possible interactions and scenarios... Not to mention I already have 3 RPi boards in different setups around my room =)


Then, there's the BitScope. Its packaging is quite good. Beware - you don't get any analog extras with it, like a BNC breakout or analog probes. It seems you need to get those separately, should you want to debug something analog =( That doesn't seem to be much of a problem, as you can easily make your own adapter for scope probes using just a BNC socket, which I happen to have. It comes with quite nice digital probes and a USB cable, and the scope seems fit for forgetful people like me: it has 2 cheatsheets - one printed on the scope itself, one on a separate cardboard sheet. Nice! I've always wanted a logic analyzer + protocol decoder, and this evening I figured out how it works and successfully decoded UART messages. If anybody needs help, I can make a guide - it was kinda tricky, as I still haven't spotted any nice guide for the BitScope Logic software.
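For the curious, the core of what a UART protocol decoder does with the sampled line can be shown in a few lines. This is an idealised sketch with exactly one sample per bit, not the BitScope Logic implementation:

```python
def decode_uart_byte(bits):
    """Decode one 8N1 UART frame from ideal one-sample-per-bit captures:
    a start bit (0), eight data bits LSB first, then a stop bit (1).
    This is the framing logic a protocol decoder applies to the line."""
    assert len(bits) == 10 and bits[0] == 0 and bits[9] == 1, "framing error"
    value = 0
    for i, bit in enumerate(bits[1:9]):
        value |= bit << i  # the least significant bit is transmitted first
    return value

# 'A' = 0x41 = 0b01000001 -> data bits sent LSB first: 1,0,0,0,0,0,1,0
frame = [0, 1, 0, 0, 0, 0, 0, 1, 0, 1]
print(chr(decode_uart_byte(frame)))  # A
```

A real decoder additionally has to recover the bit boundaries by sampling at the configured baud rate, which is why getting the baud setting right in the software matters so much.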

I've noticed one thing. The analyzer - it's small, it's sleek and nice, as well as waterproof. Might be one of PipBoy additions! I sure will use it many times, mainly debugging hardware.



GertBot board is what I liked a lot. Board is well-designed, docs are available and easy to read, communication protocol is simple yet well-planned, not to mention that boards are stackable. Beautiful!


chipKIT might be tricky, as I still haven't figured out what kind of programming language you need to use to make use of it. But so far it seems to be a nice addition to the kit - should I need to use Arduino shields, this will be the thing.


The XTRINSIC-SENSE board is huge. Like, it has only 3 sensors and all the remaining space is unoccupied... But it will be a brilliant thing for prototyping, I'm sure of that.


MicroStack GPS is something I'll definitely use a lot. I need GPS but don't happen to have any, and this is the one I'll use in the actual PipBoy - for a lot of things, it being a wearable device for daily use. As for the accelerometer - honestly, the board is huge for such a small sensor =) I'll use it for prototypes, then just buy another accelerometer board for the actual device.


PiFace Digital IO... Well, home automation. One more thing to add to the PipBoy would certainly be home automation control, and this shield... needs 5V and has 5V outputs, while all the relays already mounted in my house are 12V. But I'll surely make use of it - it might need some resoldering here and there, such as changing the ULN2803 IC power source and so on =) I just wonder why it has the 40-pin header when it seems to only use the pins that are also available on the 26-pin header. They could've made it work on the previous RPi versions as well -_-


PiFace Control and Display is a device I might just write drivers for, to use with my Wearable Control System framework, as that also uses buttons and a 1602 display, just connected a bit differently =) I plan on making it work, then sharing it so everybody can use it.



PiCamera... No comments. Great as always. I'm planning on making a CSI-to-Ethernet port adapter in order to use longer cables with the camera and be able to connect and disconnect it more easily. But that'll come later; for now I'll just check how it works =)



^Me trying to understand how the logic analyzer's protocol decode works... I managed to get it working =)



^All the parts together. Awesome!


To my fellow competitors and everyone interested - sorry for not having my project description blog entry up yet. University takes a lot of time, but I hope I can get it posted around Friday. Good luck with your projects!

"Help me, Obi-Wan Kenobi. You're my only hope." - Princess Leia


(Star Wars Episode IV: A New Hope)



Holograms have appeared in many works of science fiction, most notably in the Star Wars trilogies. Through these projections, pre-recorded messages could be displayed in the air and Jedi Masters could attend council meetings even when they were away from Coruscant.

These projections were blue-hued and jittery but spoke to the imagination of the audience despite these shortcomings. They were clearly advanced technology, bordering on magic.


Tupac Shakur's appearance at Coachella in 2012 was a sudden reminder to the world that we have had this technology for a while. Through a technique called Pepper's Ghost, an image can be projected onto a transparent surface and appear as if it's really there. In essence, a hologram.


(Image credit: Flickr user evsmitty, CC BY 2.0)


For this design challenge, I will attempt to leverage Pepper's Ghost to give an R2-D2 model a real-life holographic projector to be used for telepresence video conferencing.


Project Overview


The platform

I considered the officially licensed R2-D2 trash can to serve as the base platform, but at 129 dollars, it's too expensive. I'll probably have to build my own. R2-D2 is essentially just a cylinder with a dome on top and two legs at the sides.


(ThinkGeek's officially licensed R2-D2 trash can)

The holographic projector


The projector would be the centerpiece of this project, and is most likely the hardest thing to get right. Pepper's ghost requires a transparent surface angled at 45 degrees to both the viewer and the light source. In this case, the light source is a video projector that will be connected to the HDMI port of a Raspberry Pi. If I really can't pull off a convincing hologram, I can fall back on using the video projector to project on a wall. Let's hope it doesn't come to that. I'll be working on this first, as soon as the projector I bought online arrives in the mail.


Video conferencing

The kit we're given to compete in this challenge includes a Raspberry Pi Camera Board. Together with the projector, that means I've got video input and output. Add a microphone and some speakers into the mix and I can do video conferencing! I can get started on setting up the software for this first using a computer monitor while I wait for the projector to arrive.


(Star Wars Episode III: Revenge Of The Sith)


I'm excited about participating in this challenge. I'm curious to see the progress of the other contestants as well, there are some really neat proposals. Let me know what you think about my project in the comments!




I have a habit of biting off more than I can chew, and this Design Challenge will be no different. Bear with me as I explain the basics of my project. My complete proposal is a 'bit' longer than this particular post, so I will be explaining my evil plan for project VIRUS here. Let's go!


Cool Project Name:

VIRUS (Voice & gesture Instructed Robots & control of Universal Systems)




My proposal is conceptualized as a fusion of JARVIS from Iron Man, a voice-controlled home automation system, and Minority Report's gesture control system, merged with robots such as R2-D2 and C-3PO (Star Wars), Rosie (The Jetsons) and Wall-E (duh!), possibly leveraging the power of IoT by offloading computational components to the cloud, like SkyNet (Terminator). The idea is to create a reconfigurable robot that can be "transformed" into any of the above and can link up to a central computer.

Detailed Proposal


Robots are part of every science fiction movie, comic and novel, and they are designed in a multitude of ways to simplify tasks in our world. Such robots are already being built - the Roomba, the Redhawk and many more. The problem is that these are designed for a specific function, which causes consumers to wonder about their "value proposition". Cleaning, cooking and serving are some of the basic chores in a house. The bigger the house, the larger the tasks, and with time being something we all fall short of, we need help. Additionally, our homes are becoming smart homes that facilitate quicker and more automated lifestyles. I will go straight to the project design.


Each module is explained briefly in the following sections. The overall diagram of the system is shown below and, as a requisite of the challenge, each module's inspiration is also mentioned.


SciFi Pi.png

The most important aspect of this design is the "Transformer" robot, which can be modified to perform a multitude of tasks. In the following sub-sections I will explain the sub-modules to be built, their design and the problems they solve.


The Central Controller - IRIS


Inspiration - JARVIS from the Movie Iron Man Series and SKYNET from Terminator Series


Description: The name IRIS was used for the computer in the sci-fi cartoon series Jonny Quest, and it will serve me here as well. The idea is to have a central computer that acts as the controller and data accumulation point for the entire home. It will sit in my home office/workshop and will accept voice commands via the Cirrus Logic Audio Card, processed by Jasper (JARVIS). It will also run OpenHAB, which will be used to issue commands to the various sub-modules and robots (SKYNET). The ChipKit Pi will act as a layer of abstraction between the RPi and the RF interface used to create a PAN. If the need arises, the RF modules can be upgraded and other means of communication can be introduced, such as two-wire, and the ChipKit can be programmed to accommodate that functionality as well as offload the network management tasks.


It will handle data collection, command dispatch and control of the alarm, lights, robots, etc., as well as voice-enabling the office.


Assistant/Telepresence Robot - MINION



Inspiration: GLADoS From Portal,  R2D2 from Star Wars, Minions from Despicable Me, Minority reports Gesture Recognition


Description: A robot that can see. Using an RPi 2, an RPi camera and image processing with OpenCV, this robot will interpret gestures and relay that information to the Central Controller. It can track a user in a particular area and even follow them. The data is transmitted over WiFi using the WiPi module. To facilitate movement, the GertBoard will be used and a new chassis will be created around it. Additionally, a display will be attached to enable telepresence capabilities.
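The gesture and tracking pipeline itself would be OpenCV's job; as a hardware-free illustration of the underlying idea, here is a minimal pure-Python frame-differencing sketch (on the real robot, OpenCV's `cv2.absdiff` plus contour detection would do this properly on Pi camera frames — the tiny frames and names here are purely illustrative):

```python
# Minimal sketch of the frame-differencing idea behind motion/gesture
# tracking: compare two grayscale frames, collect the pixels that
# changed, and estimate where the movement happened.

def motion_regions(prev, curr, threshold=30):
    """Return the set of (row, col) pixels that changed significantly
    between two grayscale frames (lists of rows of 0-255 ints)."""
    changed = set()
    for r, (row_a, row_b) in enumerate(zip(prev, curr)):
        for c, (a, b) in enumerate(zip(row_a, row_b)):
            if abs(a - b) > threshold:
                changed.add((r, c))
    return changed

def centroid(pixels):
    """Average position of the changed pixels - a crude 'where is the
    user' estimate the robot could steer towards."""
    if not pixels:
        return None
    rs = sum(p[0] for p in pixels) / len(pixels)
    cs = sum(p[1] for p in pixels) / len(pixels)
    return (rs, cs)

# Example: a bright blob appears on the right side of the frame.
frame1 = [[0] * 8 for _ in range(4)]
frame2 = [[0] * 8 for _ in range(4)]
frame2[1][6] = frame2[2][6] = 255

moved = motion_regions(frame1, frame2)
print(centroid(moved))   # → (1.5, 6.0)
```

Steering toward the centroid of the changed region is the simplest possible "follow the user" policy; gesture interpretation would then classify how that centroid moves over time.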


Service/Cleaning Robot - ZOMBIE



Inspiration: Wall-E, Rosie from the Jetsons


Description: To perform regular tasks, a reduced robot is required. This robot can be transformed as needed. The basis is a chassis, motors and a driver that are just the right size. It can perform regular tasks such as serving dinner, cleaning the driveway and acting as a sentry. It is designed around a base with an RPi B+, WiPi, PiFace CAD (for task selection) and some sensors for following predefined paths. Line-follower robots are a good example, and this will use a similar marker system to navigate waypoints. As a service robot, it can shuttle between two waypoints at the dining table and the kitchen; the same applies when it must act as a sentry. A fluid sensor on the bottom can detect spills and leaks. And as a cleaning robot, it will use a zig-zag waypoint pattern to push leaves off the driveway.


Room Panel Module - CONTROL



Inspiration: Wall panel from Iron Man


Description: To complete the home automation experience, I need an add-on that connects my Central Controller to the wall socket from where everything is conventionally controlled. This module does that. It consists of an RPi A+ based system that connects to the central controller over Ethernet. It uses the PiFace Digital to control power appliances on the wall switch panel. I will use the CapSense board from Cypress to replace mechanical buttons with touch buttons, and add an IR transmitter to control the TV and air conditioner in the room. Additionally, a PIR sensor will enable the detection of movement in the room, which will be used for the alarm system capabilities.


The Surveyor and Poseidon

TBD -  Will explain these as they come to life. HINT. I am already working on these!


Project Planning


I am NOT planning a lot of things in this challenge, as doing that makes it a chore. I like making things, and I will follow that route to build these modules and have as much fun as possible. Over these three months I will try to produce some tutorials so that "you watching at home can follow along". The design is meant to be a useful, feasible, efficient and cost-effective solution to the given problems, and I hope I am able to provide clear and sufficient documentation so that it can be replicated by other community members.


Let's get to it then!




I wanted to provide a bit more description from my application to help people understand what I am trying to do.


Reason for the Design Challenge:


I love programming, I love radio controlled aircraft, and I love electronics.  This project allows me to combine all 3!

Quadcopters have been around for many years. However, in recent news the FAA and the media have created quite a bit of drama around "drones", by which they mean quadcopters and their variants. I have been in the radio-controlled hobby for years now, and I get upset with the negative publicity because of the sheer amount of ignorance and fear involved. I feel that my project can help shed a positive light on quadcopters and show that my quadcopter can truly be called a drone. It will not present any threat to anyone's privacy. It will show a positive use of a quadcopter and demonstrate that they really only do exactly what we tell them.


I will build an automated security guard quadcopter called the QuadCop that will allow one to fly the quadcopter manually, record waypoint macros and play them back later. The waypoint macros are a series of GPS coordinates and other information recorded while manually flying the QuadCop. This allows the QuadCop to fly around obstacles in a small area and perform security checks using a variety of sensors, including motion detectors, sound recorders and flame detectors. The QuadCop will be able to land and will be on a timer to perform security checks at regular intervals. It will have the ability to record photos and send text messages or emails upon certain events. Bright LEDs will be used to scare intruders away, as well as pre-recorded sound bites.
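As a rough illustration of the waypoint-macro idea (not the actual QuadCop code — the class and radius below are hypothetical), GPS fixes can be stored during manual flight and a waypoint counted as "reached" during playback once the live fix is within a few metres of it, using a haversine distance check:

```python
import math

# Hypothetical sketch: record GPS fixes while the recording switch is
# on, then during playback consider a waypoint "reached" once the live
# fix is within reach_radius_m of it.

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon fixes."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

class WaypointMacro:
    def __init__(self, reach_radius_m=3.0):
        self.points = []                  # recorded (lat, lon) fixes
        self.reach_radius_m = reach_radius_m

    def record(self, lat, lon):
        """Called for each GPS fix while recording a flight path."""
        self.points.append((lat, lon))

    def reached(self, index, lat, lon):
        """True if the live fix is within reach_radius_m of waypoint index."""
        wlat, wlon = self.points[index]
        return haversine_m(wlat, wlon, lat, lon) <= self.reach_radius_m

macro = WaypointMacro()
macro.record(40.0000, -75.0000)
macro.record(40.0005, -75.0000)           # ~55 m further north
print(macro.reached(0, 40.00001, -75.0))  # ~1 m away → True
print(macro.reached(1, 40.00001, -75.0))  # ~54 m away → False
```

Connecting recorded paths into a random patrol pattern then reduces to choosing which stored macro to play next once the last waypoint of the current one is reached.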





Imagine you love to fly radio-controlled aircraft and you have a nice flying field available, with a runway, clubhouse, storage shed and all the tools necessary to keep the field up. It took years to get the club into this condition.

By nature of the RC hobby, the flying field is located away from civilization for safety reasons.  Further, your club only owns the small acre of land the runway and facilities are on and not the vast fields used as a fly over area.


Unfortunately, there has been vandalism at the field too many times. The tool shed has been broken into and things stolen. The picnic tables have been set ablaze. Trucks have used your runway and pit area as a mud track. Putting up a fence around the whole area is impossible, and it would not help with the all-too-frequent need to go retrieve a downed aircraft out in the fields.

Imagine you now have the option to have a security guard at your field that is active during the non- flying hours.  This security guard is called the QuadCop. 


To set up the QuadCop, you only have to fly it manually around your field and show it how to navigate around the buildings and the trees. Using a couple of switches on your transmitter, you can tell the QuadCop to start recording the flight path you are flying. Another switch signals that when it reaches those coordinates it should do a sensor sweep in a 360-degree turn. The smart algorithms programmed into it allow it to connect your flight paths so it can navigate them in a random pattern.

You can point the sensors at the runway, the picnic tables, or even the clubhouse. The sensors include motion detectors, flame sensors, and cameras. The QuadCop even has a "scare mode" in which it can activate an on-board speaker to give trespassers a warning, shine high-candlepower LEDs onto the intruders, take their pictures, and record their voices.

Your club has a WiFi connection, so the QuadCop can send email and text messages about what it is detecting. Another club using the QuadCop opted to get a $35-per-month smartphone to give the QuadCop 4G access, since no WiFi exists there.

The QuadCop has a roomy area to land on top of your clubhouse, where it is protected from the elements and theft. It uses a sonic sensor to sense the landing pad for a comfy landing. A future enhancement will become available: the QuadCop base station, which will shield the QuadCop, recharge it and download its data for safer keeping.


For now, the QuadCop can run for days without a charge thanks to its large Lithium Polymer (LiPo) batteries. Built-in safety features are designed to keep things safe; in a catastrophic failure, such as long-term loss of GPS, the QuadCop simply shuts down.


A web interface allows you to configure the landing points and boundaries for the QuadCop, as well as emails, texts and other important settings. The times of operation and how often to perform checks are also configured here.


You may or may not scare the thieves away, but you now have their pictures, and you were able to get out to the field before they got too far away. Justice now has a chance.

In another case the intruders were scared off and no damage was done.  The video the QuadCop recorded was played back on the local news to give everyone some good entertainment.




Testing and programming have begun. I will be replacing the RPi B with the RPi 2 when I get it. The Arduino Micro will be replaced with the Pro Mini; it is just being used for convenience for now while I test code.

Star Wars was THE sci-fi movie of my childhood and is still one of my favorites. The Star Wars universe represents a great source of inspiration, from starships to robots, weapons, armor, clothes and various gadgets.

The source of inspiration for my project is the "Borg Construct Aj^6" device, used by Lobot in Cloud City in Star Wars Episode V: The Empire Strikes Back.

Besides Lobot from the original Star Wars series, the Aj^6 device was used by other characters from the extended Star Wars universe.


The "official" description of the Borg Construct Aj^6:


The Aj^6 was a sophisticated cyborg construct that allowed a being to become a cybernetic computer interface liaison able to mentally control computer systems.

The device was implanted against the skull, and sent nanothreads into the brain to form a link between the biocomputer unit and the cyborg. The Aj^6's cyborg computer was known to noticeably improve the intelligence of the wearer, enhancing logic and reasoning capabilities. The cyborg could also analyze data at roughly twenty times the speed of a non-cyborg computer operator.

The Aj^6's internal computer stored vast amounts of data, though additional information could be accrued with the use of knowledge cartridges. These devices plugged into ports on the back of the Borg Construct, and held data on virtually any subject. This allowed cyborgs to load and process data as and when needed.


Despite the advantages of a Borg Construct, the Aj^6 came under criticism since it was seen to limit a wearer's personality, and almost literally turned them into walking machines.

This resulted in a lack of face-to-face communication between a cyborg and other sentients, since their attentions were often turned to central computers.


I believe this device is a good model of a SciFi device which can be turned into a real world tool.


Used components description:


At the core of the AJ^6-RW will be a RaspberryPi 2 unit. To connect the RPi2 with the user and the surrounding environment, the following components will be used:

- RaspberryPi camera - for taking pictures and video recording

- PiFace Control & Display - for system settings and information and as main user input/control panel

- SHIM RTC - real time clock

- WIPI - wireless LAN connectivity

- USB sound card + microphone - for audio output and voice commands input

- high power LED controlled by RPi2 GPIO + FET for directional light

- RGB LED strip - for visual signaling and ambient lighting

- Stereo headphones - for audio menus and music

- Roving Networks RN-52 Bluetooth audio module - for wireless smartphone audio connectivity

- Pololu Adjustable 4-12V Step-Up Voltage Regulator U3V50ALV - provides 5V/5A for the RPi and peripherals

- motion sensor

- possible use of  Microstack GPS for navigation/position

- Li-Ion or Li-Po batteries for power

- USB Card Reader

- USB Hub and/or USB extension cords to bring RPi USB port outside the case

- charger


Depending on the connection requirements with RaspberryPi, additional boards might be used for extended functionality.


Functional description:


The skull implants and brain-connected nanothreads will be left aside for now, but to allow the user to communicate with the device, the RaspberryPi will run Python and shell scripts that read the PiFaceCAD buttons and launch commands and scripts according to the user's actions. Menu navigation will be aided by the PiFaceCAD display, and acoustically through the headphones, so commands can be launched without needing the display.

The PiFaceCAD panel can be folded out from its default position so that the display can be seen and operated by the user when needed.
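The menu logic can be modelled independently of the hardware. In this hypothetical sketch (the entries and actions are made up for illustration), the physical PiFaceCAD buttons would simply be wired to next/prev/select, with the current label echoed to the 16x2 display and spoken through the headphones:

```python
# Hardware-independent sketch of the button-driven menu: a list of
# (label, action) entries navigated with next/prev/select. On the real
# device, button events would call these methods and the current label
# would be written to the display and read aloud via text-to-speech.

class Menu:
    def __init__(self, entries):
        self.entries = entries            # list of (label, action) pairs
        self.index = 0

    def next(self):
        self.index = (self.index + 1) % len(self.entries)
        return self.current_label()

    def prev(self):
        self.index = (self.index - 1) % len(self.entries)
        return self.current_label()

    def select(self):
        """Run the action behind the current entry."""
        label, action = self.entries[self.index]
        return action()

    def current_label(self):
        return self.entries[self.index][0]

menu = Menu([
    ("Camera",  lambda: "taking picture"),
    ("Radio",   lambda: "starting internet radio"),
    ("Battery", lambda: "battery at 80%"),   # placeholder reading
])
print(menu.next())     # → Radio
print(menu.next())     # → Battery
print(menu.select())   # → battery at 80%
```

Because the menu state is plain Python, the same object can drive both the display-based and the audio-only interaction described above.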


A microphone will be used for speech recognition and vocal commands.


The audio sub-system will be used for audio feedback from applications, internet radio, a media player for music/audio files, a document reader, sensor readings, battery status, etc.

The headphones are also used by the RN-52 Bluetooth audio module for the smartphone connection, in order to listen to music streams and talk on the phone.


The original Aj^6 device was conceived to use "knowledge cartridges" to offer various data to its wearer. My implementation will use a card reader and USB extension cords/USB hub to bring the RaspberryPi's USB ports to the front of the case. Besides USB memory sticks, these ports can also be used for other USB-connected devices.


On the outside of the AJ^6-RW case, an RGB LED strip will be fitted for signaling purposes or to provide ambient lighting. A high-power LED is placed on the front left of the device to provide directional light when required, or to improve RPi camera image quality in low light. This LED is controlled by an RPi GPIO pin and a FET.

For example, when biking it can be used to signal braking or a change of direction. When camping, the strip can serve as ambient light and the front high-power LED as a directional light.


On the back of the AJ^6-RW case are the charging connector, the RaspberryPi's LAN port and a motion sensor, so no one can sneak up behind you unnoticed.


By default, the RaspberryPi's WiFi connection will be configured in normal host mode, so it will use a wireless access point for remote connection, internet access, etc.

If needed, it can instead be configured to allow other WiFi-enabled devices to connect to it. Alternatively, a second RaspberryPi can be configured as an access point and turned on only when needed, in order to save power.

An Apache or Lighttpd web server will give access to the AJ^6-RW camera, sensors, lights and battery status.


For power, a number of Li-Ion/Li-Po cells will be used along with a step-up voltage regulator capable of providing 5V at about 5A. The number of cells will be chosen to strike the right balance between weight and autonomy, but the user can modify it to favor one or the other.
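As a back-of-the-envelope illustration of that weight/autonomy trade-off (all cell figures below are assumptions for illustration — 18650-style Li-Ion cells at 3.7 V / 3400 mAh / 48 g each, a roughly 85%-efficient step-up regulator, and an average 5 W system draw — not measurements of the actual pack):

```python
# Estimate runtime and weight for a pack of n Li-Ion cells.
# All defaults are illustrative assumptions, not measured values.

def pack_estimate(n_cells, avg_load_w=5.0,
                  cell_v=3.7, cell_mah=3400, cell_g=48, reg_eff=0.85):
    energy_wh = n_cells * cell_v * cell_mah / 1000   # stored energy, Wh
    usable_wh = energy_wh * reg_eff                  # after regulator losses
    hours = usable_wh / avg_load_w
    return round(hours, 1), n_cells * cell_g         # (runtime h, weight g)

for n in (2, 4, 6):
    print(n, "cells:", pack_estimate(n))
# → 2 cells: (4.3, 96)
# → 4 cells: (8.6, 192)
# → 6 cells: (12.8, 288)
```

With these assumed figures, each pair of cells adds roughly four hours of runtime at the cost of about 100 g, which is exactly the knob the user would turn to favor autonomy or weight.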

Through a 5V USB connector, the AJ^6 can supply power to charge a smartphone or another small electronic device.

AJ^6-RW can be used as a hub for other wearable components, like lights, sensors, motors, servos, etc.


Attached are a few pictures of the original device and the way it was used in the movie, along with a number of views of an approximate sketch of the device as I imagine it.

First of all - MANY THANKS!!!

Second of all - sorry about my poor English (I taught it to myself).

So, as everyone said before, I'm VERY HAPPY to have been selected to participate; that deserves a celebration...

So, let's get to the hard work: I'm attempting to make my "Threepi" or "ThreePiOne" head with resin or fiberglass (my buddy is doing that from the drawings I gave him).

The code part is "tricky" (OK, it's been hell, but it's worth it) - next month I will have some acceptable work done on it. (Not enough time, since I'm helping my bride with our wedding (on 20/06), SO I have a lot of work...)

I'll keep everything posted here...



Hi, Welcome to my Blog on the creation of I Ching Hexagrams using a Raspberry Pi.


The challengers have only just been announced, so I don't have the Design Kit yet. Farnell have sent me an e-mail, though, telling me that it will be delivered some time over the next couple of weeks. This introduction is a collection of snippets to give you an idea of what to expect in the run-up to the August deadline for project submission. I do have a day job as an engineer, but I have not been a professional designer since the days when microprocessors only had eight bits (or 16 if you had loads of dosh to splash) and Real Programmers programmed in FORTRAN or, if you were really good, native assembly language (was that only 30 years ago?).


The idea for the project is based on a computer that features in the Illuminatus! Trilogy by Robert Shea and Robert Anton Wilson (1). The computer lives on a yellow submarine owned by the eccentric Norwegian multi-millionaire Hagbard Celine (it was written a long time ago, when a million was actually a serious amount of money that would buy you the best part of a hundred houses, rather than the one or two it gets you in London now).


If you want to find out the computer's real name, you will need to read the books: as this is a family show, and the Competition is based on Raspberry Pi hardware, I have renamed the computer Multi-core Executable Software Solution Using Pi (MESS-UP).


Once I have started the coding, I will make bits available for download via GitHub (link to be announced). Some code will be a direct port of the core program that I created in the late 1990s to run on my Psion Organiser LZ64 (which incidentally still works), but there will need to be a lot of new code to drive the hardware that I will be using from the Design Kit.


First a bit of introduction (with an apology to scholars of Chinese wisdom/philosophy for cutting a few corners in explanation - OK a lot of corners):


If you want to learn more about the I Ching itself, please read the Richard Wilhelm/Cary F. Baynes translation (2) for a more traditional explanation. You could also read 'The Authentic I Ching' by Henry Wei (3), or 'The I Ching Workbook' by Robert Lee Wing (4).


The I Ching, or 'Book of Changes', is an ancient Chinese oracle that consists of casting hexagrams by dividing a set of yarrow stalks or tossing coins, and interpreting the results from a text.


Reinterpreting the fundamental nature of the I Ching for 21st-century science, after reading Illuminatus! and The Tao of Physics by Fritjof Capra (5): each hexagram consists of six lines, each of which is in one of two quantum states - even, or Yin, and odd, or Yang. Given that there are six lines and two quantum states, 64 hexagrams are possible.


Each quantum state has two sub-states: one is metastable and the other is unstable and transforms into the metastable version of its opposite, thus completing the cycle of Life.  This means that each hexagram cast will either remain fixed or mutate into one of the other 63 hexagrams.
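The casting mechanics described above can be sketched with the traditional three-coin method (a simplification of the yarrow-stalk procedure — the two methods give slightly different probabilities for the unstable lines, which the real MESS-UP may care about):

```python
import random

# Three-coin casting: each coin counts 2 (tails) or 3 (heads), so a
# line totals 6, 7, 8 or 9. Totals 7 and 8 are the metastable states;
# 6 (old Yin) and 9 (old Yang) are the unstable states that transform
# into the metastable version of their opposite.

def cast_line(rng):
    """Toss three coins and return the line total (6-9)."""
    return sum(rng.choice((2, 3)) for _ in range(3))

def cast_hexagram(rng):
    """Six lines, built from the bottom upwards."""
    return [cast_line(rng) for _ in range(6)]

def is_yang(value):
    return value % 2 == 1            # odd totals (7, 9) are Yang

def mutated(lines):
    """The second hexagram after the unstable lines have transformed:
    old Yin (6) becomes young Yang (7), old Yang (9) young Yin (8)."""
    return [7 if v == 6 else 8 if v == 9 else v for v in lines]

rng = random.Random()
primary = cast_hexagram(rng)
print(primary, "->", mutated(primary))
```

A cast with no 6s or 9s stays fixed; any unstable line makes the hexagram mutate into one of the other 63, completing the cycle of Life described above.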


Each hexagram is made up of two trigrams: the eight possible trigrams are shown in the graphic below, that represents the T'ai Chi or 'Supreme Ultimate'.  Hexagrams and their upper and lower trigrams are built and interpreted from the bottom upwards, and the significance of each line varies between hexagram/trigram.


Tai Chi.JPG


Back to the project now:


MESS-UP will cast hexagrams and give an interpretation of the result, with both a visual display on the PiFace card and an audio description in the spirit of Hagbard Celine's original computer, which did not have a display and was voice operated. I might go for a Stephen Hawking voice for the project, but as that is so popular, I might go for something different.


Hexagrams are created randomly, so the program calls up a series of random numbers; but as the hexagram cast is a function of the state of the universe at the time it is cast, the random number generator will be seeded from a combination of the real-time clock value, the GPS position of the Pi and the attitude of the Pi determined by the accelerometer when the cast command is issued.
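One way such a seed could be composed is to hash the three readings together (a minimal sketch — the sensor values are hypothetical function arguments here; the real program would read them from the RTC, the Microstack GPS and the accelerometer):

```python
import hashlib
import random

# Fold the clock value, the GPS fix and the accelerometer attitude
# into one deterministic seed, so the same "state of the universe"
# always yields the same cast.

def universe_seed(clock, lat, lon, accel_xyz):
    material = "{:.6f}|{:.6f}|{:.6f}|{}".format(clock, lat, lon, accel_xyz)
    digest = hashlib.sha256(material.encode()).hexdigest()
    return int(digest, 16)

def seeded_rng(clock, lat, lon, accel_xyz):
    """A random.Random instance seeded from the combined readings."""
    return random.Random(universe_seed(clock, lat, lon, accel_xyz))

# Same inputs -> same seed -> the same hexagram would be cast.
a = seeded_rng(1_429_600_000.0, 51.5074, -0.1278, (0.01, -0.02, 0.98))
b = seeded_rng(1_429_600_000.0, 51.5074, -0.1278, (0.01, -0.02, 0.98))
print(a.random() == b.random())   # → True
```

Hashing rather than simply adding the readings spreads even tiny changes in position or attitude across the whole seed, so two casts a moment apart diverge completely.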


If it turns up in time, I might work in the PiJuice solar power unit that I have backed on Kickstarter, so that the Hexagram Generator is truly portable.


That's it for now.  Introduction Part 2 will follow next week, unless the Kit arrives earlier than expected, in which case there will be some progress to report.



1.     The Robert Anton Wilson Website - The Illuminatus! Trilogy

2.     I Ching: Richard Wilhelm Translation, ISBN 0-7100-9527-9

3.     'The Authentic I Ching' by Henry Wei, ISBN 0-7054-2507-X

4.     'The I Ching Workbook' by Robert Lee Wing, ISBN 0-85030-372-9

5.     'The Tao of Physics' by Fritjof Capra, ISBN 9781590308356


Visus Sancto

Posted by sirusmage Apr 21, 2015

For my project I am building a type of monocular on a head-mounted set, for use in paranormal investigations (aka ghost hunting). The main setup will be an IR camera, a microphone and maybe a thermal camera (I am still comparing prices to find the right one). A small LCD screen will be mounted in front of the eye, with the camera on a pan-and-tilt mount. When set correctly for the user, it should give an almost augmented-reality view.

The RPi will do many things: record the video and audio (for later analysis) and also control the pan-and-tilt mount. The name Visus Sancto is Latin for Ghost Sight. It is believed by many that IR and/or thermal cameras can pick up paranormal events that the human eye cannot see. It is my hope that, if this is successful, the user will be able to see these in real time. This would not only allow better understanding but also better research.

I am hoping to make the finished product look like a one-eyed version of "The Schufftein Glasses" from "Hellboy 2: The Golden Army" (see picture below). Please feel free to post any questions or comments that you have about my project.




Hello! I found out this morning that I have been selected as a challenger in the Sci-Fi Your Pi Design Challenge! This is my first time competing in an element14 design challenge -- I'm really excited and ready to dive in.


My design is called PizzaPi, and the idea is to make pizza boxes and pizza delivery more intelligent. The inspiration for my project comes from Neal Stephenson's novel "Snow Crash". Stephenson is one of my favorite sci-fi/cyberpunk/weirdo authors, and it's a real treat to be able to make one of his futuristic ideas come to life.


I don't want to spoil the fun and give too many details at the start, but I will do my best to make each step of the process clear, informative, and hopefully interesting.


For now, please enjoy Dean Martin's rendition of "That's Amore" and eat lots of pizza.



When thinking about science fiction, the things that come to mind are touch/motion controls, things that slide in and out of place, and bright lights. This is why I would like to propose building the desk of the future, inspired by some of the visual effects of the movie Tron: Legacy.






Slide1.pngThe desk should make optimal use of the surface available. This implies that if, for example, a desktop computer is not in use, it should disappear in order to free up the space. The idea here is to have a built-in desktop computer, using the Raspberry Pi 2 and some other accessories such as the Cirrus Logic Audio Card, Pi Camera and WiPi dongle, that slides out of the desk at the touch of a button. The mechanism would be much like the Z-axis of a 3D printer, using threaded rods and stepper motors controlled by the GertBot add-on board. Pressing the button again would cause the computer to slide back into the desk, freeing up space to be used for something else. Ideally, the desktop computer would power on automatically when sliding out and shut down properly when sliding back in.
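The button-toggle behaviour can be sketched as a tiny state machine (the motor and power actions are hypothetical callbacks here, standing in for the GertBot stepper commands and the Pi's power-up/shutdown sequence):

```python
# Sketch of the slide-out desk logic: one button toggles between the
# stowed and raised states, always powering on after raising and
# shutting down cleanly before lowering.

class SlidingDesk:
    def __init__(self, raise_cb, lower_cb, power_on_cb, shutdown_cb):
        self.stowed = True
        self._raise, self._lower = raise_cb, lower_cb
        self._power_on, self._shutdown = power_on_cb, shutdown_cb

    def button_pressed(self):
        if self.stowed:
            self._raise()       # drive the threaded rods up
            self._power_on()    # then boot the computer
        else:
            self._shutdown()    # shut down cleanly first
            self._lower()       # then retract below the surface
        self.stowed = not self.stowed

log = []
desk = SlidingDesk(lambda: log.append("raise"),
                   lambda: log.append("lower"),
                   lambda: log.append("power_on"),
                   lambda: log.append("shutdown"))
desk.button_pressed()   # slide out and power on
desk.button_pressed()   # shut down and slide back in
print(log)              # → ['raise', 'power_on', 'shutdown', 'lower']
```

Keeping the ordering inside the state machine (power on only after raising, lower only after a clean shutdown) is what makes the "ideally" part above safe regardless of which input triggers the toggle.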


Another feature would be to have lights integrated into the desk. I’d like to experiment with lights built into the surface of the desk, consisting of white LED strips which would be toggled on or off using the PiFace Digital 2’s relays. The lights would be laid out in patterns inspired by the Tron Legacy movie. Their purpose should not only be limited to giving light though. For example, the integrated lights could blink briefly when an email is received or when a certain hashtag is used on Twitter.




The above-mentioned features require buttons or some other form of control. This will be covered by capacitive touch sensors hidden in the table: touching the surface at certain locations will act as pressing buttons. I've been experimenting with Bare Conductive's Touch Board, which is basically an Arduino with an onboard capacitive touch chip and MP3 player. Pressing a button would trigger the required action, with an accompanying sound effect. The picture on the side shows some experiments I've been doing with aluminium foil and conductive paint as sensors, with and without a plexiglass overlay.


Finally, once everything is proven to work as expected, I’d like to replace the off the shelf components with a custom built Raspberry Pi HAT, resulting in a more compact, easily wired and cost effective solution. The design of the schematic and layout of the board would be done using Eagle and will be released at the end of the project. In case of problems with the HAT, the BitScope Micro will come in handy for troubleshooting.






The starting point of the desk would be a cheap Ikea desk. Because the features mentioned earlier will be integrated into the desk, the surface will be impacted. To mask these modifications and give the project a more futuristic look, a sheet of white plexiglass would be used to cover the desk’s surface. This will still allow light to shine through and capacitive touch buttons to be triggered. A cutout will ensure the desktop computer can slide in and out of the desk.




The above sketches represent what I'd like to achieve with this desk. The futuristic patterns will be inlaid with LED strips, at the back of the desk the computer will be able to slide in and out of the desk and finally in the front right corner some capacitive touch buttons will be available. The surface will then be covered by a sheet of thick, white plexi, hiding the LED strips and buttons. At first glance, it will look like a normal desk, until a button is pressed ...



The goal of this project is to have something anyone can build at home. At first, off-the-shelf components will be used to demonstrate the desired functionality and will later be replaced by a custom HAT. The HAT will ensure the project is easier to build and more cost-effective.

I hope you like my idea, and if you have any additional suggestions, remarks or questions, feel free to comment below!


Empathy box begins!

Posted by j0h Apr 21, 2015

Hey, apparently my empathy box project has been selected for Sci-Fi Your Pi!

I am really excited to begin this project. I think I will start by re-reading Philip K. Dick's "Do Androids Dream of Electric Sheep?" again, since the last time I read it was in 2003.

I want to provide functionality similar to the described empathy box, with the exceptions that my device shall not actually induce pain (beyond mild audio annoyances) and will be mobile. I want users to be able to take the empathy box with them, rather than having to sit at home and use it.


I'm grateful for this opportunity, and look forward to feeling the same as everyone else!


As an added bonus, I recently ordered and received 12 USB WiFi dongles and 50 RGB LEDs for use in this project.

I have some ideas for a device housing too.


Knight Rider

Posted by scrpn17w Apr 21, 2015

This is my build thread for turning my 1999 Honda Prelude into an emulation of KITT (Knight Industries Two Thousand) from the TV series "Knight Rider". There are going to be a lot of buttons, switches and various indicator lights. I'll add a small screen in the car, both for interfacing with the Pi and for displaying various data/images/videos. Then of course there are the three most recognizable traits of the car: the LED "scanner" on the front of the car, the unmistakable voice of the car's "AI", and the LED "visualizer" of the car's AI system. Since I won't be able to make this quite as awe-inspiring as the actual KITT, and the car I'm using was made in 1999, I'm going to call it the K.I.N.N. (Knight Industries Nineteen Ninety-nine).
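The front scanner is the classic back-and-forth "Larson scanner" pattern, which is easy to generate in software. A minimal Python sketch of the idea follows; the LED count, delay and `set_led` callback are placeholders for whatever hardware ends up driving the LEDs:

```python
import itertools
import time

def scanner_positions(n_leds):
    # One full sweep: left to right, then back again, without
    # repeating the endpoints (so cycling the list looks continuous).
    forward = list(range(n_leds))
    back = list(range(n_leds - 2, 0, -1))
    return forward + back

def run_scanner(set_led, n_leds=8, delay=0.05):
    # Drive the scanner forever; set_led(i) should light LED i
    # and dim the others (hardware-specific).
    for pos in itertools.cycle(scanner_positions(n_leds)):
        set_led(pos)
        time.sleep(delay)
```

Fading the trailing LEDs instead of switching them hard off would get closer to the on-screen effect, but the sweep logic stays the same.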


Intelligent Body Armor

Posted by jlcarender Apr 21, 2015

Intelligent body armor with sensors for body temperature, heart rate and hits, plus a wireless helmet-mounted display with readouts for GPS location, body temperature, heart rate, humidity, wind speed and direction, and armor damage percentage. It would also include a wireless weapon-mounted camera and tracking system using voice commands, with a wireless wrist-mounted keyboard for control, and all components would be networked together. Intended for use in airsoft war games, and maybe for real-life combat implementation, like the armor suit in the video game Crysis.