
With Christmas almost upon us it’s time for our annual Element 14 Christmas Project. Last year we built a Santa Trap that used a Raspberry Pi and camera to photograph Santa as he delivers presents around the globe.

For 2015 we are going to create an IoT (Internet of Things) Christmas tree that will accept commands via Twitter, so we can control our tree from afar.



For this project you will need

A Raspberry Pi A+ or B+ (this project does not work with the Pi 2)

The latest version of Raspbian

A breadboard

A selection of Jumper cables (Male to female, male to male)

SN74AHCT125 level converter

2.1mm female barrel jack adapter

Terminal block

5v 4A power supply

A Neopixel string

All of the code for this project can be found in a GitHub repository

Raspberry Pi Zero Update

This project is compatible with the new Raspberry Pi Zero, enabling a truly low cost alternative that can be easily embedded.


Hardware Setup

First we need to build the circuit for our Raspberry Pi. Start by connecting the 74AHCT125 to a breadboard. Ensure that the pins are spread over the central channel of the board. You may also need to bend the pins so that they fit into the board; this can be done by gently pressing the chip against a solid surface.

Now we move on to wiring up the chip; please refer to the diagram for more information.


Attaching the female barrel jack adapter to our power supply we now have two screw terminals, marked + and -. Unscrew each terminal and attach a length of wire or a male to male jumper cable. Insert the + wire into the + rail of your breadboard, likewise with the - wire. You will also have a spare black wire for your neopixel strip. This is a GND that will also need to be attached to your - rail. Attaching a wire from any of your Raspberry Pi ground pins to the - rail will complete all of the ground connections.

For now do not apply power to your neopixels and double check all of the connections made before progressing.

Software Setup

Power up your Raspberry Pi and boot to the Raspbian desktop. Firstly we will need to download and install a few extra software packages and for this you will need an Internet connection before you proceed.

To update and install the software open a terminal and type the following, then press Enter.
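The commands themselves were lost from the original post; a likely sequence, based on the build tools the rpi_ws281x library needs (the exact package list is an assumption), is:

```shell
sudo apt-get update
sudo apt-get install build-essential python-dev git scons swig
```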

With that complete we shall now download the rpi_ws281x Python library. In a terminal type
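The command is missing from the post; the library is Jeremy Garff's rpi_ws281x on GitHub, so the clone step would be:

```shell
git clone https://github.com/jgarff/rpi_ws281x.git
```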

Then navigate to the downloaded directory by typing
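Presumably the cloned directory keeps the repository name:

```shell
cd rpi_ws281x
```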

Next we shall compile the software library by typing.
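The rpi_ws281x library builds with scons, so the compile step is likely:

```shell
sudo scons
```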

We can now change directory into a new directory called “python” by typing
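That is:

```shell
cd python
```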

Our final install step for the ws281x library is to setup and install the Python library. Type the following.
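The standard setuptools steps for the Python bindings would be:

```shell
sudo python setup.py build
sudo python setup.py install
```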

Installation complete!

Hardware Test

Before we get into the project we need to check that our hardware is working. Ensure that all of the connections to the 74AHCT125 chip, Raspberry Pi and the power supply are correct before applying power to the neopixels. Power up your neopixel power supply; nothing will happen until we run a test. In the Raspberry Pi terminal navigate to the examples directory inside the python directory.
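Assuming the library was cloned into your home directory, that would be:

```shell
cd ~/rpi_ws281x/python/examples
```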

Inside the directory is the strand test example file, and we will need to edit it before testing. Type the following to edit it.
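The file name was lost from the post; in the rpi_ws281x repository the strand test example is strandtest.py, so the edit command would be:

```shell
sudo nano strandtest.py
```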

In the nano text editor you will see a line that reads

Change this to reflect the number of neopixels that you have; we have 120 in our 2 metre string. Looking further down the text we can see

This means that every neopixel will be at full brightness and will draw a significant amount of current, so let's reduce that to 1/8th of the power by changing it.
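After the edits, the two lines would end up looking something like this (the original default values may differ between library versions):

```python
LED_COUNT      = 120    # number of LED pixels - our 2 metre string has 120
LED_BRIGHTNESS = 32     # was 255; roughly 1/8th power to limit current draw
```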

Save your changes by pressing CTRL + O, press Enter, and then exit by pressing CTRL + X.

With the changes made type the following to test your neopixels.
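Assuming the strand test file name above, the test command would be:

```shell
sudo python strandtest.py
```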


Right now you should be dazzled by an array of colors and animations across the neopixels. When you are happy that everything is working, press CTRL + C to stop the code.

Our last task before continuing is to install the tweepy library for Python. Tweepy is a library that enables Python to use the Twitter API. To install it, open a terminal and type the following.
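The usual install command for tweepy under Python 2 is:

```shell
sudo pip install tweepy
```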

Our next step is to create a Twitter app which will enable our project to communicate with Twitter. To do this, go to the Twitter application management site and sign in.

Once signed in you will see a “Create New App” button in the top right. Click on it and you will see the application screen.


Fill in the name of your application and a description. You can use any URL for the website field, but if you have your own site, put that in there. You can leave the callback URL blank. Lastly, you will need to agree to the terms and conditions before continuing.


You will now see a screen displaying your app. All we need from this screen are our API keys, which are secret, so don't share them. Click on "manage keys and access tokens" and you will be taken to a new screen.


You can already see your API Key and API Secret but we also need an access token. Scroll down the page and you will see the relevant section. Click on “Create my access token”. The page will update to show your access token, again keep this information secret.


Make a note of your API keys and Access tokens as we will need them later.

Our project

So now we can get down to hacking our IoTree (groan) into life. For this project you can use any text editor you wish, but we shall be using IDLE for Python 2, which, with the latest release of Raspbian Jessie, can now access the GPIO pins directly without sudo.

As always we start by importing the libraries that will enable our project.

The main libraries are tweepy for our Twitter API, neopixel for our neopixels and time which is used to control the pace of the project.

Next we create a series of variables to store the API keys and Access Tokens that we created earlier.
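A sketch of those variables, with placeholders where your own keys go (the variable names are an assumption):

```python
# Substitute the values from your Twitter app's "Keys and Access Tokens" page
consumer_key = "YOUR_API_KEY"
consumer_secret = "YOUR_API_SECRET"
access_token = "YOUR_ACCESS_TOKEN"
access_token_secret = "YOUR_ACCESS_TOKEN_SECRET"
```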


We now reuse some of the code from the file that we tested earlier. Again we change the configuration to match the number of LEDs in our neopixel strip and set the brightness so that we do not exceed the max current available from our power supply.
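The configuration block, adapted from the strand test example (the pin, frequency and DMA values are the library's defaults; adjust to your wiring):

```python
# LED strip configuration:
LED_COUNT      = 120      # number of LED pixels in our string
LED_PIN        = 18       # GPIO pin connected to the pixels (must support PWM)
LED_FREQ_HZ    = 800000   # LED signal frequency in hertz (usually 800 kHz)
LED_DMA        = 5        # DMA channel to use for generating the signal
LED_BRIGHTNESS = 32       # 0-255; kept low to stay within the supply's rating
LED_INVERT     = False    # True to invert the signal (transistor level shift)
```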


We are also reusing the colorWipe() function from the example to produce an incremental change, pixel by pixel, along the strip.
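For reference, the colorWipe() function in the strand test example looks essentially like this:

```python
import time

def colorWipe(strip, color, wait_ms=50):
    """Wipe a color across the display, one pixel at a time."""
    for i in range(strip.numPixels()):
        strip.setPixelColor(i, color)
        strip.show()
        time.sleep(wait_ms / 1000.0)
```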

Next we create a class to listen to the stream of tweets on Twitter and act accordingly.

We create an if..elif..elif conditional statement that will check to see if one of three conditions is true. These conditions are the text "#E14XmasProject" followed by a color. If this is present then the color is passed to the colorWipe() function we created earlier. So if we see "#E14XmasProject red", the colorWipe() function is told to use 255,0,0, which is red in the RGB color standard.
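The matching logic can be sketched as a plain function (the function name, lookup table and exact color set here are illustrative assumptions, not code from the original post):

```python
# Map a tweet's text to an (R, G, B) tuple, or None if the hashtag or a
# known color is missing. Matching is case-insensitive.
COLORS = {"red": (255, 0, 0), "green": (0, 255, 0), "blue": (0, 0, 255)}

def tweet_to_rgb(text, hashtag="#e14xmasproject"):
    lowered = text.lower()
    if hashtag not in lowered:
        return None
    for name, rgb in COLORS.items():
        if name in lowered:
            return rgb
    return None
```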

Next we create a function to capture any error messages and display them in the Python shell.

For our penultimate section of code we log in to Twitter using our API key and Access Tokens before setting up our neopixels ready for use via the strip.begin() function.

Our last section of code is used to search Twitter for our hashtag "#E14XmasProject", and this will trigger the neopixels to change color. Of course you may wish to change your hashtag to something more personal, because if we all have the same hashtag then everyone's IoTree will be changed at once…globally!
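Put together, the login and stream setup with tweepy 3.x look roughly like this. This is a sketch: the class name XmasListener and the printed handler bodies are placeholders for the full listener described above, and it will not run without valid keys.

```python
import tweepy

# Placeholders - substitute your own keys and tokens
consumer_key = "YOUR_API_KEY"
consumer_secret = "YOUR_API_SECRET"
access_token = "YOUR_ACCESS_TOKEN"
access_token_secret = "YOUR_ACCESS_TOKEN_SECRET"

class XmasListener(tweepy.StreamListener):
    def on_status(self, status):
        # The real handler checks the tweet text and calls colorWipe()
        print(status.text)

    def on_error(self, status_code):
        # Capture any error codes and display them in the Python shell
        print("Error: %s" % status_code)

# Log in to Twitter with our API key and access tokens
auth = tweepy.OAuthHandler(consumer_key, consumer_secret)
auth.set_access_token(access_token, access_token_secret)

# Watch the stream for our hashtag
stream = tweepy.Stream(auth, XmasListener())
stream.filter(track=["#E14XmasProject"])
```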

So with the code complete, save your project and click on Run >> Run Module to start. Nothing will happen until you send a tweet with your hashtag and color. So using your cell phone, tablet or computer, send a tweet, and your neopixels should turn on and change to the color you specified.

So there we have it, we have created our first IoTree device.


Happy Holidays

Many thanks to James Mitchell for his input relating to the Tweepy stream listener

Last time I had determined that the long ribbon cable was too long to fit into the box.  So, I ordered a GPIO header with longer pins for the sense hat and a smaller ribbon cable with a 40 pin connector on one end and a 26 pin header on the other to plug into the PermaProto board.

In order to make more room for the board and the connector stack, I removed the three screw terminal connector where I had planned to connect the wires from the Fog machine remote. 


2015-11-14 10.38.02.jpg


Using double sided tape, I secured the PermaProto to the bottom of the enclosure.  Then I secured the AC relay to the side wall near where the Fog machine cable will enter the box.


2015-11-14 10.49.08.jpg


Next, I installed the fog machine cable and wired up the relay board to it.  I secured the relay board to the opposite side wall of the enclosure from the AC relay with more double sided tape.


2015-11-15 07.47.19.jpg


I installed the neopixel ring on the outside of the box and used hot glue to secure the ring in place.


2015-11-15 08.12.28.jpg


Then I wired the neopixel wires to the three screw terminal connector.


2015-11-15 08.16.13.jpg


Next, I drilled holes in the enclosure top and secured the Raspberry Pi to it using 4-40 machine screws and nuts.


2015-11-15 08.37.15.jpg


You'll notice that one screw hole was a little off, so I was only able to install three of the screws.


2015-11-15 08.40.14.jpg


I installed the new GPIO header on the sense-hat and plugged it into the Raspberry Pi, then plugged the new ribbon cable onto the pins which extended through the sense-hat board.


2015-11-15 08.43.58.jpg


I closed everything up and tried to connect to the Pi from my laptop.  Then I realized the WiPi was not installed.  So, I had to cut a hole in the side wall of the enclosure for the WiPi to stick through. 


2015-11-15 10.10.04.jpg


I made the hole where the fog machine cable enters the box slightly larger so that I could run the Pi power cable and the speaker cable through the same hole.


2015-11-15 10.32.49.jpg


Now, the Foginator 2000 is assembled and ready to test.  At this point I know all of the parts work because I've tested them individually.  Since the audio amplifier board that I built is dead (the main IC is dead), I decided to use an old amplified speaker from a PC.  All in all, it looks pretty good. 


2015-11-15 10.39.25.jpg


However, when I tested it, I found that when the PWM is running there is a bunch of noise on the audio line.  I thought this might be due to the neo-pixel supply voltage being fed from the Raspberry Pi.  So, I disconnected it and ran the neo-pixel ring and the voltage translator chip off a separate power supply.  Unfortunately, that had no effect.  Looking at the PWM signal on an oscilloscope, I noticed that there's lots of ringing on the edges.  Even with the Raspberry Pi disconnected from the rest of the box, the noise still rides on the audio when the PWM is active.  I did a little research and it looks like there's a drive strength setting on the GPIO outputs on the Raspberry Pi.  So, maybe if I lower the strength, that will reduce the ringing.  I will write more once I figure out how to do that.


Other than that, everything appears to work.  The fog machine switches on when the relay closes and puts out three seconds' worth of fog.  When the PWM is not running the audio is crisp, clear and plenty loud.  The PIR sensor senses motion through the hole in the side wall just fine, and when the speaker is turned off, the neo-pixel displays a rotating rainbow when the PWM is turned on.  So, I am really close to having a working system.  All I need to do is merge all of the code together and I'll be set for next Halloween ;-). 

I finished up my last blog post by installing the relay board on my PermaProto board.

2015-11-07 14.11.58.jpg


Next, I needed to assemble the level shifter board.  I say assemble, but all that's required is to solder the single row headers to the board.


2015-11-03 21.39.04.jpg


Here's what it looks like with the connectors added.


2015-11-03 21.49.55.jpg


I went ahead and soldered the level shifter board on to the PermaProto.  If I were thinking ahead I wouldn't have placed it so close to the connector.  I would later find that my cable no longer fit on the board and I would need to order a different cable.  Planning in a hobby project is just as important as it is in a work project!


2015-11-07 15.14.51.jpg


I really wanted to get the neo-pixels working with the RPi 2.  It didn't seem like it should be that hard.  So, I wired up the neo-pixel ring with wires to connect to my Raspberry Pi (through the level shifter). 


2015-11-03 22.07.21.jpg


I downloaded the Raspberry Pi 2 branch of Jeremy Garff's rpi_ws281x library. 

>git clone https://github.com/jgarff/rpi_ws281x.git

>git pull rpi2

I compiled the code:

>sudo scons

Then I ran the test program

> sudo ./test

and voilà!  It works!


2015-11-07 16.37.30.jpg


Next, I began to prepare the enclosure I purchased for this project.  The provided enclosure was too small because I wanted the RPi to be inside the box.  I cut a notch in one end of the enclosure to fit the over-molded cable that was originally in the fog machine remote.  I crimped solder-less lugs on the cable's wires in order to connect up to an AC relay coil.  Then I soldered on an extra wire to go to the relay board's contacts for switching the fog machine on and off.


2015-11-07 19.07.53.jpg


I purchased a relay with an AC coil that I could use with my fog machine (since it uses AC to light the indicator lamp). 


2015-11-07 19.12.47.jpg


I decided to use double sided tape to mount it inside the enclosure. 

2015-11-07 19.13.06.jpg


Then I added a 10k Ohm pull-up resistor to 3.3V on the PermaProto board.  This will allow me to use the relay to switch a GPIO input on the RPi.


2015-11-07 20.09.12.jpg


Here's a schematic of what this circuit looks like (Don't try this yourself unless you have training in handling high voltages):



Next, I drilled a hole in the side of the enclosure to allow the PIR sensor to "see" the outside world.


2015-11-07 20.41.16.jpg


I thought it would be cool if the neo-pixel ring surrounded the opening for the PIR sensor.  So, I drilled holes on each side of the opening to run the wires.


2015-11-07 20.50.00.jpg


I tested the fit of the PermaProto in the enclosure.  Everything was looking good at this point.  You'll notice here that I added wires with solder-less lugs to connect to the AC relay contacts.


2015-11-07 20.51.32.jpg


I like the way things are looking.


2015-11-07 20.51.46.jpg


However, when I went to add in the RPi/Sense-Hat/cable, there just wasn't enough room. 


2015-11-07 21.25.56.jpg


At this point I've decided to go back to the idea of using the header with extra long pins to connect the sense-hat to the raspberry pi.  I ordered the connector plus a cable which "down-grades" the 40 pin RPi 2 connector to the original 26 pins.  Hopefully, I will be able to connect this cable on top of the sense-hat to run over to the PermaProto.  So, now I'm waiting for parts, again...



Transportation is an essential part of our daily life, especially for college students. Going from point A to point B without any hassle is always a must. However, if you are in a university that has over 66,000 students spread across five different campuses (2,681 acres of land), there will definitely be issues, specifically when it comes to scheduling. Having a PDF copy of the bus schedules is not enough to know when the bus is arriving, because there will always be issues like traffic or mechanical failure as the bus goes along its route.


Buses are an important mode of transportation to college students


Technology has been trying to help alleviate this issue by providing real-time updates on when the bus will be arriving. Buses are now equipped with GPS systems that send data to a server that computes the time a bus arrives at a stop. You can check these times on the internet, through an app, or on an LED matrix screen at the bus stop. This technology gives convenience to students as it tells them if the route is active and what time the next bus will arrive.




Even so, the technology is not perfect. There are times when the algorithm that computes the time of arrival makes errors. You relax because you think your bus is coming in 12 minutes, then you turn around and the screen jumps to "Arriving in 2 minutes." Having the system intelligently compute the schedule would be good if it were made more reliable. Thus, students have made mobile apps that simply display the location of the bus in real time. No computations, no fancy algorithms; it just helps you know where the bus is and whether it's moving fast or slow.


This is good, but some bus stops do not have internet access for students. Also, what if your phone runs out of juice, or you want to conserve its battery? This is why I am proposing a project which solves these issues. With the Raspberry Pi 2 and 7" touchscreen, an Android interface can be developed so that the real-time bus tracking app can be installed. This device can be placed at bus stops alongside the LED screens that post the schedules. Those screens already have internet access, so connecting the Pi 2 is not an issue even at stops without wireless connectivity for students. This device will be called "Where R U?", with RU referring to Rutgers University, my school.


This project is in line with the Halloween Raspberry Pi Build-a-long, from which I got the key materials (Raspberry Pi 2 and touchscreen) for this build.



Element 14 sent me the package containing all the materials needed to build the Raspberry Pi Trivia Candy Dispenser

Thank You Element14

Raspberry Pi Trivia Candy Dispenser:

Raspberry Pi2

Raspberry Pi 7" Touchscreen Display

Amplifier Kit

Mini WiFi Module

4GB SD card

Usb Power Supply

2 Servos

Adafruit Neopixel stick

Adafruit Neopixel RGB

LED's, Resistors, Capacitors


Where R U Build


Not all the parts provided will be used for the build.

Where R U?:

Raspberry Pi2

Raspberry Pi 7" Touchscreen Display

Mini WiFi Module

Usb Power Supply

8GB Micro SD Card

Raspberry Pi 2

This is my first time using a Raspberry Pi 2, since my last device was a Raspberry Pi B+. It is definitely more powerful, as it now has a 900 MHz quad core processor (Broadcom BCM2836 ARMv7) in contrast to the B+'s single core 700 MHz (Broadcom BCM2835 ARMv6). This new version also has the ability to support Windows 10 IoT, which I will fiddle around with in my next builds. Instead of installing the standard Raspberry Pi OS, I will be searching for a means to install an Android OS on the device. This makes it much simpler to install the real-time tracking application. More on the details of this in my next post.

Raspberry Pi 2

Touch Screen

Although touch screen implementations for the Pi are not a new thing, having Raspberry Pi's own brand of touch screen ensures maximum compatibility. I tried using third-party screens before in some of my Raspberry Pi and Arduino builds, and it was definitely a challenge. You need to know the screen's model number and the microcontroller or microprocessor you are using it with. Sometimes these components may not work well together, which makes this a really big gamble. Now, with the official RPi 7" touch screen, makers have the convenience and assurance that it will definitely work on their Raspberry Pi device. I will be working on how to make a neat casing for these two components so that they can withstand the elements. More on the design will be discussed in my next posts.

Raspberry Pi Touch Screen

Project Timeline

Some of the key milestones for this project:

1. Installing the Android OS

2. Installing the real-time bus location app

3. Interfacing the Touch Screen

4. Designing the Enclosure

5. Installation and Test


Each milestone will be discussed in my future posts. Right now, I am already done installing the Android OS and I am currently in the process of testing it. An update will hopefully be up real soon, once I address any issues I encounter.


Thank you for reading and keep making!

OpenCV is an amazing piece of software, and the newer Pis are getting the power to run real time image processing tasks, like face detection.






However, OpenCV is not that user friendly, so let's make an attempt to simplify it a little. Most detection scripts need accompanying files that contain the information to identify what we are searching for in the image we just captured. Luckily, OpenCV comes pre-packed with some of these Haar cascades (the files that tell OpenCV what to look for); they can be found in:


/usr/share/opencv/haarcascades/          [for Pi]



For information on what a Haar cascade is, or if you want to build your own to identify a fish or a banana, see:


>Robotics@Cyborg: How to make your own haar trained ".xml" files


OpenCV comes with default Haar cascades to find:

>Faces [frontal and side]

>Eyes [Right or left]

>Number Plates

>Upper and lower bodies of people

You can find many more with a quick Google search, and you can paste them into the above folder.


So how do we use these to detect faces? Run in the terminal:


$ sudo python <scriptname.py> <path to the Haar cascade you'd like to use>


To run the one in the video,

>Paste the code below into an empty text file, save it with a .py extension, right click it and select "anyone can execute", then:

>Plug a USB webcam into the Pi [the cheap square ones from eBay work well, ~$3]

>Type into terminal:


sudo python <scriptname.py> /usr/share/opencv/haarcascades/haarcascade_frontalface_alt.xml


If you want to exit, click on the video window and press Esc on the keyboard.

## ********** Importing some libraries **************##
import cv2
import sys

## Read the path to the Haar cascade from the command line and load it
cascPath = sys.argv[1]
faceCascade = cv2.CascadeClassifier(cascPath)

# Telling the script to capture from the USB camera
video_capture = cv2.VideoCapture(0)

#************************ Main loop *****************##
while True:

    ## ********** We burn a few frames to make sure we have the newest one [stops lag]
    # Capture frame-by-frame
    for _ in range(5):
        ret, frame = video_capture.read()

    ## ****** Resizing the incoming image (the original script computes this
    ## but carries on using the full-size frame below)
    res = cv2.resize(frame, None, fx=0.5, fy=0.5, interpolation=cv2.INTER_AREA)

    ## ******** Converting to grey scale
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    ## ******** Finding faces; change these values for the size of faces in the camera view
    faces = faceCascade.detectMultiScale(
        gray,
        scaleFactor=1.1,
        minNeighbors=5,
        minSize=(30, 30),
    )

    # Draw a rectangle around the faces
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x+w, y+h), (0, 255, 0), 2)

    # Display the resulting frame
    cv2.imshow('Video', frame)

    # Exit when Esc is pressed
    if cv2.waitKey(33) == 27:
        break

# When everything is done, release the capture
video_capture.release()
cv2.destroyAllWindows()



Checkout the main home page for updates:


Animated_Grim Blog: Home Page

The skull is controlled by three servos, giving it three degrees of freedom. It also has LEDs in the eyes to add a dramatic effect [see the later blogs].


Degrees of freedom:


>Pan [side to side]

>Head Tilt



I'm not going to put up a full wiring diagram because Charles Gantt did it so well in his blog:

Trick or Trivia Halloween Candy Dispenser #004 - Building The Candy Dispenser & Servo Coding



Wire it up to these GPIO pins on a Pi model 2:

SERVOVertical = 26

SERVOHorizontal= 19

SERVOShake= 13
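As a sketch of how those pins could drive the servos with software PWM, the key piece is converting an angle to a duty cycle. The pin numbers come from the post; the 50 Hz frequency and 1-2 ms pulse range are typical hobby-servo assumptions, so check your servo's datasheet:

```python
# Pin assignments from the post
SERVO_VERTICAL = 26
SERVO_HORIZONTAL = 19
SERVO_SHAKE = 13

def angle_to_duty(angle, freq_hz=50, min_ms=1.0, max_ms=2.0):
    """Convert an angle in degrees (0-180) to a PWM duty-cycle percentage,
    assuming a standard 50 Hz hobby servo with a 1-2 ms pulse width."""
    pulse_ms = min_ms + (max_ms - min_ms) * angle / 180.0
    period_ms = 1000.0 / freq_hz
    return 100.0 * pulse_ms / period_ms
```

With RPi.GPIO you would pass the result of angle_to_duty() to a PWM channel's ChangeDutyCycle() on each move.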



Unfortunately I couldn't finish the project before Halloween, because I moved recently and didn't bring the right tools with me. However, I will continue the build and blogs. Unlike my last blog, Step by Step Build Trick or Trivia Halloween Candy Dispenser #4 - LED blink test, GUI Interface, today I will blog the build of the audio amplifier.



Usually the sequence when I manually solder components is from low profile parts to high profile parts. To get a good sense of the soldering sequence, I temporarily placed all parts on the board without soldering.



Start with the low profile parts such as resistors, diodes and ceramic capacitors. Pay attention to the diode's polarity.



Solder the low profile parts and cut off their extra leads.




Next, solder the switch.



Then solder the terminals and the amplifier IC. Pay attention to the orientation of the IC.



Next, solder the volume control rotary switch.



Next, solder the three electrolytic capacitors. Pay attention to their polarity.



Finally, solder the LED. Pay attention to the LED polarity.



The back side of the completed board.


As I went about thinking on how to assemble the circuitry together to make a presentable project, I wasn't too happy with the idea of using the extended pin header for attaching the sensor board.  However, I found that the GPIO connector on the Pi will happily take an old computer disk cable.  Luckily, I had one of these hanging around. 


2015-10-30 11.34.52.jpg

The only gotcha with this technique is that the connector rows end up swapped around if you connect the sensor board with the connector mounted as it comes originally.  However, apparently the creators thought of this.  The connector that comes with the sensor board can be turned around and installed on the top side of the board without any soldering!  In this case, the pins make the right connections with a simple male-male pin header like the one seen off to the right in the picture above.

2015-10-30 15.23.40.jpg

To make things organized and presentable, I decided to use an Adafruit PermaProto board for the original Raspberry Pi that I had laying around.  It doesn't allow for connecting any of the higher numbered pins, but luckily all I need are included.

2015-10-30 12.21.39.jpg

Working with what I had at hand, I installed the leftover single row male pin header to cover the needed pins.  To make things foolproof, I soldered the PIR sensor onto the proto board, with the idea of drilling a hole in the side of the enclosure to allow the PIR to peek through.  To make things fit in the desired enclosure, I cut a few rows off one end of the proto board.


I installed a screw-down type connector on the proto board to handle the connections with the fog machine remote control and added wire to connect the PIR sensor to the Raspberry Pi header.


Finally, I soldered on the relay board that I built before.  At this point I'm wishing I had just waited and installed the relay directly on the main proto board.  However, I might as well go with what I have already. 

2015-10-30 15.08.38.jpg

In the next installment, I will finish up the build of the Foginator 2000. 

Static Display

Wrap up time!


Sadly, after cleaning, testing and checking everything with my old foginator, it started having intermittent problems.  Sometimes it would work; sometimes, nothing.

Not having enough time to get a new one, I changed over from the idea of a relay just for firing the Foginator and built an outlet relay inside a 4-gang box, similar to one in a video I had seen while waiting for parts.  I liked the idea of being able to remotely power the relay as well as having the PIR sensor flip the relay.  Plus, being able to use the box to wrap everything up inside was a nice tie-in.


Hating the idea of the Foginator not being consistent I set up 2 masked heads capable of movement, 2 masked heads static, and a cool Electric Plasma Glass Skull head.

All 4 masked heads are illuminated from within by solar led and one is lit from below by modified solar led that I swapped a UV led into.

The 2 moving heads are given mobility by taking oscillating fans and attaching the heads to the top, with drop cloths on the front to billow when the fan spins up and the head unit moves.  Side note: this drop cloth was not UV reflective; next year, test before assuming.  :-)


As you can see in the above picture Static is pretty cool, but in the bottom video the creepiness went WAY up as the PIR was activated and things started moving!




I have to admit I had been working on this guy for a bit before adding it into the RPi display.  Had to keep busy while waiting for parts and pieces to come in!



This was going to be a static display, with the coolness being how the recessed eyes glow and "follow" you as you move past.  But after a bit of creative thought, the idea of putting it on a standing oscillating fan seemed even more creepy, since it would only move when the PIR sensor was activated.  The billowing plastic below and the UV LED were extra steps for the overall feel.


Complex Head L2L
Complex Head L2R


This head was based on a glow in the dark trick or treat pail and was illuminated by a solar LED from the top and, unseen in these pictures, a solar UV LED underneath.  The pail did not illuminate nearly as well as the balloons I tried on the others.  But it was helpful for being able to put in the recessed eyes for the "following".


Simple NonMoving


Once I decided to move forward with the full display I thought having a couple of non-moving simple heads would be nice.  These are on white balloons illuminated from within by solar LEDs so always lit when dark.


Simple Moving


This was the second moving head.  Not as complex as the other, but still with enough creep to it that my little ones didn't want to go near the "trolls".  :-)

Also based on a white balloon, illuminated from within by Solar LED.  The wire was needed to keep it to the fan and was not obvious from farther away.


Plasma Skull


Plasma skull!



Final implementation in Video Format!


This was an awesome project to work on and I want to give a huge THANK YOU to Element14 for letting me participate!


Very eager to do more, and can't wait to see how everyone else did with their projects.  The element of limited timeline helped the creative process!


My Foginator 2000

Posted by severian Oct 31, 2015

Ralph's Foginator 2000


It all works nicely, as of this afternoon.  I simplified the audio because I did not get my Raspberry Pi audio cable ready.  With as many Raspberry Pi B+ and newer boards as have been sold, I wonder why someone does not build audio cables.  Ideally, it would be a short cable with the weird Raspberry Pi TRRS connector on one end and a female 3.5 mm stereo audio jack on the other end.  Then, you could plug cheap computer speakers in.  I mount Raspberry Pis on the back of LCD monitors.  These often have a 3.5 mm stereo jack built in.  Oh, well.


So, I used the network to play the audio.  When the neopixel display lights up, a command is sent through ssh to a desktop Linux machine to play the sound file Charles Gantt calls correct.mp3.  I like the effect of it coming from the next room.  This project has been fun and I look forward to starting a new one.  I'd like to build a movement tracker using the accelerometer, magnetometer, and gyroscope in the Sense Hat.  I have played with those a bit and I see I need to do some research on smoothing that data to make it useful.  Everyone have a happy All Saints Day (Nov 1).

Well, Halloween is basically over and it was a successful letdown. It poured rain tonight; we had about 6 kids. The kids liked the candy dispenser but loved the Wishing Hell. The dispenser had to be set up inside our (small) doorway, while the Wishing Hell was mostly covered and sitting outside in the rain. The kids said this is what brought them to our home, but due to the rain they didn't all want to try the dispenser. The main reason for this challenge is for the kids to have fun, which made it successful.


We want to thank all the sponsors, Element14 and everyone involved for making this possible.


Pictures to be uploaded in the next few days


Dale and Chrystal Winhold
