
STEM Academy

20 Posts authored by: mikedavis    

A copy of this blog post can be found on my own blog (Faradaysclub.com).

As you all probably know, the Raspberry Pi turned 4 on February 29th, and this was celebrated with the release of the Raspberry Pi 3.  What’s new?  Well the big things are:

  • Built in wireless access (no more using 25% of your available USB ports for WiFi).
  • Built in Bluetooth
  • 64 Bit Cortex A-53 processor
  • 1GB of memory
  • Still just $35  (not really new)

I decided I would try to play with one of the new features and make a simple WiFi detector.  I am a big fan of the Best American Non-Required Reading series, which used to have a front section that included some of the most creative WiFi names around.  I was certain that there would be some fun names in my neighborhood, so I thought I would use my Pi3 as an SSID ‘sniffer’.

Since the WiFi is built in to the Pi3, I can strip this project down to just the Pi and a 5V cell phone charger.

WiFi_5

There are quite a few examples of these out there, and they vary in complexity.  The one I worked on uses the built in WiFi to scan all available networks every 10 seconds and record them in a .csv file.  My thought was that I would put this in my backpack and walk around the neighborhood.

The first thing to do is try this command:

sudo iwlist wlan0 scan

You should see a long list of things come back.  The command asks the Pi to scan for wireless signals and report back on everything it finds.  I am most interested in the SSIDs (shown below as ESSID:"coyote").

WiFi_2

Once that worked, I found and modified a python program that outputs all of the SSIDs.

I created a folder called ‘wifi’ and a file called ‘test.py’.

mkdir wifi
cd wifi
nano test.py

The code below gets copied into test.py.

from subprocess import check_output

scanoutput = check_output(["iwlist", "wlan0", "scan"])

for line in scanoutput.split():
    if line.startswith("ESSID"):
        line = line[7:-1]
        print line

The program is really simple.  First it imports the subprocess library.  Then it uses the check_output command and stores the results in a variable called scanoutput.  There are lots of lines in scanoutput, so we split them out individually and look for ones that startswith “ESSID”.  We strip those lines to get rid of the first 7 characters (ESSID:”) and the last character (“), and then print it.
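To see the slicing step in isolation, here is a quick check in Python 3 syntax (the SSID is made up):

```python
# One sample line of the form iwlist prints.
sample = 'ESSID:"coyote"'

# Drop the 7-character prefix ESSID:" and the trailing quote.
name = sample[7:-1]
print(name)  # coyote
```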

If you see a short list of networks, then it's working.

WiFi_3

That is the simple version of the program.  What I would like to do now is have that run in a loop every ten seconds and spit the values out into a .csv file.  This complicates the program a little bit, but the main parts are still there, and we can figure everything else out.

from subprocess import check_output 
import csv
import time 

#get the current time to name a file later on
timestamp = time.strftime("%m-%d-%Y-%I:%M:%S") 

#define a function for doing a wifi scan
def wifiscan(): 
  ssid = []
  scanoutput = check_output(["iwlist", "wlan0", "scan"])
  curtime = time.strftime("%I:%M:%S")
  ssid.append(curtime) 

  for line in scanoutput.split():
    line=str(line)
    if line.startswith("ESSID"):
       line=line[7:-1]
       ssid.append(line)
  with open('/home/pi/wifi/'+timestamp+'.csv', 'a') as csvfile:
    csvwriter = csv.writer(csvfile, delimiter=',')
    csvwriter.writerow(ssid)
  print ssid

while True: #repeat the program in an infinite loop 
  ssid=[] #clear the list
  wifiscan() 
  time.sleep(10)

This program works just like the previous one, but it has some enhancements.  My python skills are still pretty weak, and I was having a tough time getting all of the networks to show up instead of just the first one, so I started saving everything in a list called ssid.  It starts empty, and then gets populated with the current time (curtime) and the SSIDs.  The program then uses csvwriter to append to a file that is named with another date stamp, captured just once when the program starts.  Each batch of captured SSIDs becomes a row in the CSV file, and the first cell in each row is the time stamp.  The infinite loop clears ssid, runs wifiscan, waits 10 seconds, and starts over again.
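The csv.writer step can be sketched on its own.  Here the row data is invented and written to an in-memory buffer instead of a real file (Python 3 syntax):

```python
import csv
import io

# A row as wifiscan() builds it: timestamp first, then the SSIDs found.
ssid = ["09:15:02", "coyote", "linksys"]

buf = io.StringIO()  # stand-in for the open .csv file
csvwriter = csv.writer(buf, delimiter=',')
csvwriter.writerow(ssid)
print(buf.getvalue().strip())  # 09:15:02,coyote,linksys
```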

How does it look?

WiFi_4

It works pretty well.  I am really only interested in the names, and the number of names for each scan.  The iwlist command gives a lot more information like signal strength, frequency, and MAC Address.  I live in a pretty densely populated place, so I am curious about how many of these things I can detect as I walk to the coffee shop, or drive to work.
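Since each CSV row starts with a timestamp followed by the SSIDs, the number of names per scan is just the row length minus one.  A sketch with made-up rows (Python 3 syntax):

```python
import csv
import io

# Two fake scan rows: timestamp in the first cell, SSIDs after it.
data = "09:15:02,coyote,linksys,xfinitywifi\n09:15:12,coyote,linksys\n"

for row in csv.reader(io.StringIO(data)):
    print(row[0], len(row) - 1)  # timestamp, number of networks seen
```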

I can visualize it like this by doing a little work in Excel.

WiFi_5

An obvious next step would be to GeoCode this information through the use of a GPS HAT.  As I go through various neighborhoods it would be interesting to see how connectivity changes.

Enjoy!

I was as excited as anyone else when the Raspberry Pi Zero (RPi-0) was announced over Thanksgiving.  This $5 computer is 86% cheaper than the Raspberry Pi 2 which is already a great deal at $35.  I thought I would try it out as a temperature data logger.  In the end, it worked beautifully, and I hope to build some more probes for my students to use.

 

Raspberry Pi Zero Package


Additional Hardware

To run the RPi-0, some additional hardware is needed:

Operating System

A more up-to-date operating system is needed for the RPi-0.  Raspbian-Jessie or later is recommended.

 

You can get the image here: Download Raspbian-Jessie Image.  You can then use a program (like Win32 Disk Imager) to write the image to an SD card.

 

windisk.jpg

 

The image is a little more than 1 GB zipped and a little over 3 GB unzipped, so you will need some space on a local machine while you are working with it.

 

Booting Up

Once the operating system is loaded, you can plug the SD card into the RPi-0.  It doesn't click in like it does on the RPi-2 (you have to get to $5 somehow).  Plug in power, and look for an intermittently blinking green light to see that all is working (the red LED is also gone).

Attach Pins to the GPIO

Another thing that had to go was the pins on the GPIO.  The RPi-0 can still interact with the real world, you will just need to attach some pins.  I did this with a new Holiday Bundle from Circuit Specialists (video below).

 

 

 

I used the soldering kit to attach 4 header pins to the top of the Raspberry Pi.  You can see the attached pins on the top right of the RPi-0 here.  I used regular headers, and just pulled the ones I didn't need out with a pair of pliers.  I found that this was easier than just making the single connections I needed. We will need three connections.

 

IMG_20151228_121551382.jpg

 

  • Power - 3.3v
  • Data - GPIO 4
  • Ground

 

The GPIO layout is the same as the 40 pin layout for the other Raspberry Pi models.  The connections that are needed are labeled here with arrows.

 

GPIO-labels.png

 

Again I found it easier to attach four headers and just pull out the ones I didn't need before soldering.

Make the Circuit

To set up the temperature sensing circuit you will need:

 

There is a 4.7k resistor between data and power.  This tutorial from Adafruit is wonderful for setting things up.  Also, you can do this with a solderless breadboard.  I wanted something a little more robust, and went with soldered connections. When it is soldered together, it should look like this:

 

rsz_img_20151228_123326440.jpg

rsz_img_20151228_123401911.jpg

 

The female ends can be attached to the pins on the Raspberry Pi.  The connections are:

  • Probe Red - 3.3V
  • Probe Yellow - GPIO4
  • Probe Blue - Ground

Now the circuit is complete and ready to be tested.

Testing the Circuit

In the RPi terminal, type:

sudo nano /boot/config.txt

Go to the bottom of the file, and add this line:

dtoverlay=w1-gpio

 

dtoverlay


Close this by hitting Control-X and then Y.  Reboot the Raspberry Pi. Once it has rebooted, type the following lines into the command prompt:

sudo modprobe w1-gpio
sudo modprobe w1-therm

 

This activates the sensor on the GPIO.  Now we need to find it.

cd /sys/bus/w1/devices/
ls

 

sensor

 

The circled number here is the identity of the temperature sensor, like its serial number.  We need to go into that directory, and tell it to take a temperature.

cd 28*
cat w1_slave

 

temp

 

The readout will contain the temperature in units of milli-degrees Celsius; look for the value after t=.  If you have seen all this, your temperature probe is working.  Well done!  If not, you might see nothing when you changed directory into /sys/bus/w1/devices/.  Chances are the wiring and/or soldering did not go right, and that would be the first thing to check.
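The parsing that the program does later can be tried on a captured readout; the two lines below are sample text standing in for the real w1_slave file (Python 3 syntax):

```python
# Sample w1_slave output: line 1 ends in YES when the CRC check passed,
# line 2 carries the reading in milli-degrees Celsius after 't='.
lines = ["72 01 4b 46 7f ff 0e 10 57 : crc=57 YES",
         "72 01 4b 46 7f ff 0e 10 57 t=23125"]

if lines[0].strip().endswith("YES"):
    pos = lines[1].find("t=")
    temp_c = float(lines[1][pos + 2:]) / 1000.0
    print(temp_c)  # 23.125
```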


Working with ThingSpeak

 

Go to ThingSpeak.com and create an account if you don’t already have one.  Once created, you will need to create a channel.  When you set this up, you can give it any title you want.  I have it filled out for my channel below.  Also, there is a box you can check that will make it public.  If you do, the public can see your graph (but not your account information).

 

BMP13

 

Go to the bottom and click on Create Channel.  Then go to the Data Import/Export tab.  On the right-hand side, there will be an Update Channel Feed GET box.  It contains a URL that you are going to put into the python program below.  I am not showing the full one here, because it contains an API key that is unique to my channel.  You will need to copy that URL.
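The program appends a field value to that URL for each reading.  The construction looks like this (the key shown is a placeholder, not a real one):

```python
# The channel URL with a placeholder API key, plus one field value.
baseURL = "https://api.thingspeak.com/update?api_key=YOURAPIKEY"
tempin = 23.1
url = baseURL + "&field1=%s" % (tempin)
print(url)  # https://api.thingspeak.com/update?api_key=YOURAPIKEY&field1=23.1
```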


Make the Program

 

Now we want a program that will log the data from your sensor directly to the Internet using ThingSpeak.  The program is broken up into a few parts:

  1. Libraries to import
  2. Functions for sensor readings
  3. Value Writing

In short, this program uses the 'os' library to give commands to the operating system.  The Raspberry Pi will return the temperature in the 2-line response we saw before.  Then the program finds the temperature and reports it to ThingSpeak, where it can be plotted.

 

import os
import glob
import time
import sys
import datetime
import urllib2

baseURL = "https://api.thingspeak.com/update?api_key=YOURAPIKEY"

#initiate the temperature sensor
os.system('modprobe w1-gpio')
os.system('modprobe w1-therm')

#set up the location of the sensor in the system
base_dir = '/sys/bus/w1/devices/'
device_folder = glob.glob(base_dir + '28*')[0]
device_file = device_folder + '/w1_slave'

def read_temp_raw(): #a function that grabs the raw temperature data from the sensor
    f = open(device_file, 'r')
    lines = f.readlines()
    f.close()
    return lines

def read_temp(): #a function that checks that the connection was good and strips out the temperature
    lines = read_temp_raw()
    while lines[0].strip()[-3:] != 'YES':
        time.sleep(0.2)
        lines = read_temp_raw()
    equals_pos = lines[1].find('t=')
    if equals_pos != -1:
        temp_string = lines[1][equals_pos+2:]
        temp_c = float(temp_string)/1000.0
        temp_f = temp_c * 9.0/5.0 + 32.0
        return temp_c

while True: #infinite loop
    tempin = read_temp() #get the temp
    values = [datetime.datetime.now(), tempin]
    g = urllib2.urlopen(baseURL + "&field1=%s" % (tempin))
    time.sleep(60)

 

This is the program, and it works pretty well.  The plot below shows the temperature in my kitchen.  (If it's really high, that is ok.  I am doing some experiments on how well containers retain heat.)

I have been working on a sunrise project that is heavily inspired by Ken Murphy's amazing History of the Sky project.  I loved what he did, and I decided to use a Raspberry Pi to do something similar using the Chicago skyline and our daily sunrise.  I wasn't completely sure how to make this project work, and I still have some unanswered questions.  Things are coming along nicely, though.  I thought that I would post my first 30 days of sunrises here for you to see.  You can also link to it here (https://youtu.be/ixQvy3rIkEU).

 

 

 

 

My colleagues at DePaul let me set up a camera.  My 'weatherproof' box is really a plastic sandwich box from Target.  I use a Dropbox uploader to send all of my movies to my Dropbox account every day, and I use another Raspberry Pi to download them, break them into stills, and batch resize them.  I then use Imagemagick to make a tiled 'montage', and that gets compressed into another movie.

 

I hope you enjoy it!  I have instructions on this process that will follow soon!

On Tuesday, June 16th students from the City Colleges of Chicago participated in their first balloon launch.  This is supported by a grant from NASA and the Illinois Space Grant.  With this award we support students with scholarships and stipends as they design, build, and launch experiments to be conducted over 90,000 feet above the earth.

 

Weather-wise this was a perfect day.  It was nested between two days of torrential rain in northern Illinois.  Our flight predictions had us launching from Lexington, IL and landing just west of I-57.

 

Landing Prediction
Prediction of where the balloon will land, based on our weather and weight.

 

We launched from Lexington, IL, which is just a little north of Bloomington.  The students were in high spirits as we drove down and started preparing for the launch.  A lot of work goes into a flight, and much of it boils down to a couple hours of activity where we set up our radio and satellite trackers, troubleshoot our experiments, and secure all of the payloads to the balloon.  Once that is done, we are ready to start filling the balloon with helium.

 

balloon filling
The balloon is made of a natural rubber, and we do our best to keep the oils from our fingers off of it.

 

When it is full of gas, the balloon is pretty big, and it is doing its best to start rising.  When it is full and ready to go, Heather is pulling down on the balloon with about 15 pounds of force to keep it on the ground.

 

pre-launch
It takes about 15 pounds of force to keep this balloon on the ground with us.

 

With a balloon full of helium and favorable weather, we are ready for a launch.  We slowly start to release the balloon and start feeding the payload up along with it.  Each of our payloads is separated with about 6 feet of mason's line and attached with swivel clips (from a fishing store).

 

The balloon is ready to go to near space, and has enough lift to carry our payloads.

 

Within a few seconds it is more than 100 feet in the air.  After a few minutes, we lose sight of it entirely.  If we did our math correctly, it will go above 90,000 feet in about 90 minutes, and land near I-57.

We had a Raspberry Pi camera taking pictures every 10 seconds, and we got some beautiful pictures on the way up.

 

frame437
A view from a couple miles above Lexington, IL.

 

 

 

Much higher above central Illinois.  Notice the blackness of space, the light blue of the atmosphere, and the slight curve of the earth.

 

After the balloon is out of our hands, it is time to pack up, and hit the road.  We loaded all of our launch materials into the vans, and start off along the flight path.  After 90 minutes the balloon popped and we started driving around the area where we projected it to land.  One of our vans was close enough to see it land!  It was close to the edge of a field, and we were able to retrieve it.

 

The balloon landed about one mile into a soy field in Piper City.

 

The flight was a success on a number of levels.  We got more than 2000 pictures, temperature data, pressure data, and some speed of sound data (not to mention the balloon itself).  Our next launch will be in early July and we will have more to share on our experiments.

As many of you know, I am working with my colleagues at the City Colleges of Chicago and DePaul University on some high altitude balloon (HAB) launches.  This activity has been funded by NASA, and we are really excited to do some launches and share our findings.  The STEM Academy is a great resource as we build our experiments.  Today's post, however, is about launching and retrieving a balloon.

 

We live in Chicago, which has two airports, lots of people, buildings, and a huge lake to the east.  The Federal Aviation Administration (FAA) has some guidelines on balloon launches, and one of them is that we cannot launch a balloon within 5 miles of a commercial airport.  Since what we send up also comes down, we don't want to get our balloon out of the lake or off the top of a very tall building.  In short, the city of Chicago is a terrible launch site.

 

So what is a good launch site?

 

That actually has more to do with the landing site.  If we know a few things, like the total mass of the payload, the amount of lift, and the type of balloon, we can make a pretty good prediction about where it will come down.  Ideally it would come down in a field, away from a populated area.  A lot of central Illinois has soy and corn fields, which makes a lot of central Illinois an ideal launch and retrieval site.

 

Making a Prediction

 

We use the HABHUB site for making predictions about the landing site of our balloons.  It asks us for:

  • Launch site (including latitude and longitude)
  • Launch altitude
  • Launch time (in UTC)
  • Launch date
  • Ascent rate
  • Burst altitude
  • Descent rate

 

launch site

 

Here I searched the map for Lexington, IL (a place we commonly launch from) and found its elevation using Google.  The time and date are easy as well.  Predictions are also very useful for trying to determine things a day or two in advance.  Since our vertical flight goes through the jet stream, we can't really count on anything that is more than a couple days away from the actual launch date.  In short, we become more certain about things the closer we get to the launch date and time.

 

burst calculator

 

Above you can see the burst calculator for a 2000 g balloon made by Kaymont.  The target burst altitude or target ascent rate are things that we would try to reach.  I calculated the ascent rate using another calculator found here.  The result is in feet/min, which we convert to meters/sec for this calculator.  As you can imagine, anyone that teaches dimensional analysis or unit conversion would have a field day with this stuff.  There are all kinds of conversions that need to be done.
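The feet/min to meters/sec step is exactly that kind of conversion.  A quick check with an example value (the 1000 ft/min ascent rate is made up, not from one of our flights):

```python
# Convert an ascent rate from feet per minute to meters per second.
ft_per_min = 1000.0
m_per_s = ft_per_min * 0.3048 / 60.0  # 1 ft = 0.3048 m, 1 min = 60 s
print(round(m_per_s, 2))  # 5.08
```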

 

With this information, we can get a prediction for this flight.

 

landing prediction

 

This zig zag motion is pretty common for our flights.  Winds close to the earth and the jet stream will grab the balloon and take it north and east.  Once it emerges from the stream, it will head back west.  After it bursts, and comes back down, it will come east again.  In the end we should expect to collect the balloon close to the I-57 expressway.  When I switch over to Google Maps, I can see that this is in Danforth, IL, and in a farming area.

 

landing site

 

The landing site is pretty good.  Right now, I have no idea what is growing in the field (soy, being short, is preferable to corn).  But I can see that it is not near a populated area, it's not near a river, and it is more than 5 miles from the expressway.  We wouldn't want to come down on a busy road.  Depending on some other factors we may choose to launch a little further to the west so we are further away from the expressway, but for now this looks good.

 

Students

 

This is a good place to involve students.  In the days leading up to the launch, we know things like the total weight of our payload (including the parachute), the type of balloon we are using, and how high we would like to go.  At this point our students can become weather experts and start tracking the jet stream.  We use these sites for weather and jet stream information.

 

 

We will ask our students to do a few predictions with slight changes in payload, ascent rate, and burst altitude to see what effect each has on the landing site.  When it comes time to pack up the van and go, we want to be sure that our students are very confident of where things will be going.

 

Next Post

 

In my next post, I plan to write about tracking a balloon while it is in flight.  Assuming this goes well, I will also probably write about how our launch went.

With the school year behind me, I finally have some time to work on projects that I have been dreaming up all year.  The abundance of free time also means that I am saying 'yes' to a lot of things, and I will soon start to run out of time again.  Regardless, I will complete at least one project this summer!

 

One thing I wanted to do was re-create Ken Murphy's beautiful History of the Sky.  In this project, he put a camera on top of the Exploratorium in San Francisco, took a picture of the sky every 10 seconds for a year, and put all of the time lapse movies together in one panel.

 

 

 

What I love about this project is that it really captures how sunrise/sunset changes with the seasons.  This gives teachers an opportunity to talk about the seasons from a planetary perspective.  It turns out that people have some very strongly held misconceptions when it comes to the reason for the seasons.

 

Now that I have a little practice with the Raspberry Pi and cron, I can see how something like this might work.  I would need:

 

  • A program to start my camera every day at the same time, and let it run until a certain time.
  • A program to turn those pictures into a movie file, and put that into a Dropbox account.
  • A program to remove the pictures each night, and start again.

 

I would like to recreate this in Chicago with a clear view of the skyline, facing east.  I have a school that is ready to participate, and I (more or less) have the code necessary to do it.  As I am doing this, I am running into some predictable problems.  I think I see solutions to them, which is encouraging.  My hope is to start my project by June 1st, and of course, I will keep the community posted with what is going on.

 

If there is anything you'd all suggest, I would be more than happy to hear from you!

My college recently received a grant from NASA that allows us to pursue high altitude balloon projects.  I have written about this before, and asked about some potential ideas for experiments and sensors.

 

Today, in Lexington, IL we launched a balloon with some payloads on it.  This was our first launch, so our payload consisted of a pair of Raspberry Pi cameras, and an Arduino with the BMP Pressure/Temperature/Altitude Sensor.  Prior to the launch I tested the BMP sensor by taking it for a ride to the top of the Sears Tower. 

 

sears_tower.jpg

The sensor worked.  The shape of the curve is what I would expect, though the actual altitude values surprised me.  I am going to have to look into how that is calculated.  At any rate, the circuit worked, and one of my students built in a red light / green light system to let us know if things are working.  After one test, everything looked great.

 

A storm was brewing in Lexington, so we had to hustle to get our balloon out in front of it.  We used a 1600g balloon with about 14 pounds of lift for roughly 8 pounds of payload.  (The FAA limits us to 12 pounds overall.) 

 

ballooning.jpg

 

The sky behind us was getting pretty dark. 

 

Our materials were packed into a foam box, held in place with cable ties.

 

rsz_balloon_payload.jpg

Once we did the pre-flight check, and we were convinced everything was working, we let the balloon fly.  The pictures below are just a few of the more than 2000 that our two Pi Cameras took at 10 second intervals. 

 

frame0411.jpg

frame0490.jpg

frame0755.jpg

 

The balloon burst at an altitude of about 90,000 feet.  In later pictures I can see the turbulence, with a brief free fall before our parachute deployed.  Shortly after that, both cameras failed.  I think it might have something to do with the fact that the cameras were on the outside of the payload box, and it descended through a storm.  When it landed, both RPis were still on, but the cameras weren't taking pictures.

 

This was a tremendously fun project.  I have about six more planned over the summer, and we have more experiments to do.  We have some things to revise, but this was a great start.  Any guidance or suggestions from the community would be most welcome!

It has been an exciting few months.  Don't let the lack of blog posts fool you.  I have honestly just been too busy to write anything.

 

Earlier this year, the City Colleges of Chicago received a grant from NASA to do some high-altitude ballooning (HAB).  The vast majority of the money we receive will support undergraduates with stipends so they can build and design experiments that will be conducted 30,000 to 90,000 feet above the earth.  We have our first crop of students, and they are ready to get moving!

 

Prior to any of our materials arriving, I dug out some Raspberry Pis and walked them through the construction of a temperature sensor. I thought this was a good place to begin, and it seemed relevant.  No matter what kind of experiments they want to do, they will most likely collect information on temperature and pressure.  We used the DS18B20 sensors, and the gspread library to keep track of our temperatures.  The big 'ah-ha' for the students and faculty was that the small device here was taking measurements and reporting them to another computer.  It was very exciting.

 

I am now looking for some experiments that we can do in a weather balloon.  Here are the limitations:

 

  • Total weight, including balloon, parachute, and tracking equipment, must be under 12 pounds.
  • The balloon will be exposed to temperatures as low as -70 C.
  • The balloon will be exposed to pressures as low as 0.02 atm (basically 1/50 of the pressure we feel on the surface of earth).
  • The entire balloon flight is typically around 3-5 hours depending on weather and other conditions.

 

Things that we will require are:

  • Sensors that can record to some kind of permanent memory (like an SD card or external drive).
  • Sensors that can withstand the conditions described above.
  • Experiments that can survive a 17 mile fall back to earth (it should be robust).  We have a parachute, and that slows it down nicely.
  • Inexpensive
  • Light-weight

 

I am very familiar with things like the Raspberry Pi and Arduino, and I feel pretty confident that I could find a way to make something work.  My students show a lot of aptitude as well, so they will most likely lead in this area.  We like the open source electronics because they are inexpensive and light weight.  It also seems like any kind of sensor we could be interested in would be able to work with Pis or Arduinos. 

 

I am interested in doing some radio spectrum work and doing some speed of sound measurements as the balloon goes up.

 

If any of the engineers on this forum would like to weigh in, point us towards resources, or flat out give some help, I would be most appreciative.  Scanning the internet shows me that there is a lot of information out there.  I think this community and this forum does a good job of cultivating that information and distilling it into something that is really useful.

 

Our first launch will be April 24th.  More news to follow!

A while ago, I posted an instruction sheet for temperature sensing with the waterproof DS18B20 probe.  At the time, I was really interested in how quickly my coffee would cool off depending on the kind of cup I was using.

 

The program I used involved the gspread library for python.  My problem was that I had an easy time getting temperatures, but a hard time getting them into some kind of file where I could save them and do some work later.  Finding gspread was excellent, because it did two things for me.  First, it saved my temperatures into a Google Spreadsheet that I could access from anywhere in the world.  That was very handy.  Second, it gave me the confidence that the probe was actually working.  If I checked and saw a recent temperature, I knew things were working.

 

My problem, as I knew (and as pointed out by @Charles Turner), was that if the program ever hung up, an infinite loop would take over, and that would be the end of the temperature sensing and logging.

 

It was recommended that I make the program something that would execute just once, and then have crontab make it a repetitive action.  So now, my program executes every two minutes and logs a temperature into my Google Spreadsheet.  I have been using it to monitor the temperature in my classroom, and I have posted it to our class website (Mike Davis Chemistry).
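For reference, an every-two-minutes crontab entry looks something like this (the script path here is just an illustration, not my actual file name):

```shell
# m   h   dom mon dow  command
*/2   *   *   *   *    /usr/bin/python /home/pi/temperature/templogger.py
```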

 

I will do a more detailed blog post and set of instructions very soon.  In the meantime, I was just so pleased with how well this worked, that I wanted to share it.  I also wanted to thank the community for helping me get over a problem.

 

Check out the temperature of room 3831 at Truman College here.  (https://docs.google.com/spreadsheets/d/1mccAri0TsLIVCXKzzrx8lxO7JXAAdm6i6FmboO75wjE/edit?usp=sharing) or follow it here.

 

 

Not that long ago, I got a Pi camera, and I quickly became excited about all of the wonderful time lapse projects I could attempt.  I mean, the script is so simple and easy to understand, how could anyone not be interested in making time lapse movies?  My one struggle was looking for things that change dramatically but slowly.

 

Most of the science demonstrations I do change quickly and dramatically.  I do these demonstrations live for audiences of 200 or more at a time.  They like explosions and so do I.  Recently, I attended an incredible conference called ORD Camp.  I did a presentation on science for large audiences (including fire).  I did a simple exploding powder demonstration and someone captured it on their iPhone at 240 frames per second.  The video below shows what we got.

 

 

This got me thinking about my Pi camera.  Could it capture video with a high frame rate?  It turns out that a new camera mode using raspivid can capture 90 frames per second.  This seemed like a decent place to start. 
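I haven't shown the exact command I used, but a raspivid invocation along these lines captures at the high frame rate; the resolution, duration, and filename here are my own example choices:

```shell
# 10 seconds (-t is in milliseconds) of 640x480 video at 90 fps
raspivid -w 640 -h 480 -fps 90 -t 10000 -o whoosh.h264
```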

 

I was recently given a Pi NoIR camera.  When I took my first selfie, I was expecting a cool looking night-vision kind of photo.  What I got was a regular photo, and I later learned that NoIR literally means that this is a normal camera that just doesn't have an IR filter (thus the 'No IR' name).  It's amazing that I am allowed to teach.

 

At any rate, I figured out that I needed an IR source that the camera could detect.  So I grabbed the DVD remote, and did some scary face pictures.

 

rsz_irtest.jpg

 

Seeing that the camera worked, I brought it to the lab to capture one of my most popular demonstrations, the Whoosh Bottle.  In this demonstration, a small amount of alcohol is poured into a 5 gallon water jug and allowed to evaporate.  When I ignite the fumes, a large blue flame erupts from the top with a loud whooshing sound.  The video below was captured with the Raspberry Pi NoIR camera at 90 frames per second.  It speaks for itself, and I could not be happier with the results.

 

 

This is so cool, and it opens up a whole new world of projects for me.  I do several explosion demonstrations, and I think I found the perfect set up for capturing them.

The Raspberry Pi Camera is surprisingly good and easy to use.  I am very happy with how simple it is to take pictures and use other free software to make time lapse movies with it.

 

During this winter break, I asked the Chicago Children's Museum at Navy Pier if I could use their network and balcony to take some pictures.  They were more than happy to oblige, and I was out there on December 29th getting some photos.  It was about -6C at the time, so I didn't spend a lot of time composing the shot.  After getting a practice image or two:

 

rsz_sky0002.jpg

 

I was ready to set the camera to take a whole bunch of pictures.  While I wasn't going for a well composed shot (again, it was -6C and I wasn't wearing gloves), I knew that I wanted something in the frame that wasn't going to move.  In the movie below, you will see that there is some drift, so a second flagpole gets into the shot. 

 

The command is pretty simple:

 

raspistill -vf -hf -o /home/pi/camera/sky%04d.jpg -tl 60000 -t 18000000

 

This will take an image every 60000 milliseconds for a total of 18000000 milliseconds (5 hours).  All of it gets saved to the camera folder in my home directory.  The "%04d" means that they will be numbered sequentially (sky0001, sky0002, etc) as they go.  That makes it easy to sequence them later on.  I let it sit out for five hours.
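The arithmetic behind those two numbers is easy to sanity-check:

```python
# Sanity-check the raspistill timing arguments used above.
interval_ms = 60000        # -tl: one frame per minute
total_ms = 18000000        # -t: total run time

hours = total_ms / 1000 / 3600
frames = total_ms // interval_ms

print(hours)   # 5.0 hours
print(frames)  # 300 images over the run
```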

 

After that, I collected the Raspberry Pi from the museum and took it home to see what I got.  I found the easiest and fastest way to get these off the device was to use WinSCP over SFTP.  That allowed me to transfer the images off the RPi and on to my laptop.

 

From there, I used Cineform from GoPro to make the time lapse.  There are probably better programs out there for doing this, but I found this one to be very easy to use.  There is a three step process for making the time lapse video, and it results in an mp4 file that can be uploaded to YouTube.

 

 

All in all, it was very easy and satisfying to do this project. 

 

There are some things I would like to do once the weather gets a little better (April-ish). 

  • Take time-lapse photos of the sky against a static sky line view for several days.  I saw a great project done in San Francisco, and I would like to re-create it in Chicago.
  • Take a picture of the sun every day at noon, against a static background.  Over the course of a year, it should trace out a figure 8 (an analemma).  The shape of that figure changes based on the distance from the equator, so I thought it would be neat to get several different latitudes (every 10 or 20 degrees or so) to do this. 

 

Finding things that change slowly but dramatically is a challenge.  It looks like some of the Raspberry Pi road testers, like nbizzell, are off to a really good start!

 

I hope you are all having a good new year!

About a year ago, I got my first Raspberry Pi, and sometime in the past year, I have come to understand it as a powerful computer that I can dedicate to a specific task.  A couple weeks ago I got the Raspberry Pi Camera Board, a simple camera that...well...takes pictures.

 

The obvious thing to try is time lapse photography, since I know I can make the RPi take a lot of pictures and another piece of software can stitch them into a movie.  Here is the method I used.  There are lots of ways to do this; I am just showing how I was able to do it in a short period of time.

 

Connect the Camera

 

This was pretty straightforward, and there is a tutorial for this here.

 

Take a Picture (probably a selfie)

 

No additional software is needed to do this.  You just need a simple command:

 

raspistill -o picture.jpg

 

This captures a picture and stores it on the Pi.  If you want to specify a location for it then you can put that in there (/home/pi/camera/picture.jpg).  I am a folders kind of guy.

 

Pi_selfie.jpg

 

Now you can set up for your actual experiment.  This is actually the hard part.  You want to find something that changes dramatically, but does it very slowly.  Obvious things include clouds, traffic, etc.  I didn't have access to a good view or anything I would consider to be weather-proof.  So I went with lettuce leaves changing their color when placed in a small amount of food coloring.  Here they are at the start of my experiment.

rsz_lettuce_0001.jpg

 

Over time, through capillary action and some other processes, these lettuce leaves should absorb the food coloring in the beakers.  So now I am ready to get it moving.

 

raspistill -o /home/pi/camera/lettuce/lettuce_%04d.jpg -tl 60000 -t 43200000 &

 

This command has a few basic parts:

  • raspistill - the name of the program I am calling up
  • -o /home/pi/camera/lettuce/lettuce_%04d.jpg - an output file to a specific folder.  This command will automatically make a new file with a four digit number.  So the first file will be lettuce_0001.jpg.  The next will be lettuce_0002.jpg and so on.  This is very handy when it comes time to sequence them.
  • -tl 60000 (this is the letters 't' and 'l'.  I tried the number '1' and a capital 'I' and failed both times).  This is an interval.  The camera will take a picture every 60000 milliseconds, or 60 seconds.
  • -t 43200000 - the total time for this experiment.  It will run for 12 hours.
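The %04d piece of the output name is standard printf-style formatting: raspistill substitutes the frame count, zero-padded to four digits, so the files sort in capture order.  A quick sketch in Python shows the effect:

```python
# %04d pads the frame number to four digits, which keeps the files
# in capture order when sorted alphabetically.
for frame in (1, 2, 37):
    print("lettuce_%04d.jpg" % frame)
# lettuce_0001.jpg
# lettuce_0002.jpg
# lettuce_0037.jpg
```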


So when all is said and done, I should have one picture every minute for 12 hours, which comes out to 720 pictures.  Unless you change something about the size of the photo you take, each picture will be about 2.4MB. 
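That picture count and the rough storage cost fall out of a little arithmetic (the 2.4 MB figure is the typical size I observed, not a guaranteed value):

```python
# Rough frame count and storage estimate for the 12-hour run above.
total_ms = 43200000
interval_ms = 60000
mb_per_image = 2.4              # typical size of one full-resolution JPEG

frames = total_ms // interval_ms   # 720 images
total_mb = frames * mb_per_image   # roughly 1.7 GB of storage
print(frames, total_mb)
```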


Getting the pictures from the camera can be a bit of a trick.  In the big picture sense (pun!), you will probably want to transfer these pictures to a computer with a little more processing power than the Raspberry Pi.  There are a few methods for this.  Below are the ones I researched before settling on something that I really liked.

 

  • Rsync - This method will sync a folder on the Raspberry Pi with a local folder on another machine.  This uses an SSH connection between the two machines. 
  • Dropbox - This method makes a lot of sense (though I haven't tried it yet).  It basically allows you to save the files to your Dropbox folder as it goes.  There are a lot of benefits to this method, as you can access the pictures from anywhere. 
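For the rsync option, the command that would pull new images from the Pi can be sketched like this (I build the argument list in Python so it can be checked without a second machine; the host name and paths are illustrative, not values from my setup):

```python
# Build an rsync-over-SSH command that mirrors the Pi's camera folder
# to a local folder.  -a preserves timestamps, -z compresses in transit.
def rsync_cmd(host="pi@raspberrypi.local",
              remote="/home/pi/camera/", local="./camera/"):
    return ["rsync", "-avz", "-e", "ssh",
            "%s:%s" % (host, remote), local]

print(" ".join(rsync_cmd()))
```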

 

I finally settled on using WinSCP.  This is something I ended up using for other web work I was doing.  It occurred to me that, knowing the IP address of the RPi, I could use SFTP to go in there and transfer the files I needed.  This worked great, and allowed me to use some software I was already familiar with. 

 

2014-12-08_1516.png

 

With all of the files selected and organized, I could use another piece of software to sequence the pictures into a time lapse movie.  Some photographer friends recommended Cineform Studio, which is the free software that accompanies the GoPro camera.  Just a word on this kind of thing: I am already well outside my comfort zone, and trying to discern between different photo editing packages is like asking me to learn Shakespeare in Latin.  There are probably lots of ways to go here, but this one seemed intuitive for me.

 

2014-12-08_1535.png

 

This would be the part where I put the finished video, but I am letting this run for another few hours, so that will have to wait.  But until then, I hope you have found this helpful!

So I have been tinkering with Raspberry Pis for a little while now, and I am thoroughly convinced of their utility, and I am hoping to spread that out in my classroom.

 

One example of a project I would like to do is taking temperatures using the waterproof temperature probe.  There are many others, but that is a good place to start.  My students make it a habit to bring their devices (phones, computers, tablets) to class, and it seems like I should be able to get their devices to control a probe on the RPi.  The most obvious way to do that is through a web client. 

 

So, I have some code I have been using for getting temperatures to report out to a Google spreadsheet by using gspread.  You can find it here:  https://github.com/MDScience/temperature
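At the heart of that code is reading the DS18B20 waterproof probe, which the Pi exposes as a text file (the standard one-wire path is /sys/bus/w1/devices/28-*/w1_slave).  Here is a minimal parser for that file's format, shown against a sample reading rather than live hardware:

```python
# Minimal parser for the DS18B20 one-wire output as exposed by the
# w1-therm kernel driver.  Line 1 ends in YES when the CRC is good;
# line 2 contains the raw reading after "t=" in thousandths of a degree.
def parse_w1_slave(text):
    """Return the temperature in Celsius, or None if the CRC failed."""
    lines = text.strip().splitlines()
    if not lines[0].endswith("YES"):
        return None
    _, _, raw = lines[1].partition("t=")
    return int(raw) / 1000.0

sample = ("72 01 4b 46 7f ff 0e 10 57 : crc=57 YES\n"
          "72 01 4b 46 7f ff 0e 10 57 t=23125\n")
print(parse_w1_slave(sample))  # 23.125
```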

 

I would like to improve upon it with a web client so that my students could use their devices to go to the RPi and tell the probe to start collecting temperature information once they are ready.  I imagine there would be buttons and fields for required information (like the name of the file they would like to output to, and so on).
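To make the idea concrete, here is a bare-bones sketch of that web client using only Python's standard library: a tiny HTTP server on the Pi that reacts when a student's browser hits /start.  The endpoint, port, and the `runs` stand-in are all my own illustrative choices; a real version would kick off the temperature logger instead of appending to a list.

```python
# A minimal "start the probe from your phone" server sketch.
from http.server import BaseHTTPRequestHandler, HTTPServer

runs = []  # stand-in for "start logging temperatures to this file"

class ProbeHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path.startswith("/start"):
            runs.append(self.path)       # real code would start the probe here
            body = b"logging started"
        else:
            body = b"try /start?file=lab1"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):        # keep the console quiet
        pass

# On the Pi you would run:
#   HTTPServer(("", 8080), ProbeHandler).serve_forever()
```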

 

This, of course, is just the beginning.  We would want to do more than temperature measurements, and we would need other sensors and other web clients. 

 

If there is a good tutorial, or a model to follow I would greatly appreciate knowing about it.  Similarly, if there are any engineers who have experience in this area, and would like to lend a hand, I would appreciate that as well.

 

Thanks!

I did this work in early October when I had some free time on the weekend.  As a reminder, I was interested in how different colors of glow sticks spend their energy.  I know, for example, that blue is an energetic color of light, and that red is a less energetic color of light.  For a light source, giving off red visible light means that energy is being given off in small packets.  It's like spending $100 just pennies at a time. 
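The "size of the packets" is just the photon energy, E = hc/λ.  A quick calculation (using typical wavelengths of about 450 nm for blue and 650 nm for red) shows the gap:

```python
# Photon energy E = h*c/lambda: blue light arrives in bigger
# "packets" than red.  Wavelengths are typical values I chose.
h = 6.626e-34   # Planck's constant, J*s
c = 2.998e8     # speed of light, m/s

def photon_energy(wavelength_nm):
    return h * c / (wavelength_nm * 1e-9)  # joules per photon

blue = photon_energy(450)
red = photon_energy(650)
print(blue / red)  # each blue photon carries about 1.44x a red one's energy
```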

 

So I made a simple circuit just using a photoresistor inside a wooden box.  I used an Arduino to capture the readings and sent them to a Xively account, and later downloaded that information and plotted it (thanks Peter Oakes!).  The Xively aspect of things was cool because I could see my experiment from anywhere in the world (even if I was only in the next room).  I was hoping for a gspread library so I could use a Google Doc, but I think that will have to wait.

 

At any rate, my results are below:

 

data.jpg

 

As you can see, Green and Orange behaved very similarly.  White, however, was always dim.  On all accounts, I was shocked by how quickly they faded.  I didn't expect it to be quite so exponential.  The x-axis is in seconds, which means that in a matter of minutes, the light being given off is really diminishing.  Green and orange have a sizeable difference in energy, so I was also surprised to see that they behaved so similarly.  I do, however, know that the green color is easy to make with luminescent chemicals.  Producing other colors often involves fluorescence.  That is, something with high energy emits light, which gets absorbed by something else and re-emitted as lower-energy light.  I think that might be the case here.  Still, I was surprised to see that this didn't lead to much dimming.
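That exponential fade has a simple model: if the brightness follows I(t) = I0 * exp(-k*t), the light halves every ln(2)/k seconds no matter where on the curve you start.  A sketch (the rate constant k here is made up for illustration, not fitted to my data):

```python
# Exponential decay model for glow-stick brightness.
import math

def brightness(t, I0=100.0, k=0.01):
    """I(t) = I0 * exp(-k*t); t in seconds, k is the decay rate."""
    return I0 * math.exp(-k * t)

half_life = math.log(2) / 0.01   # about 69.3 s with this made-up k
print(brightness(0))             # full brightness at t = 0
print(brightness(half_life))     # half of I0, one half-life later
```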

 

White should be a collection of a lot of colors, but not a lot of things give off white light as part of luminescence, so there is probably a lot of muting and filtering just to get white.  In other words, it doesn't work great, and don't take it camping. 

 

All in all, I am pleased with how this went.  I am not done, not by a long shot.  I wasn't able to make my light detector (the Lux sensor) work very easily with the Xively account, so I had to settle for the photoresistor.  I will repeat this soon with the Lux sensor and will play with the positioning of the glow sticks as well.

 

Thanks for all of the suggestions to make this work.  If you see anything else that is interesting, please let me know.  My goal would be to connect this to kinetics for a general chemistry class.

I am doing it.  This weekend!  I am done talking about it.  This is the weekend!

 

For years I have been imploring teachers, students, and parents to expand their mindset when it comes to the science fair.  I see a lot of

 

  • Which battery lasts the longest?
  • Which light is the best for plants?
  • Which diaper is the most absorbent?
  • Which music makes me the calmest?

 

All of these are not without merit.  They teach process and you might get a good graph out of it.  At the end of the project, however, people haven't really changed their battery purchasing habits, or diaper habits, or gardening output.  That is, they are a project for the sake of a project.  In this case they are graded and mandatory.  I think, through open source electronics like the kind this network seems to love, there is a chance to do more engaging work.

 

For years, I have been curious about the lasting power of glow sticks.  I know that, as a chemical reaction, they glow dimmer and last longer when they are cold.  I know as a consumer that they come in a number of colors.  I am curious, as a scientist, if their light output changes with their color.  So that is my question.  If I was 25 years younger, it would be my science fair project.  Now it will be my classroom project.

 

Using the Arduino Uno and the Lux Sensor, I am going to be able to measure the light coming off one of these and track its decay as a function of color.  I can also do it with temperature if the time and inspiration are there.  I am very excited, and this is the time. 

 

The only help I could use, and this is the best part of a community like this, is that I would like to be able to get my Arduino to send the information to a document like a spreadsheet, or a webapp like Xively.  Any advice there would be very helpful.
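One simple version of "send the information to a document" is to read lines from the Uno's serial port and append them to a CSV file that any spreadsheet can open.  A sketch (the port name and the pyserial dependency are assumptions on my part; the logging itself is pure standard library):

```python
# Append timestamped sensor readings to a CSV that a spreadsheet can open.
import csv, time

def log_reading(path, lux):
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([time.time(), lux])

# On the real setup (requires the pyserial package and the Uno plugged in):
#   import serial
#   port = serial.Serial("/dev/ttyACM0", 9600)
#   while True:
#       log_reading("glow.csv", port.readline().decode().strip())
```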

 

Stay tuned for results!