
Previous posts for this project:

 

 

Project Update

 

There was some nice weather in Belgium the past week, and I took the opportunity to prepare the garden for summer: mowing the lawn, planting some herbs, cleaning the terrace, and so on. This means that I didn't make a whole lot of progress on my project this week, but it doesn't mean I didn't do anything either. For this week's update, I've been combining some components I got up and running in the previous weeks, more specifically: the Touch Board and the Raspberry Pi with LED strip.

 

I hooked up the Touch Board via USB to the Raspberry Pi and had it send PLAYX (where X is the number of the electrode pressed) messages to the Pi over serial. Listing the tty devices, I determined the Touch Board was the "ttyACM0" device.

 

pi@PiDesk ~ $ ls /dev/tty*
tty        tty17      tty26      tty35      tty44      tty53      tty62
tty0       tty18      tty27      tty36      tty45      tty54      tty63
tty1       tty19      tty28      tty37      tty46      tty55      tty7
tty10      tty2       tty29      tty38      tty47      tty56      tty8
tty11      tty20      tty3       tty39      tty48      tty57      tty9
tty12      tty21      tty30      tty4       tty49      tty58      ttyACM0
tty13      tty22      tty31      tty40      tty5       tty59      ttyAMA0
tty14      tty23      tty32      tty41      tty50      tty6       ttyprintk
tty15      tty24      tty33      tty42      tty51      tty60




 

I installed "minicom" to verify the expected serial messages where being received by the Pi.

 

pi@PiDesk ~ $ sudo apt-get install minicom
Reading package lists... Done
Building dependency tree
Reading state information... Done
The following extra packages will be installed:
  lrzsz
The following NEW packages will be installed:
  lrzsz minicom
0 upgraded, 2 newly installed, 0 to remove and 22 not upgraded.
Need to get 420 kB of archives.
After this operation, 1189 kB of additional disk space will be used.
Do you want to continue [Y/n]? y
Get:1 http://mirrordirector.raspbian.org/raspbian/ wheezy/main lrzsz armhf 0.12.21-5 [106 kB]
Get:2 http://mirrordirector.raspbian.org/raspbian/ wheezy/main minicom armhf 2.6.1-1 [314 kB]
Fetched 420 kB in 2s (173 kB/s)
Selecting previously unselected package lrzsz.
(Reading database ... 78547 files and directories currently installed.)
Unpacking lrzsz (from .../lrzsz_0.12.21-5_armhf.deb) ...
Selecting previously unselected package minicom.
Unpacking minicom (from .../minicom_2.6.1-1_armhf.deb) ...
Processing triggers for man-db ...
Processing triggers for menu ...
Setting up lrzsz (0.12.21-5) ...
Setting up minicom (2.6.1-1) ...
Processing triggers for menu ...




 

Using the minicom command with the "-s" parameter, the serial port settings can be configured. I specified the correct interface and baud rate and could see the messages coming in.

 

pi@PiDesk ~ $ sudo minicom -s





 

Screen Shot 2015-05-29 at 18.58.07.pngScreen Shot 2015-05-29 at 19.17.36.png

 

To be able to read from the serial interface in Python, I tried to install the "python-serial" module. It turned out to be pre-installed.

 

pi@PiDesk ~/rpi_ws281x/python/examples $ sudo apt-get install python-serial
Reading package lists... Done
Building dependency tree
Reading state information... Done
python-serial is already the newest version.
0 upgraded, 0 newly installed, 0 to remove and 22 not upgraded.





 

After taking the "strandtest.py" NeoPixel strip example and modifying it to react to serial input, I could have the PLAY messages trigger the LED strip. It does not yet distinguish between the buttons being pressed, but the mechanism itself is working.

 

pi@PiDesk ~/rpi_ws281x/python/examples $ sudo python button.py
Press Ctrl-C to quit.
PLAY0
PLAY0
PLAY0
PLAY1
PLAY4
PLAY7
PLAY0





 

The Python script reading the serial input and triggering the LED strip can be found here:
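In the meantime, here is a minimal sketch of the approach (not the final script). It assumes the Touch Board shows up as /dev/ttyACM0 and that the rpi_ws281x Python wrapper is installed; the baud rate and LED parameters are examples and should be adjusted to match your setup.

# Minimal sketch: read PLAYX messages from the Touch Board over serial and run
# a simple colour fill on the NeoPixel strip in response. Run with sudo.
import serial
from neopixel import Adafruit_NeoPixel, Color

LED_COUNT = 60        # number of LEDs on the strip (example value)
LED_PIN = 18          # GPIO pin connected to the pixels (must support PWM)

strip = Adafruit_NeoPixel(LED_COUNT, LED_PIN, 800000, 5, False, 255)
strip.begin()

port = serial.Serial('/dev/ttyACM0', 9600, timeout=1)  # baud rate is an assumption

while True:
    line = port.readline().decode('ascii', 'ignore').strip()
    if line.startswith('PLAY'):
        print(line)
        # Light the whole strip; the final script could pick a colour per electrode.
        for i in range(strip.numPixels()):
            strip.setPixelColor(i, Color(0, 0, 255))
        strip.show()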

 

And finally, a short demo:

 

Introduction

As of today, the first part of the project is complete: the machines are assembled in the box and powered, and the architecture has been set up about 95% in accordance with the initial design. The only addition is a fourth Pi: the Cirrus audio board is very difficult to combine with other hardware on the same device, so the Pi hosting the Cirrus will work only in conjunction with the BitScope module over USB, used in reverse mode, i.e. as an acquisition and pre-processing probe instead of in its traditional role of analog and logic analyser.

 

A note on the power supply

As mentioned before, in this first experimental device Meditech will be powered by an external ATX source providing 3.3V, 5V and 12V, distributed with a simple set of power replication units as shown in the images below: the first shows the ATX connected to the main logic power supply switch, while the second shows the simple circuit of the power replicators distributed along the center of the box so the devices can be reached easily.

 

TODO: Analyse in-depth alternative powering options. As the high-voltage power supply is external to the device this solution can be acceptable, but for high-mobility scenarios a battery-operated Meditech system would be preferable. A detail that should be considered is the power consumption of the entire system, even after further power usage optimisation: each Raspberry Pi needs a supply rated for roughly 2 A, or more depending on the extra boards attached. Another factor is the operating environment: in extremely hot conditions combined with high humidity, batteries are not a reliable solution as the main power source of the system.

500W ATX power supply connected to the logic main switch unit inside the Meditech box
The simple circuit of four power replicators

 

LCD display assembly

The LCD display has been adapted to have the lowest possible profile, and the electronics have been reassembled with flexible, lightweight acrylic components and super-compact micro foam to reduce the weight of the unit as much as possible. The following image shows the final look of the 15-inch display, which now weighs about 1 kg less than the original.

IMG_20150530_132416680.jpg

TODO: A smaller display (e.g. 7 inches) may fit in the Meditech box; this will be evaluated at the end of the project to make the entire system more portable without penalising the displayed information.

 

The images below show some details of the assembled display in the current experimental version.

 

IMG_20150530_132443273.jpgIMG_20150530_132500209.jpg IMG_20150530_132512595.jpg

 

Assembled views

The following images show the assembled Meditech box as it appears when closed for transportation and when open, ready to be used. Note that an extra 12V plug is provided (to the left in the second image) to power the printer. This device can work with or without connected power, as it has its own internal battery and is accessed via a Bluetooth wireless connection.

 

IMG_20150530_132752249.jpg IMG_20150530_134041575.jpg

 

Internal view and details

The center of the box hosts the hard disk (1 TB, 3.5-inch, which will be replaced by a 2.5-inch 180 GB SSD) and the network hub, together with the power units. Note that the cables - especially the LAN connections - will be replaced by shorter ones to reduce cable clutter.

IMG_20150530_134524189.jpg

 

IMG_20150530_134449918.jpg

The image above shows the inside of the devices container. It hosts the service Pi (with GPS, accelerometer, Bluetooth for printing and other support features) in the center, and the main Pi (with the database, collector features, WiFi bridge, data organisation and remote feeding, and real-time clock) on the right side. The left side hosts the Pi dedicated to audio acquisition and to the analog/digital readings through the BitScope.

Note that on the bottom right there is a mechanical switch to detect when the box section is opened. This information, and other data related to the health status of the system (i.e. the internal temperature, the fan speed, etc.), is handled by the ChipKit board (right side), which sends critical conditions and changes in system status to the main Pi governing the entire architecture. In response to the sensor feeds, the main Pi can show high-priority alarm and warning pop-up windows, stop some functionalities, or shut down the entire system.

Introduction

In the Meditech container there is a variable number of devices working at at least two different voltages: 5V and 12V. In this prototype the definitive powering system has been moved to the bottom of the priority list, due to the tight deadlines and the need to have an experimental unit before making final decisions on battery type, charging method, power consumption, etc. For this reason power will be provided by a common ATX switching power supply. This implies a couple of conditions:

 

  • For any future change, power distribution for 3.3V, 5V and 12V will be guaranteed
  • The number of power points is variable and should be easy to increase or decrease
  • The main power source should be simple to replace, as future versions will use a different source than this first prototype

 

Main power module

The main power module simply uses an NE555 in a way similar to the logic power switch of a PC. This is a small independent circuit that will be replaced by the future battery power control.

The following images show the circuit schematic and the corresponding layout.

 

Main power distribution schematics.png Main power distribution layout.png

 

Power rail distribution

Every powering unit is a short module that fits in a plastic rail; the circuit can be positioned along the wider side of the bottom of the Meditech box so the powered devices are easy to reach. Every group is connected to the previous one and exposes all the power supply voltages and ground.

The following images show the circuit schematic and the corresponding layout.

 

Power Rail schematics.png Power Rail layout.png

Starting from the selected box that will host the Meditech prototype, the first step has been to prepare the container for the components, beginning with the placement of the cooling fan. The following image shows the general idea of how one of the two sides of the box will be used. The other side will contain the default probes, accessories, etc.

 

IMG_20150527_133141759.jpg

IMG_20150527_133150763.jpg

A modular system

To keep the system totally modular, every specialised device (based on the Raspberry Pi) is hosted on one of the yellow frames. The central zone will host only the cooling system, because it will also contain several control components, a status display, status LEDs and so on.

 

Every RPi unit is simple to install and replace for any reason, and to keep things simple the standard connectors will be left untouched, so that any non-expert user can repair a unit by simply replacing the entire block.

 

A mechanical microswitch should disable the entire power supply while the top lid is open.

 

Air circulation is provided by a 12 cm PWM-controlled fan, and the frames will be adapted so the cooling system can work properly; the fan will start when a sensor detects a critical temperature. The temperature sensor and switch will work autonomously, controlled by a program running on the ChipKit Pi board, which also controls the LEDs and other display feedback - in addition to the LCD monitor - interacting with the RPImaster that controls the entire networked system.

The storage and network hub should be fixed in the center area - together with the powering system - as they are cabled to the entire system.

 

First modification: the cooling system

The following three images show the installation of the fan and the preparation of the forced air circulation.

IMG_20150527_192904432.jpg IMG_20150527_192806681.jpg IMG_20150527_192826521.jpg

The prototype container arrived today and is just waiting to be adapted to host the Meditech components.

 

IMG_20150526_142455241.jpg IMG_20150526_142434930.jpg

 

It is fairly compact and has sufficient space for all the components; the power unit sits in the base and one of the two sides will contain probes and accessories.

For now the decision is to postpone the question of battery power and the charger system. As a reminder, the project will include: a solar cell, an external domestic power charger and a car power charger. For this first version I will adopt an external power unit (maybe an ATX switching supply); then the entire system can be tested for a correct calculation of the battery capacity, battery charger, etc. The only constraint is to avoid connecting AC voltage directly to the box.

Next images will show the box open.

 

IMG_20150526_142607270.jpg IMG_20150526_142633860.jpg

Since my last post I've made some good progress on setting up communication between the various Raspberry Pi computers over MQTT. I got a case in the mail for my RPi 2 and that bugger is now sitting in the living room with a network cable plugged directly into the cable router. At my desk I have the other two RPi's that make up the brains of the pizza carrier.

 

PiFace CAD is running on my old RPi model B and the Xtrinsic board and GPS module are on the Model B+. Both have their own wifi dongles, the B+ is getting internet via the WiPi dongle that came with the kit. The other is using the standard dongle available from Adafruit.

 

So far, I can successfully publish messages from the B+ computer to the server (running on the RPi 2) and I can subscribe to those messages on the model B. This was a major hurdle for me because while the libraries that are available for communicating over MQTT are dead simple to use, getting a grip on how it all works is not. At least for me, anyway.
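To show the pattern I'm using (a minimal sketch, not my exact scripts; the broker address and topic name below are placeholders), with paho-mqtt it boils down to this:

# Minimal publish/subscribe sketch with paho-mqtt. The broker address and topic
# are examples; in my setup the broker is the RPi 2 running mosquitto.
import paho.mqtt.client as mqtt

BROKER = "192.168.1.10"       # example address of the RPi 2
TOPIC = "pizzapi/location"    # example topic name

# Publisher side (e.g. the B+ with the GPS module)
pub = mqtt.Client()
pub.connect(BROKER, 1883, 60)
pub.publish(TOPIC, "52.37,4.89")   # e.g. a latitude,longitude pair
pub.disconnect()

# Subscriber side (e.g. the model B with the PiFace CAD)
def on_message(client, userdata, msg):
    print("%s: %s" % (msg.topic, msg.payload))

sub = mqtt.Client()
sub.on_message = on_message
sub.connect(BROKER, 1883, 60)
sub.subscribe(TOPIC)
sub.loop_forever()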

 

As mentioned in my previous posts, the server is running lighttpd and mosquitto. Mosquitto is the MQTT broker, it acts as a middleman between the subscriber and publisher clients. Lighttpd is a lightweight web server. The next hurdle was to get websockets enabled and working with mosquitto. It turns out that this was not possible until mid-last year. I have managed to get mosquitto working with websockets enabled and I can send messages using MQTT but I'm having some trouble getting messages sent over websockets via JavaScript. I'll have to keep working on this, there must be something in the config file that needs adjusting. Right now, my browser can connect to the broker but the broker gives me an error message:

 

1432607712: Socket error on client lilxbuntu, disconnecting.

 

Not sure where the socket error is coming from, so I'll have to dig deeper!

 

Once I get that fixed, though, I'll be in business and I can start building out the various interfaces! I don't think I mentioned this before, but initially in my proposal I had planned to 3D print the case that would hold the pizza box. I've since given this a lot of thought and I think a better idea is to modify the existing pizza delivery bags. There are a number of reasons for this, but the obvious one is that it will save me money. Furthermore, I think there are some pluses to having a soft case over a hard one. I kind of like the idea of modifying the bag for another reason, which is that I can dust off my sewing machine skills (and my sewing machine).

 

Other thoughts for the project before the deadline is up:

 

1) Handling multiple orders for one bag (turns out that pizza delivery bags handle at minimum two pizzas at a time).

2) Figuring out how to help the driver make speedy deliveries with PizzaPi in tow and,

3) Is there an easy and cost effective way to keep the bag heated?

 

I was supposed to be on vacation already, but I had to stay on at the lab a few extra weeks. I'm looking forward to spending endless days working on my projects at home and sleeping in a bit. Until my next update...

 

References:

http://tech.scargill.net/mosquitto-and-web-sockets/

https://goochgooch.wordpress.com/2014/08/01/building-mosquitto-1-4/

Make your Raspberry Pi into the Ultimate IoT Hub - ThingStudio Blog

Build your own Javascript MQTT Web Application

Talking Small

Paho-MQTT Open Source Messaging

Previously:

Sci Fi Your Pi - Prince Dakkar's patent log taking chart compass

Sci Fi Your Pi - Prince Dakkar's patent log taking chart compass - Functional Design

 

Route Selection and Indication

One of the main elements that makes up the design is the indication of the route that will be taken from the current location to reach the destination. This works in two parts: the selection of the route, and then its indication on the map.

 

To select the route I wanted to use some suitably dramatic wheels that turn a location roll with a selection of locations and destinations listed. These wheels will not directly drive the rolls, as this would move them too quickly, so they will be connected via a small cog on the wheel and a large cog on the roll. This means a large number of turns of the wheel is needed to move the roll. This will also add to the Steampunk aesthetic by adding functional cogs to the design.

 

01f8fcb68b6a3d3bf7db9e6b8ed9a3449490b40b70.jpg

 

 

The selection of the destination and start point will be handled by a wheel for each. The rotation of the wheel will connect different circuits which consist of strings of LEDs which will sit underneath the map to show the route to be taken.

 

Each tube will have a connector running from one side of the roll to the other. This will connect with some brushes or sprung contacts to complete the circuit with the correct string of lights.

014c5e119e185a943ddce811ab7154a6cd0d60f68f.jpg

There will probably be some resistors required to ensure this does not fry the RPi; the diagram is intended to indicate how the idea works rather than being an exact schematic. Essentially, each of the circular rolls shown in section has a connection that links two contacts to select the correct string of LEDs.

 

This selects the correct circuit; then the RPi program, when triggered (through suitably dramatic methodology), will light up the string of LEDs to show the route to be taken. The plan is to have the strings of LEDs mounted flush in a bed on which the map will sit. Thus, when the LEDs are illuminated, the route will show on the map. To make this work, some suitably bright and small LEDs will be required so the lights look right under the map (which will diffuse the light slightly, depending on the paper used to print it).
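As a rough sketch of the software side (pin numbers and timing are just examples, not the final design), the Pi only has to drive the supply of whichever string the wheels have physically selected when the trigger fires:

# Rough sketch: the wheels' contacts select which LED string is connected;
# the Pi drives the selected string when the trigger fires.
# Pin numbers and the 10-second display time are assumptions for illustration.
import RPi.GPIO as GPIO
import time

ROUTE_PIN = 17     # output feeding the selected LED string (via a resistor/driver)
TRIGGER_PIN = 27   # input from the trigger mechanism

GPIO.setmode(GPIO.BCM)
GPIO.setup(ROUTE_PIN, GPIO.OUT, initial=GPIO.LOW)
GPIO.setup(TRIGGER_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)

try:
    while True:
        GPIO.wait_for_edge(TRIGGER_PIN, GPIO.FALLING)
        GPIO.output(ROUTE_PIN, GPIO.HIGH)   # light whichever string is selected
        time.sleep(10)
        GPIO.output(ROUTE_PIN, GPIO.LOW)
finally:
    GPIO.cleanup()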

 

 

01ddf718be5647fb132bb5eafdf7453017f0393b9a.jpg

The LEDs for the non-selected routes should not be visible, with only the selected route illuminated. For stylistic reasons the routes may also be drawn on the map (like the trade routes shown on old maps). The map will also be a slightly (OK, a lot) more accurate rendering, and will be a world map rather than just part of it.

May 23 2015

Day 30 Sound Test

I have been working diligently on some of the sound bites I intend to use and on the scripting needed to trigger them at the appropriate time during sensor reading activity. Scripts will be commented appropriately when modified specifically for the Picorder operation. Final code and documentation will be provided later as the project nears completion.

 

In the accompanying video sound test, I am using the previously disassembled stereo speakers normally used with an MP3 player or smartphone. I took the system apart, mounted the speaker housing on one side of a perf board, secured it with hot glue, and mounted the accompanying circuitry on the reverse side. The original unit was powered by a 1.5 VDC power source and will later be adapted to draw its power from the Raspberry Pi supply. For the video demonstration I supplied power to the speakers with a 1.5 VDC battery and made temporary connections to the Pi's sound output jack.

 

Within the script, 4 GPIO pins are set as input triggers. Each pin is held high in its idle state. When a specific GPIO pin is pulled low, the corresponding sound bite plays through the speaker. I am using temporary sound bites from Star Trek: The Original Series for testing, and these will change to match the sensors I use later on, such as the tricorder or alert sounds. As seen in the video, pulling each GPIO pin low (to ground potential) starts the particular sound bite identified in the script and associated with that pin. Momentary triggering is essential to avoid replaying the sound bite over and over until it becomes unintelligible, as demonstrated in the video.

 

Here is the basic code used for this sound test:

#!/usr/bin/env python
import os
from time import sleep
import RPi.GPIO as GPIO

GPIO.setmode(GPIO.BCM)

# The four trigger pins; each is held high in its idle state.
GPIO.setup(18, GPIO.IN)
GPIO.setup(23, GPIO.IN)
GPIO.setup(24, GPIO.IN)
GPIO.setup(25, GPIO.IN)

while True:
    # When a pin is pulled low, play the sound bite associated with it.
    if GPIO.input(18) == False:
        os.system('mpg123 -q twohours.mp3 &')
    if GPIO.input(23) == False:
        os.system('mpg123 -q access.mp3 &')
    if GPIO.input(24) == False:
        os.system('mpg123 -q defense.mp3 &')
    if GPIO.input(25) == False:
        os.system('mpg123 -q Destruct.mp3 &')
    sleep(0.5)
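As a possible refinement (a sketch only, using the same pins and files as above; the bouncetime value is an assumption), RPi.GPIO's edge detection could replace the polling loop so each press fires the sound exactly once:

# Alternative sketch using edge detection instead of polling.
import os
import RPi.GPIO as GPIO

SOUNDS = {18: 'twohours.mp3', 23: 'access.mp3', 24: 'defense.mp3', 25: 'Destruct.mp3'}

GPIO.setmode(GPIO.BCM)

def play(pin):
    # Called once per falling edge on the given pin.
    os.system('mpg123 -q %s &' % SOUNDS[pin])

for pin in SOUNDS:
    GPIO.setup(pin, GPIO.IN)
    GPIO.add_event_detect(pin, GPIO.FALLING, callback=play, bouncetime=300)

raw_input('Press Enter to quit.\n')  # use input() on Python 3
GPIO.cleanup()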

 

Photos: speaker mount | hot glued to secure it to the board | L/R channels and amplifier | amplifier power connections | speaker powered up

 

The above assembly will be positioned inside the final casement along with the Pi and appropriate sensors.

 

Next blog post is planned to be the multiple sensor testing and scripting for the various sensors. It may also include the readout display either as a graphic or numerical display.

 

Michael

Abstract

The proposed project has a subsystem that can use the image processing capabilities of the RPi to get commands visually, which can then be transmitted to various devices over the network. In the previous post (http://www.element14.com/community/community/design-challenges/sci-fi-your-pi/blog/2015/05/16/project-virusweek-2-getting-started-with-opencv-and-rpi-camera) I covered how to install OpenCV and how to take a picture with the RPi Camera. In this tutorial, I will go through the procedure of setting up video acquisition using Python, OpenCV and the RPi Camera.

 

Why another tutorial

There are a number of tutorials on the subject of capturing video in Python; however, this series is focused on using OpenCV on the Raspberry Pi 2 with the RPi camera. For a beginner, it can be confusing to get existing example code to run with the RPi Camera since the basic functionality is a bit different. The RPi Camera is far more capable than an ordinary USB camera, since we can control much of its functionality from code, as we will see in this tutorial.

 

IMG_9591.jpg

 

Pi Camera Python Module

In the previous post, we discussed the installation of OpenCV and the PiCamera Module. I will list out the most useful functions in the module and how to use them as follows:

1. capture(output, format=None, use_video_port=False, resize=None, splitter_port=0, **options)

In the above, the filename, output format, and image size can be configured.

2. capture_continuous(output, format=None, use_video_port=False, resize=None, splitter_port=0, burst=False, **options)

Used to capture a video stream with a specific format, and size

3. capture_sequence(outputs, format='jpeg', use_video_port=False, resize=None, splitter_port=0, burst=False, **options)

Used to capture a sequence of images, for example for a time-lapse video.

4. record_sequence(outputs, format='h264', resize=None, splitter_port=1, **options)

Used to record a sequence of video clips of predefined length

5. awb_gains

Used to get or set the auto white balance gains. Useful when you are trying to control the white balance in a scene.

6. awb_mode

Used to get or set the auto white balance mode. You can set it to 'off' if the automatic white balance causes problems for image processing in a given location.

7. brightness

Used to set the brightness manually.

8. contrast

Used to set the contrast manually.

9. exposure_mode

Used to adjust the exposure mode of the camera.

10. meter_mode

Retrieves or sets the metering mode of the camera. Can be set to average, spot, matrix or backlight.

The complete list is available at (https://picamera.readthedocs.org/en/latest/api_camera.html#module-picamera.camera), but these are the ones I commonly use. Let's start by making a template for our OpenCV projects.

 

 

Pi Camera Template

Since OpenCV offers image compression out of the box, you may be tempted to use it. It will save space if you are saving the images; however, JPEG is a lossy compression format and it takes processing horsepower to compress and decompress. Instead, I would recommend capturing and processing raw images, which works better for live streams. Here is the code...

 

# import the necessary packages
from picamera.array import PiRGBArray
from picamera import PiCamera
import time
import cv2

# initialize the camera and grab a reference to the raw camera capture
camera = PiCamera()
camera.resolution = (640, 480)
camera.framerate = 30
rawCapture = PiRGBArray(camera, size=(640, 480))

time.sleep(0.1)

# capture frames from the camera
for frame in camera.capture_continuous(rawCapture, format="bgr", use_video_port=True):
    image = frame.array

    # show the frame and do stuff to it here
    cv2.imshow("Frame", image)
    key = cv2.waitKey(1) & 0xFF

    # clear the stream in preparation for the next frame
    rawCapture.truncate(0)
    # if the `q` key was pressed, break from the loop
    if key == ord("q"):
        break




 

The code above is a good starting point for your OpenCV projects, and I would use try/finally to allow cleanup. The alternative is to use the 'with' statement, which is explained here (https://www.python.org/dev/peps/pep-0343/) and used in the picamera recipes (https://picamera.readthedocs.org/en/latest/recipes1.html):

 

import time
import picamera
import picamera.array
import cv2


with picamera.PiCamera() as camera:
    # camera.start_preview()
    camera.resolution=(640,480)
    camera.framerate=30
    time.sleep(2)
    with picamera.array.PiRGBArray(camera) as rawCapture:
        time.sleep(0.1)
        for frame in camera.capture_continuous(rawCapture, format='bgr', use_video_port=True):
            image=frame.array
            cv2.imshow("Video Feed", image)
            # Do Stuff here


            key=cv2.waitKey(1) & 0xff
            rawCapture.truncate(0)
            if key==ord("q"):
                break




 

I was able to get a decent output with this and if you know of a better method, please do let me know and I will update it.

 

Capture a timelapse

Calling all 3D printer people: this is something that you will like, and I will be trying it out myself. The concept is to take a picture every few minutes (if not seconds) and then put them all together to form a video. This allows a "fast-forward" view of the subject, which may be a plant growing, ants building, a sunrise and sunset or my favourite… a model being 3D printed. The script is as simple as:

 

import time
import picamera


with picamera.PiCamera() as camera:
    camera.start_preview()
    time.sleep(2)
    for filename in camera.capture_continuous('img{counter:03d}.jpg'):
        print('Captured %s' % filename)
        time.sleep(300) # wait 5 minutes




 

That's it! You can set various parameters like AWB and the duration between the images. I recommend copying this script into a folder and running it from there, so that all the files are created in that folder only.

 

There is other stuff you can do, like streaming the video over a network, but my interest was only in capturing and processing.

 

Conclusion

I have presented a small segment of code that I hope will be useful to you all starting out. In the next episode, I will be showing you how to select objects in a live stream and then track them. See ya next time!

The HDMI Monitor

As I mentioned in a previous post, the initial idea of using a separate tablet as the display unit has been expanded and simplified by using an LCD display integrated with the device, with the advantage that the external smartphone or tablet is now used only as an access point for tethering.

The two images below show the prototype display. It was already available, so it is sufficient for testing, but it is 4:3.

 

IMG_20150523_154741157.jpg IMG_20150523_154806935.jpg

The monitor used is not standard HDMI; that is, it accepts HDMI input, but it is in fact a 4:3 screen, so you can see the image is stretched.

 

The screen resolution fine-tuning

If the monitor is not standard HDMI (frequency, etc.), there is a complete wiki page explaining how to properly set the /boot/config.txt file according to the monitor's characteristics.

Following the clearly explained settings it is possible to calibrate the GPU settings at boot to manage the display according to the screen size, resolution, aspect ratio and scan frequency. It is not possible to write a single tutorial, due to the wide number of options and the fact that in most cases these settings strictly depend on the characteristics of a specific screen. In fact, it is not too risky to experiment if the boot is set not to start the graphical environment immediately: if something goes wrong it is always possible to step back to the previous settings without problems. Following this empirical method I have set up the display by changing the following parameters (the order in which they appear in config.txt does not matter):

 

# Custom resolution settings
hdmi_group=2
# Set to 1024x768 60Hz
hdmi_mode=16

# Set the screen to composite PAL (maybe meaningless for the HDMI output)
sdtv_mode=2
# Screen aspect ratio (maybe meaningless for the HDMI output)
sdtv_aspect=2

 

The following image shows the correct proportions on the screen after reboot.

 

IMG_20150523_181650029.jpg

 

Now the proportions are correct, but the screen resolution was too low, as far as I know, for this LCD. Keeping the same aspect ratio and the same refresh rate, the screen settings were changed again to

 

#  Custom resolution settings
hdmi_group=2
# 1280x960 60 Hz
hdmi_mode=32

 

As shown in the next image, the aspect ratio and resolution are now correct (note the different size and arrangement of the icons on the desktop in these last two images). All the settings tables are included in the mentioned wiki page, which is attached to this post in PDF format.

 

IMG_20150523_184534335.jpg

My kit has arrived (yes, almost 45 days; that's a downside of living in Brazil), so I will keep this updated weekly. Any news in the next posts...

 

Thanks all, and good luck to all of us.

Hi everybody

 

It's time for a new step in the Cybernetic Interface development - audio interface configuration.

Because the Raspberry Pi lacks a microphone input, which I need for my project, I had to add and configure an external USB sound card.

Due to space constraints I picked the smallest I could find, the Konig 3D Sound, based on the C-Media CM108 audio controller.

konig_usb_sound.jpg

It worked on the first try, plugged directly into the RPi USB port or into a USB hub.

The configuration and testing steps are listed below:

 

Check sound card configuration:

 

cat /proc/asound/cards ->

 

0 [ALSA          ]: bcm2835 - bcm2835 ALSA

                      bcm2835 ALSA

1 [Device        ]: USB-Audio - USB PnP Sound Device

                      C-Media Electronics Inc. USB PnP Sound Device at usb-bcm2708_usb-1.2, full speed

 

lsusb ->

 

Bus 001 Device 002: ID 0424:9514 Standard Microsystems Corp.

Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub

Bus 001 Device 003: ID 0424:ec00 Standard Microsystems Corp.

Bus 001 Device 004: ID 0d8c:013c C-Media Electronics, Inc. CM108 Audio Controller

 

show that the USB sound card is visible. Next, edit alsa-base.conf to load snd-usb-audio as first option:

 

sudo nano /etc/modprobe.d/alsa-base.conf

 

Change configuration to make USB sound card the default one:

 

options snd-usb-audio index=-2

to

options snd-usb-audio index=0

 

and after a sudo reboot, cat /proc/asound/cards -> should look like this:

 

 

0 [Device        ]: USB-Audio - USB PnP Sound Device

                      C-Media Electronics Inc. USB PnP Sound Device at usb-bcm2708_usb-1.2, full speed

1 [ALSA          ]: bcm2835 - bcm2835 ALSA

                      bcm2835 ALSA

 

 

If not already installed, install alsa-base, alsa-utils and mpg321 (or mpg123, mplayer, etc.) :

 

sudo apt-get update

sudo apt-get upgrade

sudo apt-get install alsa-base alsa-utils mpg321

sudo reboot

 

Next, edit /etc/asound.conf and change the playback and capture devices from "internal" to "usb". Mine looks like this:


sudo nano /etc/asound.conf

 

pcm.usb {
    type hw
    card Device
}

pcm.internal {
    type hw
    card ALSA
}

pcm.!default {
    type asym
    playback.pcm {
        type plug
        slave.pcm "usb"
    }
    capture.pcm {
        type plug
        slave.pcm "usb"
    }
}

ctl.!default {
    type asym
    playback.pcm {
        type plug
        slave.pcm "usb"
    }
    capture.pcm {
        type plug
        slave.pcm "usb"
    }
}

 

To be sure, do another reboot and proceed to test the configuration.

 

To check configuration I used:

 

amixer -c 0 - to display current settings. Mine looks like this:

 

pi@cyberpi ~$ amixer -c 0

Simple mixer control 'Speaker',0

  Capabilities: pvolume pswitch pswitch-joined penum

  Playback channels: Front Left - Front Right

  Limits: Playback 0 - 151

  Mono:

  Front Left: Playback 119 [79%] [-6.06dB] [on]

  Front Right: Playback 119 [79%] [-6.06dB] [on]

Simple mixer control 'Mic',0

  Capabilities: pvolume pvolume-joined cvolume cvolume-joined pswitch pswitch-joined cswitch cswitch-joined penum

  Playback channels: Mono

  Capture channels: Mono

  Limits: Playback 0 - 127 Capture 0 - 16

  Mono: Playback 96 [76%] [17.99dB] [off] Capture 0 [0%] [0.00dB] [on]

Simple mixer control 'Auto Gain Control',0

  Capabilities: pswitch pswitch-joined penum

  Playback channels: Mono

  Mono: Playback [on]


and

 

alsamixer -c 0 - to modify speakers and microphone levels.

 

Using mpg321 (or mpg123/mplayer/aplay/other) and a favourite test sound file, plug headphones or speakers into the external sound card output and check that the sound plays correctly. It did.

 

pi@cyberpi ~$ mpg321 /home/pi/test.mp3

 

Ok, with playback working, let's check the recording side.

 

Plug microphone into USB soundcard input and launch:

 

pi@cyberpi ~$ arecord -D plughw:0,0 -f cd ./test.wav

 

Use Ctrl+C to stop recording.


Check the result -> pi@cyberpi ~$ mpg321 test.wav -> success .

 

If needed, use "alsamixer -c 0" to adjust sound levels to meet your requirements.

 

That's it. Now I have both audio playback and recording on Raspberry PI.

Next step is to do some speech recognition and implement the audio aided menus.

 

All the best

-=Seba=-

Week 3 Blog

This week has seen more Python coding to produce menus and test out the capabilities of the PiFaceCAD, but it is not elegant enough to share just yet.  I need to cobble together the service element from the Sysinfo program and the Internet Radio program so that I can get the Pi to start up the generator as a service on boot, so that users don't need a console or TTY connection to use it.

 

I do have some pictures to share, though, as I have been looking at the possibility of using one of the intelligent touch screen displays from 4D Systems in Oz (http://www.4dsystems.com.au).  I will still produce a PiFaceCAD version, even if I do keep this interface as an option.  The big issue with the 4D display is the parameter passing between the display module and the Pi, so the PiFaceCAD version is likely to be less of a challenge.  Here are some screenshots of the basic pages from a uLCD-43PT, which has a 480x272 display with resistive touch screen and a Picaso processor (the uLCD-43DT with the Diablo processor is recommended for new designs).

 

Startup Screen                                                        Casting Options

I Ching Startup ScreenOptions Screen

 

Casting Option 1                                                       Casting Option 2

TDC_2015_0356.gifTDC_2015_0357.gif


Casting Option 3                                                      Casting Option 4


TDC_2015_0358.gifTDC_2015_0359.gif

 

 

Settings Page


Settings Page


Kit Supply Update


Dave Hamblin of Element 14 contacted me regarding the bits that were not delivered.  It seems that apart from the Pi A+, which I did not need, I might be getting the remaining bits at some stage.  I was intending to use the Wolfson Audio card to do the voice output for the interpretation.  Also, the I Ching is supposed to be a random creation based on the state of the universe at the time it is cast, so the RTC shim and GPS/accelerometer bits would give the user options of what inputs to select as seeds for the random number generator, as sketched below.
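Just as an illustration of that seeding idea (a sketch only; the get_gps_fix() and get_acceleration() helpers are hypothetical placeholders for whichever modules end up being used):

# Sketch: mix the selected "state of the universe" inputs into the random seed.
import random
import time

def cast_seed(use_gps=False, use_accel=False):
    parts = [time.time()]                 # RTC / system time is always available
    if use_gps:
        parts.extend(get_gps_fix())       # e.g. (latitude, longitude) - hypothetical helper
    if use_accel:
        parts.extend(get_acceleration())  # e.g. (x, y, z) - hypothetical helper
    random.seed(hash(tuple(parts)))

cast_seed()
# Six lines of a hexagram, values 6-9 (uniform distribution here, for illustration only)
print([random.choice((6, 7, 8, 9)) for _ in range(6)])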

 

I will be off work next week with fewer other interruptions to get in the way, so expect more progress by this time next week.

Application Information
http://www.element14.com/community/community/design-challenges/sci-fi-your-pi/blog/2015/04/22/some-information-from-my-application

ChipKit Pi Vs Arduino Pro Mini
http://www.element14.com/community/community/design-challenges/sci-fi-your-pi/blog/2015/05/01/quick-update-on-the-quadcop-and-the-chipkit-pi

Quadcopter Assembled (You call that a Quadcopter?)
http://www.element14.com/community/community/design-challenges/sci-fi-your-pi/blog/2015/05/06/quadcopter-assembled

QuadCop -The Control Switch
http://www.element14.com/community/videos/16202/l/control-switch-explanation

Quad Cop with ChipKit Pi - An "Experience" with Innovation Required

http://www.element14.com/community/community/design-challenges/sci-fi-your-pi/blog/2015/05/07/quad-cop-with-chipkit-pi--an-experience-with-innovation

The Raspberry Pi goes for a fly!  With Pi cam

http://www.element14.com/community/community/design-challenges/sci-fi-your-pi/blog/2015/05/11/quadcop-the-raspberry-pi-goes-for-a-fly

 

For the Raspberry Pi Flight System (RPFS) I want the Pi to do all the GPS manipulation, parsing and calculations.  Working with the GPS is only one of several tasks the RPFS will be doing.  Because the Pi 2 now has multiple cores, multi-tasking is a really good option.

 

The Microstack GPS is connected to the Pi's serial port and spits out NMEA strings.  For more information see http://www.gpsinformation.org/dale/nmea.htm.  NMEA is a common format for exchanging GPS data.

One way to do multi tasking within a single application is to use threads.  In GNU C, there is a thread library called PThreads that is very easy to use.  I decided I would write a multi-threaded object class that can easily be reused in any project with minimal effort.  To scope out what it takes to make this possible, here is what needs to be accomplished:

 

  1. Read the data from the serial port on the Pi, which is /dev/ttyAMA0.  This reading must be interrupt driven and buffered.  We don't want to miss any bytes.
  2. Parse the NMEA data coming from the GPS via the serial port, perform validation and fill out variables with all the information that the GPS can provide.
  3. Provide an easy way to access the variables that is thread safe.  What this means is that when reading the variables, a flag must be set so they are not updated in the middle of the read; otherwise odd results will come through.
  4. An instance of the object can be created and started, and it will begin parsing the data in the background and provide information as requested.  No more interaction is required by the host program.  Basically a "start and forget" mentality.

 

 

 

After doing some research, I found how to read the serial port with an interrupt as well as a small pre-written library for parsing NMEA data.  The library is called TinyGPS++ and it is actually written for the Arduino.  I ported it over to the Raspberry Pi for use with C and C++.  You can find the TinyGPS++ library here: http://arduiniana.org/libraries/TinyGPS/

 

I then wrapped the serial code and the TinyGPS++ code into a nice multithreaded GPS class.

Here is an example of how to use my new C++ class which has gps.h and gps.cpp as well as the 2 TinyGPS++ code files.

 

 

#include <iostream>
#include <unistd.h>   // for sleep()
using namespace std;
#include "gps.h"

int main(void)
{
        GPS *gps = new GPS();
        gps->Initialize();
        gps->Start();
        while(1)
        {
                sleep(5);
                cout << gps->GetAge() << endl;
                cout << gps->GetLat() << endl;
                cout << gps->GetLong() << endl;
                cout << gps->GetAlt() << endl << endl;
        }
        delete gps;
        return 0;
}
















 

 

As you can see, it is as easy as creating the GPS object, initializing and starting it. I have implemented a few functions for testing:

GetAge - How old the GPS coordinates are in seconds.  The Microstack sends new information every 1 second so typically the age of the coordinates is around .3 to .5 seconds.  This is a good thing to check as if the coordinates are too old you may want to wait for an update.

GetLat - Get the current latitude

GetLong - Get the current longitude

GetAlt - Get the current altitude

 

The reason I use functions instead of directly accessing variables is because we need thread safe reads:

double GPS::GetLat()
{
        double l;
        readBlock = true;
        l = currentLat;
        readBlock = false;
        return l;
}









 

The readBlock flag is checked in the threaded update code, and as long as the flag is set to true, it won't update the variables.

 

If anyone is interested in using my GPS object above, please let me know.  I can send the code out as well as how to compile it.

It is designed for the RPi and Microstack.  I have some cleanup to do in the code but I am happy to pass it along.  Once I get it completed I will be putting it up on my personal blog.  But if you want to get started with the GPS now with C++, here is a way to get going.

 

Another update is that I have the ChipKit Pi working completely as a control switch.  Here is a demo flight I did with the control switch reading my radio inputs and sending them to the flight controller.  A small step in the right direction.  I am working hard to get some auto flight code done, and having the GPS working is a HUGE step in that direction.  It doesn't look like much, but a lot is going on.  The ChipKit Pi is reading the PWM signals from the Rx and then passing them to the flight controller on the quad via the software PWM library.  A switch on the radio will put the Raspberry Pi into control, and my radio inputs will be ignored except to put it back into manual control at my request.  It's been very windy the last 2 weeks, so I had to settle for a simple hover inside; a bit nerve-wracking.  This is in fact the first test of this.  Did it turn out ok?

 

DSCN2014.JPG

 

Here is a video of the short hover flight:

 

 

Here is some rambling about what is going on so far with this. It is a bad video, my apologies; I just felt I needed some "proof", since the flight looks like any other flight.

Introduction

The Meditech architecture is based on a certain number of Raspberry Pis (initially three in the current prototype), each one dedicated to managing specific tasks. These tasks are strictly related and also need to be synchronised in time: for example, the database records on the RPImaster device should carry a timestamp consistent with the network node where the event was collected. Moreover, some continuous data collections come from different devices and should report the same timing with at least 1/10th of a second precision.

 

To these functional aspects we should add that there are conditions that do not allow the devices to stay synchronised with Internet time, because of poor signal, Internet connection unavailability and so on. This is the reason I have adopted internal time synchronisation using the NTP protocol.

 

Reference documentation

A good, yet not too complex, explanation of how the NTP protocol works and of the difference between an NTP client and server can be found in this article (also attached as a PDF to this post). The concepts are clearly explained, but as our needs are slightly different it should not be treated as a cut-and-paste document; it should be interpreted.

 

Meditech NTP client-server architecture

 

NTP Server

IMG_20150520_152259496.jpg

First of all, the RPIslave1 device, which is the network NTP server, hosts the PiFace real-time clock board. The internal network is not accessible from the outside because of the RPImaster bridge, but when needed every internal device is allowed to access the Internet, limited to certain protocols only. RPIslave1 is the only device that accesses the Internet NTP servers, when a connection is available, to update its RTC. When no connection is available, the RTC can still provide the right time to synchronise all the other devices, because RPIslave1 is configured as an NTP server.

 

NTP Bridge client-server

Meditech hosts internally two different networks:

  • A wired network through the eth0 devices for all the devices hosted inside the Meditech box (net 192.168.5.0/24)
  • A WiFi network through the wlan0 device for all the other devices that should be removable and battery-operated (net 192.168.1.0/24)

The configuration of the NTP protocol in the RPImaster device is a bit uncommon because it is the client of the RPIslave1 device but is also a secondary server, bridging the NTP protocol between the two networks.

 

NTP Wireless client

The RPIslave2 device, connected via WiFi, can't access the wired Ethernet network directly, so its NTP protocol is configured to use the RPImaster secondary NTP server.

 

Common to all devices

Except for the NTP server, access to the Internet NTP servers has been disabled on all the other devices, resulting in a time-synchronised network based on a single primary NTP server.

 

NTP configurations in practice

The NTP configuration parameters are stored in the /etc/ntp.conf configuration file, which should be edited according to the network configuration. The three configuration files are attached to this post in the ntp_conf.zip file.

The next paragraphs show only the parts of ntp.conf that are meaningful for this kind of configuration.

 

NTP main server (RPIslave1)

As this is the main NTP server, the default configuration of the public servers remains untouched: if the Internet connection is available, the internal date and time are updated from one of the accessible public servers.

 

# You do need to talk to an NTP server or two (or three).
#server ntp.your-provider.example

# pool.ntp.org maps to about 1000 low-stratum NTP servers.  Your server will
# pick a different set every time it starts up.  Please consider joining the
# pool: <http://www.pool.ntp.org/join.html>
server 0.debian.pool.ntp.org iburst
server 1.debian.pool.ntp.org iburst
server 2.debian.pool.ntp.org iburst
server 3.debian.pool.ntp.org iburst

 

The NTP server is configured to grant access to the Meditech internal networks so that any client can ask for NTP updates.

 

# The internal networks for both the Ethernet and WiFi LANs have full access
# to this server (no cryptography needed).
# If you want to grant access only to encrypted clients, add "notrust" at the end of every network definition
restrict 192.168.1.0 mask 255.255.255.0
restrict 192.168.5.0 mask 255.255.255.0

# Authorised clients subnets
broadcast 192.168.1.255
broadcast 192.168.5.255

 

NTP bridge client-server

RPImaster is configured as both an NTP client and a (secondary) NTP server. For testing purposes only (this device has a 1 TB hard disk for data storage), extensive NTP logging has been enabled.

 

# Enable this if you want statistics to be logged.
# For NTP internal server testing and data logging
statsdir /var/log/ntpstats/

statistics loopstats peerstats clockstats
filegen loopstats file loopstats type day enable
filegen peerstats file peerstats type day enable
filegen clockstats file clockstats type day enable

 

Only the RPIslave1 NTP server is enabled, while the Internet NTP pool is disabled (left commented out):

 

# You do need to talk to an NTP server or two (or three).
# Internal server
server 192.168.5.1

# Internet NTP servers are disabled to manage a unique synchronized
# time between the entire network
# pool.ntp.org maps to about 1000 low-stratum NTP servers.  Your server will
# pick a different set every time it starts up.  Please consider joining the
# pool: <http://www.pool.ntp.org/join.html>
#server 0.debian.pool.ntp.org iburst
#server 1.debian.pool.ntp.org iburst
#server 2.debian.pool.ntp.org iburst
#server 3.debian.pool.ntp.org iburst

 

RPImaster is also configured as a (secondary) NTP server to grant access to the second, WiFi network through its bridging features.

 

# Clients from this (example!) subnet have unlimited access
# This device is the network bridge so relaunch the NTP server
# to the wlan0 connected devices.
restrict 192.168.1.0 mask 255.255.255.0

# If you want to provide time to your local subnet, change the next line.
# (Again, the address is an example only.)
broadcast 192.168.1.255

 

NTP client

The client shown here accesses the Meditech network through WiFi, so its NTP server is RPImaster. Obviously the same configuration settings (with the correct IP address) are also valid for clients using the RPIslave1 NTP server.

 

# Access the secondary NTP server RPImaster via the WiFi connection
server 192.168.1.99

# pool.ntp.org maps to about 1000 low-stratum NTP servers.  Your server will
# pick a different set every time it starts up.  Please consider joining the
# pool: <http://www.pool.ntp.org/join.html>
#server 0.debian.pool.ntp.org iburst
#server 1.debian.pool.ntp.org iburst
#server 2.debian.pool.ntp.org iburst
#server 3.debian.pool.ntp.org iburst

 

Remember to restart the NTP service

After configuring the /etc/ntp.conf file, restart the NTP service on every device:

 

$> sudo /etc/init.d/ntp restart

Previous posts for this project:

 

 

Project Update

 

I was originally planning on using the PiFace Digital 2 to control relays to turn an analog LED strip embedded in the desk on or off. Since then, I came across a library to control addressable LED strips (or NeoPixels) via the Pi's GPIO. The library doesn't function with the Pi 2 (yet), but it does with the A+/B+ which I was planning to use for the desk controls anyway.

 

One concern I was having, is that the addressable LED strips operate at 5V. I wasn't sure a 3.3V GPIO pin from the Raspberry Pi would be sufficient to control the strip and that I might have to use some logic level conversion. Well, luck was on my side, as it appears that no conversion is required and the strip can be controlled directly from the Pi's GPIO.

 

A guide on how to control the LEDs and install the library can be found here: Sci Fi Your Pi: PiDesk - Guide: Controlling NeoPixels with the Raspberry Pi A+/B+

 

 

I can now proceed by defining custom animations based on triggers such as received emails or hashtags being used on twitter for example.

 

photo (7).JPG

 

Introduction

 

Addressable LEDs or NeoPixels are typically used in combination with an Arduino or similar microcontroller, due to the timing critical signal required to control them. An SBC such as the Raspberry Pi is not suited for such realtime GPIO activities, as the Linux operating system runs other tasks in parallel. Or at least that was the case until Jeremy Garff found a way to use the DMA (Direct Memory Access) module to transfer bytes of memory between different parts of the processor, without using the CPU and thus not being interrupted by the Pi's OS.

 

This procedure works for all Raspberry Pi models except version 2!

 

Software

 

Jeremy Garff has written a library called "rpi_ws281x", which can be found on his GitHub page: https://github.com/jgarff/rpi_ws281x. It makes use of the Pi's BCM2835 PWM module to drive the controllable WS281X LEDs found in NeoPixel strips and rings. The folks at Adafruit have created a Python wrapper for the library along with some Python example scripts, making it look and feel like the Arduino NeoPixel library. So if you're familiar with NeoPixels on the Arduino, you should be up and running with this version in no time.

 

To compile and install the library, follow the steps below.

 

First, install the dependencies required to download and install the library:

sudo apt-get update
sudo apt-get install build-essential python-dev git scons swig







 

Next, download the files and build the library:

git clone https://github.com/jgarff/rpi_ws281x.git
cd rpi_ws281x
scons







 

Finally, install the Python wrapper:

cd python
sudo python setup.py install







 

As you can see, these steps are very straightforward.

 

Hardware

 

Hooking up the NeoPixels to the Raspberry Pi is extremely easy; just make sure the power supply used is properly rated for the number of NeoPixels you intend to use. For testing, I used a 5V/4A power supply to power the Pi and the NeoPixels (12 and 60 LEDs).

 

Screen Shot 2015-05-20 at 10.29.31.png

Make sure the ground signals of the NeoPixel strip/ring and the Raspberry Pi are connected. If they are not, the LEDs won't function properly and will light up in unpredictable patterns.

 

Even though the LED strip/ring operates at 5V and the Pi's GPIO at 3.3V, it appears that it is possible to drive the LEDs without having to use logic level conversion.

 

 

Demo

 

I tested two components:

  • an Adafruit NeoPixel ring with 12 LEDs
  • an addressable WS2811 60 LED strip from eBay

 

Both performed as expected using the sample script (strandtest.py), which I edited to configure the correct number of LEDs:

 

# LED strip configuration:
LED_COUNT      = 60      # Number of LED pixels.
LED_PIN        = 18      # GPIO pin connected to the pixels (must support PWM!).
LED_FREQ_HZ    = 800000  # LED signal frequency in hertz (usually 800khz)
LED_DMA        = 5       # DMA channel to use for generating signal (try 5)
LED_BRIGHTNESS = 255     # Set to 0 for darkest and 255 for brightest
LED_INVERT     = False   # True to invert the signal (when using NPN transistor level shift)

 

As you can see, other parameters can also be edited to match the LEDs used, as well as settings such as the brightness.
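For reference, once the wrapper is installed, using it boils down to something like this (a minimal sketch, not taken from the project; the colour and values are just for illustration):

# Minimal sketch: light the whole strip red with the rpi_ws281x Python wrapper.
# Run with sudo, since the library needs direct hardware access.
from neopixel import Adafruit_NeoPixel, Color

LED_COUNT, LED_PIN = 60, 18
LED_FREQ_HZ, LED_DMA = 800000, 5
LED_INVERT, LED_BRIGHTNESS = False, 255

strip = Adafruit_NeoPixel(LED_COUNT, LED_PIN, LED_FREQ_HZ, LED_DMA,
                          LED_INVERT, LED_BRIGHTNESS)
strip.begin()

for i in range(strip.numPixels()):
    strip.setPixelColor(i, Color(255, 0, 0))
strip.show()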

photo 1.JPGphoto 2.JPG

 

After a full discharge, the battery used for testing was recharged with the simple circuit mentioned in the post "Powering the camera probe" for a few hours, to check whether the charging circuit - experimental and very simple - has any problems.

 

IMG_20150518_163635831.jpg

Test conditions

 

  • 7.5 V / 1500 mAh Li-Ion rechargeable battery
  • Full charge done (about 3 hours) with the LM350-based battery charger
  • Power regulator based on the LM7805
  • Raspberry Pi B+ with the camera running continuously with the frame capture program
  • Frames written to an NFS remote mount to keep the WiFi connection always working
  • PiFace Display 2 showing a continuously updated message (the system status test) with the backlight enabled
  • The entire system packed in cardboard and exposed to fairly hot sunlight

 

The duration was measured from power-on until one of the components showed problems due to low power.

The entire system worked continuously for 63 minutes; after that the camera continued to stream data, but the LCD display backlight went off and the Raspberry Pi power LED (the red one) turned off.

The Pi temperature shown on the display never rose above 43 °C.

The ZD1211 choice

As mentioned in previous posts, the Camera Probe (that is, RPIslave2) will include a WiFi connection. To make the board more compact and to use a cheap but reliable WiFi connection, the choice was the ZD1211. In the Linux environment it is a sort of "standard", as the firmware and kernel modules supporting this device are widely available in many Linux distributions, especially for embedded Linux devices.

 

As a matter of fact, any WiFi dongle supporting this firmware may be considered suitable for our environment. I have tested in the past several WiFi dongles based on the ZD1211 under Ubuntu 8 and 10, OpenWRT, LTIB and other very custom Linux builds, including a small embedded Linux device developed for Nintendo a couple of years ago. The other advantage is that this firmware is so widespread that it is probably one of the first to be supported by any new Linux distribution; and we can consider Raspbian a recent Debian distro without any doubt.

 

Attached to this post is the working procedure to install the kernel module (loaded automatically) to get the WiFi working on the Raspberry PI. To date I have tested it on all models of the PI with no problems at all.
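Once the module is in place, a quick sanity check is to verify that the driver has been loaded and the dongle recognised. The module name below (zd1211rw, the usual in-kernel driver for these chipsets) is an assumption, so adapt it to whatever the attached procedure installs:

$>lsmod | grep zd1211
$>dmesg | grep -i zd1211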

Introduction

As mentioned in previous posts, the RPIslave2 device is equipped with the PI camera to capture still frames for imaging diagnostics. This probe should be easy to move outside the Meditech unit; it remains connected to the rest of the system via WiFi and should be battery-powered.

The powering characteristics should be the following:

 

  • When hosted in the Meditech unit the device is powered up and runs in stand-by mode: some features work, but it is not possible to manage the camera.
  • While it is powered in the Meditech unit its battery is charged by a simple circuit that should avoid battery damage when it reaches full charge.
  • When the device is extracted from the Meditech unit, the charger is replaced without interruption by the device battery.
  • The actual battery - based on the tests so far - should be about 1200-1500 mAh.
  • A 5V power regulator should be used to guarantee a stable 1000-1200 mA at 5V.
  • A small cooling fan may be needed while the device is working.

 

IMG_20150517_181801356.jpg

Power test circuit

The power test circuit currently uses a 7.5 V Li-Ion rechargeable battery, 1500 mAh. As shown in the image there are two parts (they will become a single, smaller circuit). The regulated 5 Vcc power is provided by an LM7805, while the battery is also connected to a simple, small recharging device.

At the moment the recharging circuit fulfils its function of cutting power to the battery when it is very near full charge, thanks to the final resistor, which should be calculated based on the battery capacity. I have chosen to supply a slightly lower voltage from the charging unit than the battery and - in this case of the 1500 mAh battery - to limit the current to about 1000 mA. The recharging circuit is made for now with a couple of LM350s. I am nevertheless considering building the recharging unit with an op-amp (an LM358 may be fine) and feedback with three LEDs: charging, charged, low battery.

 

Issues

The system has been tested for many hours and responded well. I think an alternative solution should be adopted for the power regulator, finding a way to replace the LM7805: this component provides a perfectly stable output but dissipates too much power, especially when the RPIslave is running with all the features (camera, WiFi unit, image streaming etc.).
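To give a rough idea of why the LM7805 runs hot: being a linear regulator, it burns the whole voltage difference as heat. The 1 A load below is an assumption based on the figures above, not a measurement:

# Rough estimate of the power a linear regulator dissipates as heat.
# All values are assumptions taken from the test setup described above.
v_in = 7.5    # battery voltage (V)
v_out = 5.0   # regulated output (V)
i_load = 1.0  # assumed average load current (A)

p_dissipated = (v_in - v_out) * i_load
print("Dissipated in the LM7805: %.1f W" % p_dissipated)  # about 2.5 W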

Introduction

Inside the Meditech architecture the RPImaster platform is the device with the main server role, coordinating the system of probes, storing data, streaming and more. In a few words, the RPImaster device will handle the following tasks:

  1. Bridging between the internal network and the external internet connection
  2. Apache web server with http and https protocols and PHP support
  3. MySQL database server
  4. Unique exit point to the WiFi access point
  5. High level probes using the Bitscope Micro in inverted mode
  6. Handling the microscope inspection camera

 

Based on the Qt 5.3 environment, the RPImaster unit will also include the custom User Interface, which allows many simplifications and more ergonomic usage.

 

An important change in the UI strategy

(more details on the UI design and features, based on Qt5, will be discussed in further documents)

IMG_20150517_092040107.jpg

The successful adoption of the Qt 5.3 environment installed on the device made it possible to improve the original UI design strategy. As a matter of fact, the external tablet that was dedicated to acquiring visual information from the Meditech unit, showing it to the local operator and, when needed, sending it to a remote support unit, has been eliminated: at this point any mobile Android device can be used for the remote connection, sharing its internet connection.

 

Production costs reduction

The choice of the Qt 5.3 version was driven by the availability of all the libraries needed to assemble a well-working environment, which has proven to be stable and very responsive. The relatively limited resources of the Raspberry PI, among other factors, also led to the choice of the bare Qt C++ environment for the UI graphic design.

 

This change has also a positive impact on the costs:

    • Any mobile device can be used, as the only required feature is Internet tethering (WiFi to 3G/4G)
    • Any available WiFi access point can be used
    • The average cost of an Android tablet (about $150-$200) is replaced by an HDMI monitor or - in the cheapest case - an LCD AV device

Data exchange security improvement

The RPImaster device will communicate peer-to-peer with the remote assistance unit using a set of tools provided by another Meditech device, with some modifications. The secured communication between the two units through the Internet gives a lot of options to better protect the communication protocol while exchanging sensitive data (patient information, data analysis, location information etc.).

 

Qt 5.3 testing

The following video shows some calculation and graphic tests of Qt 5.3 C++ sample applications running on the Raspberry PI 2

 

Prelude

My initial design was to build a set of robots around the Raspberry Pi and use other components in the kit to build the control systems. Since I am yet to receive the kit, I have moved on to researching the image capture and recognition part of the project.

 

Introduction

OpenCV (Open Source Computer Vision) is a library of programming functions mainly aimed at real-time computer vision, developed by Intel Russia research center in Nizhny Novgorod, and now supported by Willow Garage and Itseez. It is free for use under the open-source BSD license. The library is cross-platform. It focuses mainly on real-time image processing. If the library finds Intel's Integrated Performance Primitives on the system, it will use these proprietary optimized routines to accelerate itself.

 

OpenCV has gained a lot of attention over the last few years, and with the advent of the RPi and other single board computers it is now possible to have simpler image manipulation tasks executed on DIY systems. One module in my sci-fi design uses gesture recognition, and for that I intend to use an RPI 2 dedicated to image processing and sensing.

 

I am going to start by installing OpenCV on the RPI and then using it to simply acquire an image. I will be using C++ for the more advanced stuff since I am more comfortable with it than Python, but I will do the initial stuff using Python 2.7.

 

Let's get started...

 

Installing OpenCV

Now there are a LOT of guides out there for this, BUT I am giving you the details of "my way" of doing the install. If I make a mistake and you have a better way, please leave a comment and I will update it here.

 

The first thing is to download the OpenCV library for Raspberry Pi and to do that we go to http://opencv.org/downloads.html

 

I chose the latest stable version, which was 2.4.10, and I think that should work for the most part.

 

We also need to install some dependencies. Run the following commands:

 

sudo apt-get -y install build-essential cmake cmake-curses-gui pkg-config libpng12-0 libpng12-dev libpng++-dev libpng3 libpnglite-dev zlib1g-dbg zlib1g zlib1g-dev pngtools libtiff4-dev libtiff4 libtiffxx0c2 libtiff-tools libeigen3-dev

 

and then

 

sudo apt-get -y install libjpeg8 libjpeg8-dev libjpeg8-dbg libjpeg-progs ffmpeg libavcodec-dev libavcodec53 libavformat53 libavformat-dev libgstreamer0.10-0-dbg libgstreamer0.10-0 libgstreamer0.10-dev libxine1-ffmpeg libxine-dev libxine1-bin libunicap2 libunicap2-dev swig libv4l-0 libv4l-dev python-numpy libpython2.6 python-dev python2.6-dev libgtk2.0-dev

 

Next, unzip the archive and set up the build directory using

 

unzip opencv-2.4.10.zip
cd opencv-2.4.10
mkdir release
cd release
ccmake ../

 

Press ‘c’ to configure and toggle the options you want. I did the following:

 

ANT_EXECUTABLE                   ANT_EXECUTABLE-NOTFOUND                                                                                              
BUILD_DOCS                       ON                                                                                                                   
BUILD_EXAMPLES                   ON                                                                                                                   
BUILD_JASPER                     ON                                                                                                                   
BUILD_JPEG                       ON                                                                                                                   
BUILD_OPENEXR                    ON                                                                                                                   
BUILD_PACKAGE                    ON                                                                                                                   
BUILD_PERF_TESTS                 ON                                                                                                                   
BUILD_PNG                        ON                                                                                                                   
BUILD_SHARED_LIBS                ON                                                                                                                   
BUILD_TBB                        OFF                                                                                                                  
BUILD_TESTS                      ON                                                                                                                   
BUILD_TIFF                       ON                                                                                                                   
BUILD_WITH_DEBUG_INFO            ON                                                                                                                   
BUILD_ZLIB                       ON                                                                                                                   
BUILD_opencv_apps                ON                                                                                                                   
BUILD_opencv_calib3d             ON                                                                                                                   
BUILD_opencv_contrib             ON                                                                                                                   
BUILD_opencv_core                ON                                                                                                                   
BUILD_opencv_features2d          ON                                                                                                                   
BUILD_opencv_flann               ON                                                                                                                   
BUILD_opencv_gpu                 ON                                                                                                                   
BUILD_opencv_highgui             ON                                                                                                                   
BUILD_opencv_imgproc             ON                                                                                                                   
BUILD_opencv_legacy              ON                                                                                                                   
BUILD_opencv_ml                  ON                                                                                                                   
BUILD_opencv_nonfree             ON                                                                                                                   
BUILD_opencv_objdetect           ON                                                                                                                   
BUILD_opencv_ocl                 ON                                                                                                                   
BUILD_opencv_photo               ON                                                                                                                   
BUILD_opencv_python              ON                                                                                                                   
BUILD_opencv_stitching           ON                                                                                                                   
BUILD_opencv_superres            ON                                                                                                                   
BUILD_opencv_ts                  ON                                                                                                                   
BUILD_opencv_video               ON                                                                                                                   
BUILD_opencv_videostab           ON                                                                                                                   
BUILD_opencv_world               OFF                                                                                                                  
CLAMDBLAS_INCLUDE_DIR            CLAMDBLAS_INCLUDE_DIR-NOTFOUND                                                                                       
CLAMDBLAS_ROOT_DIR               CLAMDBLAS_ROOT_DIR-NOTFOUND                                                                                          
CLAMDFFT_INCLUDE_DIR             CLAMDFFT_INCLUDE_DIR-NOTFOUND                                                                                        
CLAMDFFT_ROOT_DIR                CLAMDFFT_ROOT_DIR-NOTFOUND                                                                                           
CMAKE_BUILD_TYPE                 Release                                                                                                                
CMAKE_CONFIGURATION_TYPES        Debug;Release                                                                                                              
CMAKE_INSTALL_PREFIX             /usr/local

CMAKE_VERBOSE                    OFF                                                                                                                  
CUDA_BUILD_CUBIN                 OFF                                                                                                                  
CUDA_BUILD_EMULATION             OFF                                                                                                                  
CUDA_HOST_COMPILER               /usr/bin/gcc                                                                                                         
CUDA_SDK_ROOT_DIR                CUDA_SDK_ROOT_DIR-NOTFOUND                                                                                           
CUDA_SEPARABLE_COMPILATION       OFF                                                                                                                  
CUDA_TOOLKIT_ROOT_DIR            CUDA_TOOLKIT_ROOT_DIR-NOTFOUND                                                                                       
CUDA_VERBOSE_BUILD               OFF                                                                                                                  
EIGEN_INCLUDE_PATH               /usr/include/eigen3                                                                                                  
ENABLE_NEON                      OFF                                                                                                                  
ENABLE_NOISY_WARNINGS            OFF                                                                                                                  
ENABLE_OMIT_FRAME_POINTER        ON                                                                                                                   
ENABLE_PRECOMPILED_HEADERS       ON                                                                                                                   
ENABLE_PROFILING                 OFF                                                                                                                  
ENABLE_SOLUTION_FOLDERS          OFF                                                                                                                  
ENABLE_VFPV3                     OFF                                                                                                                  
EXECUTABLE_OUTPUT_PATH           /home/pi/opencv-2.4.8/release/bin                                                                             
GIGEAPI_INCLUDE_PATH             GIGEAPI_INCLUDE_PATH-NOTFOUND                                                                                        
GIGEAPI_LIBRARIES                GIGEAPI_LIBRARIES-NOTFOUND                                                                                           
INSTALL_CREATE_DISTRIB           OFF                                                                                                                  
INSTALL_C_EXAMPLES               OFF                                                                                                                  
INSTALL_PYTHON_EXAMPLES          OFF                                                                                                                  
INSTALL_TO_MANGLED_PATHS         OFF                                                                                                                  
OPENCV_CONFIG_FILE_INCLUDE_DIR   /home/pi/opencv/opencv-2.4.8/release                                                                                 
OPENCV_EXTRA_MODULES_PATH                                                                                                                             
OPENCV_WARNINGS_ARE_ERRORS       OFF                                                                                                                  
OPENEXR_INCLUDE_PATH             OPENEXR_INCLUDE_PATH-NOTFOUND                                                                                        
PVAPI_INCLUDE_PATH               PVAPI_INCLUDE_PATH-NOTFOUND                                                                                          
PYTHON_NUMPY_INCLUDE_DIR         /usr/lib/pymodules/python2.7/numpy/core/include                                                                      
PYTHON_PACKAGES_PATH             lib/python2.7/dist-packages                                                                                          
SPHINX_BUILD                     SPHINX_BUILD-NOTFOUND                                                                                                
WITH_1394                        OFF                                                                                                                  
WITH_CUBLAS                      OFF                                                                                                                  
WITH_CUDA                        OFF                                                                                                                  
WITH_CUFFT                       OFF                                                                                                                  
WITH_EIGEN                       ON                                                                                                                   
WITH_FFMPEG                      ON                                                                                                                   
WITH_GIGEAPI                     OFF                                                                                                                  
WITH_GSTREAMER                   ON                                                                                                                   
WITH_GTK                         ON                                                                                                                   
WITH_JASPER                      ON                                                                                                                   
WITH_JPEG                        ON                                                                                                                   
WITH_LIBV4L                      ON                                                                                                                   
WITH_NVCUVID                     OFF

WITH_OPENCL                      ON                                                                                                                   
WITH_OPENCLAMDBLAS               ON                                                                                                                   
WITH_OPENCLAMDFFT                ON                                                                                                                   
WITH_OPENEXR                     ON                                                                                                                   
WITH_OPENGL                      ON                                                                                                                   
WITH_OPENMP                      OFF                                                                                                                  
WITH_OPENNI                      OFF                                                                                                                  
WITH_PNG                         ON                                                                                                                   
WITH_PVAPI                       ON                                                                                                                   
WITH_QT                          OFF                                                                                                                  
WITH_TBB                         OFF                                                                                                                  
WITH_TIFF                        ON                                                                                                                   
WITH_UNICAP                      OFF                                                                                                                  
WITH_V4L                         ON                                                                                                                   
WITH_XIMEA                       OFF                                                                                                                  


WITH_XINE                        OFF

 

 

For the most part I just left things at the defaults, except for enabling the JPEG, PNG and TBB related options.

 

Press 'c' again to configure and then 'g' to generate the makefile.

 

This should drop you back to the command prompt.

 

Next build with

 

make

 

I did this whole thing on an RPi 2 with a class 10 card. Yes, it makes a difference, since class 10 cards have faster access rates.

 

It took around 3.5 hours and if you happen to do something wrong, do a

 

make clean
make

 

Lastly do a

 

make install

 

and then reboot. This should have you up and running.

 

Testing everything

 

I am assuming that you are using the RPi Camera and that you have enabled it using raspi-config. If not, then please refer to https://www.raspberrypi.org/help/camera-module-setup/

You need to either attach a monitor to the RPi or access it via VNC as we did in the last post. Start the window manager by typing

 

startx

 

You should now have the window system running and be able to see the desktop. Start a new terminal and create a new folder by typing

 

mkdir opencv_tests

 

Next create a new file by typing

 

cd opencv_tests
leafpad test1.py

 

I am using Leafpad as it's just simple to use in the window environment.

 

Type the following lines

 

# import the necessary packages
from picamera.array import PiRGBArray
from picamera import PiCamera
import time
import cv2

# initialize the camera and grab a reference to the raw camera capture
camera = PiCamera()
rawCapture = PiRGBArray(camera)

# allow the camera to warmup
time.sleep(0.1)

# grab an image from the camera
camera.capture(rawCapture, format="bgr")
image = rawCapture.array

# display the image on screen and wait for a keypress
cv2.imshow("Image", image)
cv2.waitKey(0)

 

Save the file and close Leafpad, which will drop you back to the LXTerminal.

Now just type

 

python test1.py

 

And that should display an image in a new window. Press any key and it will close the window and return you to the command prompt.
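As a small optional extension of test1.py (assuming the capture above worked; the output filename is made up), you can already let OpenCV process the captured frame, for example converting it to grayscale and saving it:

# Optional extension of test1.py: convert the captured frame to grayscale
# and save it to disk. Assumes 'image' is the BGR array captured above.
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
cv2.imwrite("test1_gray.jpg", gray)

# Show both versions in separate windows
cv2.imshow("Original", image)
cv2.imshow("Grayscale", gray)
cv2.waitKey(0)
cv2.destroyAllWindows()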

 

Conclusion

This was a lot of fun, and I think you should start with Python even if you have used OpenCV in C in the past. It's much simpler, and with the power of the RPI2 the lag is almost gone. I will be implementing some gesture recognition while I wait for the kit to be delivered.

See you next time...

The Machine-2-Machine protocol, MQTT, has really forced me to rethink how to organize the whole communication process of this project. MQTT stands for Message Queue Telemetry Transport and is touted as the protocol for the developing world of IoT (Internet of Things). Developed by Andy Stanford-Clark and Arlen Nipper, MQTT has been around since 1999. I heard about it while scouring the Raspberry Pi forum for information on communicating between two Pi's over TCP/IP.

 

Here's what the official website says about MQTT:

 

MQTT is a machine-to-machine (M2M)/"Internet of Things" connectivity protocol. It was designed as an extremely lightweight publish/subscribe messaging transport. It is useful for connections with remote locations where a small code footprint is required and/or network bandwidth is at a premium. For example, it has been used in sensors communicating to a broker via satellite link, over occasional dial-up connections with healthcare providers, and in a range of home automation and small device scenarios. It is also ideal for mobile applications because of its small size, low power usage, minimised data packets, and efficient distribution of information to one or many receivers.

The central point of MQTT is the broker. The broker is like a base station for communication between sensors and subscribers. From my cursory research I've concluded that the broker really ought to be located on a computer that is robust and well, not running around inside a pizza box. MQTT tolerates flaky connections at the endpoints, but it assumes the broker itself stays up. The broker will hang onto the sensor data until connections are reestablished. What this means for PizzaPi is that I need a third dedicated Raspberry Pi server that can be kept safe and sound with a reliable internet connection. There are a number of brokers available, but Mosquitto is the only open source version that I've found and it is the one I am running.
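To make the publish/subscribe idea concrete, here is a minimal sketch of a subscriber (and, in a comment, a publisher) talking to a Mosquitto broker. The broker hostname, the topic and the use of the paho-mqtt Python package are my assumptions for illustration, not part of the PizzaPi code:

# Minimal MQTT subscriber sketch using the paho-mqtt client library.
# Broker hostname and topic are made-up placeholders.
import paho.mqtt.client as mqtt

BROKER = "broker.example.local"
TOPIC = "pizzapi/gps"

def on_connect(client, userdata, flags, rc):
    print("Connected to broker, subscribing to %s" % TOPIC)
    client.subscribe(TOPIC)

def on_message(client, userdata, msg):
    print("%s -> %s" % (msg.topic, msg.payload))

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect(BROKER, 1883, 60)

# A sensor node would publish readings to the same topic, e.g.:
# client.publish(TOPIC, "lat=40.0,lon=-105.0")

client.loop_forever()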

 

Here's my updated hardware infrastructure:

PizzaPi Infrastructure Diagram

 

The diagram shows how the various devices/users/subscribers interact with the broker and web server that will run on the kit RPi 2. The customer and pizza store will really only directly interact with the web server while the kit RPi B+ and my original RPi B will interact with the broker. Before, I had intended to have the web server run on RPi 2 which is currently hooked up to the PiFace Control & Display 2. The two RPi's would then communicate directly over TCP and then any end users like the pizza store and customer would get information from RPi 2. Clearly, this leads to reliability problems and only running the PiFace CAD is a waste of the upgraded memory and speed on RPi 2, hence the Pi shuffle.


I'll post again soon; the little GPS light is blinking, reminding me that I need to get down to business!

IMG_20150513_142059796.jpg

This is the first preview of the microscope camera that will be one of the probes connectable to the Meditech unit for micro-surface investigation, e.g. skin, small details and so on. The video shows an example at about 200X magnification. As the magnification increases, the depth of field (DOF) becomes shorter.

IMG_20150513_142107447.jpg IMG_20150513_142120712.jpg

The device will be protected at the front by a transparent disc, and thanks to the internal LED ring light it can be placed directly on the subject surface.

 

Next step: stream the real-time images over a reliable protocol to other devices. The following video shows the microscope in action.

 

Previous posts for this project:

 

 

Project Update

 

Last week, I was experimenting with the lift mechanism for the display of the Pi to be used as a desktop computer.

 

That was a temporary display for testing purposes. And even though the screen had a good resolution, it was too small to be fully legible.
As an alternative, I cracked open an old (and broken) laptop and recycled the display. I was able to recover the screen without too much trouble and went searching for a controller online.

 

eBay seems to be full of these things at a very cheap price when sticking to classic VGA and DVI outputs. Adding HDMI seems to triple the cost! Because I already own (and don't use) an HDMI to DVI adapter, I chose to buy a VGA/DVI-only controller board, setting me back about 10 EUR. The controller board even came with an inverter, connectors and control buttons (menu, volume, etc ...). The recycled display itself is a 17" widescreen with a maximum resolution of 1440x900. It's not full HD, but it's not bad at all for a (nearly) free display.

 

After hooking everything up, the laptop display was functioning properly! One of the next steps for the display will be to make a lightweight enclosure to hold all the parts in place.

 

Till next time!

 

{gallery} PiDesk - Display

photo 1.JPG

Parts: Recycled display with new controller, inverter, controls and cables.

photo 5.JPG

Connected: Getting everything hooked up.

photo 4.JPG

Test: Display is working!

Introduction

Instead of using the Samba protocol - which consumes more resources and is only needed when files must be shared with Windows machines, other desktops or for other specific needs - a better solution to share folders between several Linux machines is the NFS file system.

 

Server install and setting

Installing the server side on Raspbian should take into account a couple of issues that affect this Debian distro. First of all, we have two options to install the server: nfs-server and nfs-kernel-server, but only the second works well in this Debian distro without issues. The Debian documentation site only says that the latter is strongly suggested on the most recent distributions. The problem is that this choice deserves a warning, because the other NFS server alternative also installs on Raspbian without problems.


Installation

So we proceed installing the NFS server, the NFS common utilities and the portmapper:


 

$>sudo apt-get install nfs-kernel-server nfs-common portmap

 


Note: the portmap package (port mapping service) must be installed, but it is probably already present on the system; it depends on what other packages you have previously installed on your Raspberry. So don't worry if, after the installation, you see that the package is already installed and you read that there is a problem starting the NFS server.


Configuration

After the package installation has finished, we should configure the NFS server, specifying how to share the folders that will be mounted by the remote Linux machines, by editing the exports file

 

$>sudo nano /etc/exports

 

Every folder that should be shared with the NFS clients must be listed in this configuration file. The following configuration line relates to the case we are managing; for further details on the exports configuration file syntax, read the exports NFS configuration documentation.

 

/home/stream xxx.xxx.xxx.xxx/0(rw,sync,no_subtree_check,no_root_squash)

 

Despite what I have found in many places on the Internet, it is mandatory to specify the NFS server IP and the /0 subnet (for the case of a single device sharing a folder).

 

Restart and fine tuning

At this point, we should restart the NFS service with the command

 

$>sudo service nfs-kernel-server restart

 

Surprisingly, the server seems unable to restart, showing a message like the response listed below:

 

$>sudo service nfs-kernel-server restart
[ ok ] Stopping NFS kernel daemon: mountd nfsd.
[ ok ] Unexporting directories for NFS kernel daemon....
[ ok ]   Exporting directories for NFS kernel daemon....
[....] Starting NFS kernel daemon: nfsd
[warn] Not starting: portmapper is not running ... (warning).

Showing the RPC call status with the command

 

$> rpcinfo -p

 

we see something like the following:

 

rpcinfo: can't contact portmapper: RPC: Remote system error - No such file or directory

 

This occurs because of the issue mentioned above. The port mapping service is correctly installed on the system, but it seems that Raspbian does not start it by default at boot. The workaround is to explicitly add the port mapping binding to the boot sequence via the rc.d configuration with the following command:

 

$>sudo update-rc.d rpcbind enable && sudo update-rc.d nfs-common enable

 

At this point, after a reboot, the system works correctly. Test it by restarting the NFS service after a reboot

 

$>sudo  /etc/init.d/nfs-kernel-server restart

 

to see the correct restart sequence notifications on the terminal

 

pi@RPIslave2 ~ $ sudo /etc/init.d/nfs-kernel-server restart
[ ok ] Stopping NFS kernel daemon: mountd nfsd.
[ ok ] Unexporting directories for NFS kernel daemon....
[ ok ] Exporting directories for NFS kernel daemon....
[....] Starting NFS kernel daemon: nfsdrpc.nfsd: address family inet6 not supported by protocol TCP
mountdrpc.mountd: svc_tli_create: could not open connection for udp6
rpc.mountd: svc_tli_create: could not open connection for tcp6
rpc.mountd: svc_tli_create: could not open connection for udp6
rpc.mountd: svc_tli_create: could not open connection for tcp6
rpc.mountd: svc_tli_create: could not open connection for udp6
rpc.mountd: svc_tli_create: could not open connection for tcp6


Take into account that the warnings related to tcp6 and udp6 depend on the network using only IPv4 and not IPv6, so they don't affect the server functionality. For a permanent mount, the mount parameters should be added to the /etc/fstab file of the client computer.
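As an example, a possible /etc/fstab line on the client (using the addresses and paths that appear in the mount output further below; adapt them to your own setup) could be:

# /etc/fstab on the client: mount the shared stream folder at boot
192.168.5.3:/home/pi/stream  /mnt/stream  nfs  defaults  0  0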


Client installation and setting

Client installation is quite simple, as we only need to install the portmapper and the NFS client.

 

Install the packages

 

$>sudo apt-get install nfs-common

 

Then create the folder where the remote mount should be mapped

 

$>sudo mkdir -p <client mount folder>

 

Now mount the remote folder (with its full path) on the local folder of the client machine.

 

$>sudo mount xxx.xxx.xxx.xxx:/<server full path shared folder> <client mount folder>

 

That's all!

 

With the mount command you will see an output like this (showing my specific case as an example)

 

$>mount
/dev/root on / type ext4 (rw,noatime,data=ordered)
devtmpfs on /dev type devtmpfs (rw,relatime,size=437856k,nr_inodes=109464,mode=755)
tmpfs on /run type tmpfs (rw,nosuid,noexec,relatime,size=88432k,mode=755)
tmpfs on /run/lock type tmpfs (rw,nosuid,nodev,noexec,relatime,size=5120k)
proc on /proc type proc (rw,nosuid,nodev,noexec,relatime)
sysfs on /sys type sysfs (rw,nosuid,nodev,noexec,relatime)
tmpfs on /run/shm type tmpfs (rw,nosuid,nodev,noexec,relatime,size=176860k)
devpts on /dev/pts type devpts (rw,nosuid,noexec,relatime,gid=5,mode=620,ptmxmode=000)
/dev/mmcblk0p5 on /boot type vfat (rw,relatime,fmask=0022,dmask=0022,codepage=437,iocharset=ascii,shortname=mixed,errors=remount-ro)
rpc_pipefs on /var/lib/nfs/rpc_pipefs type rpc_pipefs (rw,relatime)
/dev/mmcblk0p3 on /media/SETTINGS_ type ext4 (rw,nosuid,nodev,relatime,data=ordered,uhelper=udisks)
192.168.5.3:/home/pi/stream on /mnt/stream type nfs4 (rw,relatime,vers=4.0,rsize=65536,wsize=65536,namlen=255,hard,proto=tcp,port=0,timeo=600,retrans=2,sec=sys,clientaddr=192.168.5.2,local_lock=none,addr=192.168.5.3)

The last line shows the remote /home/pi/stream folder mounted on the local /mnt/stream folder. You will also see the mounted folder - which behaves very much like removable media - with the disk free (df) command, as shown in the example below.

 

$> df -h
Filesystem                   Size  Used Avail Use% Mounted on
rootfs                       877G  4.3G  828G   1% /
/dev/root                    877G  4.3G  828G   1% /
devtmpfs                     428M     0  428M   0% /dev
tmpfs                         87M  304K   87M   1% /run
tmpfs                        5.0M     0  5.0M   0% /run/lock
tmpfs                        173M     0  173M   0% /run/shm
/dev/mmcblk0p5                60M   15M   45M  25% /boot
/dev/mmcblk0p3                27M  442K   25M   2% /media/SETTINGS_
192.168.5.3:/home/pi/stream  5.8G  2.6G  2.9G  48% /mnt/stream

A component that is very important for my project arrived: the projector!

projector-box.jpg

The very generic-looking box

Projectors can get really expensive, but as you can see I went with a budget version. For around 50 bucks you can find yourself a Chinese-made projector that does 480x320 and has a brightness of 100 lumen. If you're thinking that's pretty pathetic, I agree. I decided to spend a little extra and got myself one that did 640x480 (at only 80 lumen) for 70 bucks. This way R2-D2's projector would at least meet the VGA standard.

 

After unboxing and inspecting it at work, one of my colleagues remarked that it made a rattling noise. Some part was loose inside the projector. He handed me a screwdriver and convinced me to take apart the projector I'd owned for all of 10 minutes.

 

2015-05-05 17.15.17.jpg

Don't worry, I put it back together again


The rattling sound came from one of the speakers. It was supposed to be hot-glued to the case but had come loose. Opening it up really drove home how cheaply it was made. The plastic lens is something you'd find in a pair of toy binoculars.

 

After I took it home I hooked it up to a Pi and saw if they would play nice. Initially, they didn't. I was using a bargain bin HDMI cable and I just couldn't get the Pi to detect anything. I swapped it out with an AmazonBasics cable that I had a bit more confidence in and after that it worked fine.

projector.jpg

Hooked up to HDMI and happy

The projected image itself is very blurry, even after fiddling with the lens. I could get it into focus but even then it wasn't really... sharp. I logged in to the Pi with SSH and set up x11vnc. This allowed me to get a better look at what was going on.

Compare a screenshot of my VNC client to the actual image being projected.

pi-vnc.png

pi-projector.jpg

It's hard to believe but that's exactly the same image being shown. The projector presents itself with 720p as its only available resolution and then squishes that widescreen image down to 640x480. The result is predictably awful. I don't really need amazing image quality for this project, but this falls short even of that low bar. If you're at all interested in buying a projector, I really can't recommend a device like this. Save yourself the headache and get something decent.

Introduction and usage

One of the most versatile probes of Meditech is the camera, which covers a meaningfully wide range of applications, briefly described below:

 

  • Screening of areas, like skin or injury details, for post-intervention documentation or to describe particular environmental characteristics
  • Series of short timed shots to see the reaction to some kind of stimulation (e.g. allergy tests, eye reactions and so on)
  • Image detail extraction to enhance specific characteristics
  • False-colour representation and low-contrast image enhancement
  • Image measurement
  • More

 

The camera probe should be relatively small and lightweight, battery operated and - unlike the other parts of the Meditech "block" (the main case) - needs to be managed independently, possibly near the patient. In accordance with these specifications, the "camera probe" (it is not yet certain what other associated probes it can contain) should have the following characteristics:

 

  • Water-resistant (not for liquid immersion, but it should not be damaged by rain, water drops and so on)
  • Flip-up camera lens enabling automatic time-lapse image shooting
  • Battery operated, so that when the probe fits into its housing in the Meditech main case it is automatically put on charge, ready for when it is extracted
  • One or two button(s) to control the required features
  • 16x2 alphanumeric LCD controlled by a 74HC595 shift register
  • A white-light LED ring around the camera lens
  • Battery charger

 

Setting the capture side

To capture images in the different conditions that are needed, raspistill has all the required features: shooting time settings, a single repeated image or differently named frames, frequency, duration, resolution and a lot of other controls. So a couple of commands for starting and stopping the camera have been created; these will evolve into a more complex one, controlled by the button, managing all the camera features mentioned above.

 

startcam.sh

#!/bin/sh
# Start still frame and stream to the server

echo "`date +%y/%m/%d_%H:%M:%S`: stream_start" 1>>/home/pi/stream.log
# Mount remote share
sudo mount 192.168.1.99:/home/pi/stream /home/pi/stream >>/home/pi/stream.log

# Image numbering
# for test only
#raspistill -w 640 -h 480 -q 10 -o /home/pi/stream/pic%04d.jpg -tl 200 -t 9999999 -th 0:0:0 -n >>/home/pi/stream.log 2>>/dev/null &

#Fixed name
raspistill -w 640 -h 480 -q 10 -o /home/pi/stream/pic.jpg -tl 50 -t 9999999 -th 0:0:0 -n >>/home/pi/stream.log 2>>/dev/null &

 

stopcam.sh

#!/bin/sh
# Stop still frame

echo "`date +%y/%m/%d_%H:%M:%S`: stream_stop" 1>>/home/pi/stream.log
sudo killall raspistill >>/home/pi/stream.log 2>>/home/pi/stream.log
sudo umount /home/pi/stream

 

These commands append a status log to the file stream.log.

To start the camera still sequence, you can see that a remote folder is mounted first: this was the issue that required the most testing time, due to the way the streamer interprets newly added files.

For continuous still capture, to save disk space and avoid problems, the command always uses the same name for every frame. The result is that on every new shot the same file is overwritten.

A more detailed description of the file sharing strategy over the network, for both the client and the server, is given in the next paragraph.

 

The sharing strategy

The question of sharing files over the network Meditech is based on (it uses three different Raspberry PIs) has been solved using the NFS file system. The entire installation procedure will be discussed in a separate post.

The principle is that the RPIMaster Raspi device includes large storage - a 1 TB USB hard disk - so one of the roles of this unit is to collect and store all the data produced for any reason by the entire system. Adopting this kind of centralised architecture makes it possible to increase the number of future optional probes without altering the system in any way.

The approach is the following:

 

RPIMaster is the centralised device with the large storage system

RPISlave2 is the device dedicated to manage the camera features

 

The still image streaming process

  1. RPISlave2 starts capturing images and storing them in the ~/stream folder, which is a shared folder on the RPIMaster
  2. RPIMaster runs the mjpg-streamer program, which serves every new image detected over http on port 8090 of the RPImaster IP address
  3. The display control device (the Android tablet) accesses the images

 

The streamer application

A note on the mjpg-streamer application is worthwhile. It is a lightweight Linux utility that manages the whole streaming process with several options: it can send the files to a folder, to the HTTP output and so on. The problem is in the application plugin that detects the images: when a new frame is detected but is then, for some reason, deleted or changed, it stops the acquisition and the streaming port becomes unresponsive. This event occurs randomly, because the streamer on RPImaster is not synchronised with the remote device that shares the images. So in some cases, a few milliseconds after an image has been recognised by the streamer on the RPImaster, the same image is replaced by a new frame in the RPIslave2 sequence.

To avoid this problem I had to modify the plugin so that - only for testing purposes - when this event occurs it is just noted in a log file. As soon as the streamer has been fully tested, the updated version will be shared in a post so it is available to other users with the same problem.

 

The streaming command

Also in this case, a simplified bash command has been created to start the streaming process when needed.

 

#!/bin/bash
#Start streaming

echo "`date +%y/%m/%d_%H:%M:%S`: stream_start" 1>>/home/pi/stream.log
/home/pi/mjpg-streamer/mjpg_streamer -i "/home/pi/mjpg-streamer/input_file.so -f /home/pi/stream" -o "/home/pi/mjpg-streamer/output_http.so -p 8090 -w /home/pi/mjpg-streamer/www" 0>>/home/pi/stream.log 1>>/home/pi/stream.log 2>>/home/pi/stream.log >>/home/pi/stream.log 2>>/dev/null &

 

The following video shows how this architecture works

 

frellwan

FTP Code Working

Posted by frellwan May 11, 2015

I have ordered a USB to RS232 cable (USB-RS232 WE) to be able to communicate with the AB SLC PLC

I have ordered a USB to RS422 cable (USB-RS422 WE) to be able to communicate with the Fenner M-Trim controller

 

FTP Communication

While waiting for the above cables to arrive, I have been working to get the FTP communications working

 

     OEE data

     Data will be read from the PLC every minute and stored to a local file. The local file will be sent to the server every hour.

    

     Recipe data

     Recipe data will be stored on the same server. Each recipe will have its own file on the server. A list of those files will be stored in a separate file with a specific name. The plan is to retrieve that single file and then iterate through its lines, downloading the recipes to local storage on the Pi.

 

 

The Twisted framework is being used to handle the communication in this project. The FTPClient protocol has been adapted to allow the 'APPE' instruction. A LoopingCall is used to schedule sending the OEE data to the server every hour (3600 seconds) and retrieving the recipe file from the server every 24 hours.

 

One drawback that I did find with the Twisted framework was that when I manually disconnected the network cable from the Pi, it took about 20 minutes for the code to realize the connection was lost. There wasn't any lost data as far as I could see; the data was transferred when the connection was reestablished.

 

So week 1 was a success in establishing FTP communications. Later in the project I will make the times selectable through the PIFace controller.

 

   

 

from twisted.internet import reactor
from twisted.internet.task import LoopingCall
from twisted.internet.protocol import ReconnectingClientFactory
from twisted.protocols.ftp import FTPClient
# Note: the Options class and the fail errback used below are defined
# elsewhere in the project and are not shown in this snippet.

class FTPClientA(FTPClient):
    """ **************************************************************
        Protocol subclass from FTPClient to add 'APPE' support allowing
        the ability to append data to a file.

        Also using connectionMade method to start data transfer loops
    ************************************************************** """
    def __init__(self, username = 'anonymous', password = 'anonymous@', passive = 1):
        FTPClient.__init__(self, username, password, passive)
        self.OEELoop = LoopingCall(sendOEEData, self)
        self.RecipeDownload = LoopingCall(retrieveRecipes, self)

    def connectionMade(self):
        """ ****************************************************************
        Called when a connection is made.

        This may be considered the initializer of the protocol, because
        it is called when the connection is completed.  For clients,
        this is called once the connection to the server has been
        established; for servers, this is called after an accept() call
        stops blocking and a socket has been received.  If you need to
        send any greeting or initial message, do it here.
        ***************************************************************** """
        self.OEELoop.start(3600).addErrback(fail)
        self.RecipeDownload.start(86400).addErrback(fail)

    def connectionLost(self, reason):
        """ ****************************************************************
        Called when the connection is shut down.

        Clear any circular references here and any external references to
        this protocol. The connection has been closed.

        @type reason: L{twisted.python.failure.Failure}
        ****************************************************************** """
        print "connection lost"
        self.OEELoop.stop()

    def appendFile(self, path):
        """ ******************************************************************
        Append to a file at the given path

        This method issues the 'APPE' FTP command

        @return: A Tuple of two L{Deferred}s:
                 -L{Deferred} L{IFinishableConsumer}. You must call
                  the C{finish} method on the IFinishableConsumer when the file
                  is completely transferred.
                 -L{Deferred} list of control-connection responses.
        ****************************************************************** """
        cmds = ['APPE ' + self.escapePath(path)]
        return self.sendToConnection(cmds)

    appe = appendFile

class FTPClientAFactory(ReconnectingClientFactory):

    def buildProtocol(self, addr):
        self.resetDelay()

        p = FTPClientA(username='anonymous', password='anonymous@')
        p.factory = self
        return p


    def clientConnectionLost(self, connector, reason):
        """ *************************************************************
        Called when a connection has been lost after it was connected.

        @type reason: L{twisted.python.failure.Failure}
        ************************************************************* """
        print 'Lost connection.  Reason:', reason
        ReconnectingClientFactory.clientConnectionLost(self, connector, reason)

    def clientConnectionFailed(self, connector, reason):
        """ *************************************************************
        Called when a connection has failed to connect.

        @type reason: L{twisted.python.failure.Failure}
        ************************************************************* """
        print 'Connection failed. Reason:', reason
        ReconnectingClientFactory.clientConnectionFailed(self, connector, reason)

 

def run():
    # Get config
    config = Options()
    config.parseOptions()
    config.opts['port'] = int(config.opts['port'])
    config.opts['passive'] = int(config.opts['passive'])
    config.opts['debug'] = int(config.opts['debug'])
    
    # Create the client
    connector = reactor.connectTCP(config.opts['host'], config.opts['port'], FTPClientAFactory())

    reactor.run()
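
The two LoopingCall callbacks referenced in the constructor (sendOEEData and retrieveRecipes) are not shown in the snippet above. A purely hypothetical sketch of what they could look like, reusing the appe command added above and Twisted's FileSender (file names and paths are made up), might be:

from twisted.internet.protocol import Protocol
from twisted.protocols.basic import FileSender

class FileWriter(Protocol):
    """Very small protocol that writes everything it receives to a local file."""
    def __init__(self, path):
        self.f = open(path, 'wb')
    def dataReceived(self, data):
        self.f.write(data)
    def connectionLost(self, reason):
        self.f.close()

def sendOEEData(ftpClient):
    # Append the locally buffered OEE data to the hourly file on the server.
    dConsumer, dResponses = ftpClient.appe('oee/oee_data.csv')
    def transfer(consumer):
        d = FileSender().beginFileTransfer(open('/home/pi/oee_data.csv', 'rb'), consumer)
        d.addCallback(lambda _: consumer.finish())
        return d
    dConsumer.addCallback(transfer)
    return dResponses

def retrieveRecipes(ftpClient):
    # Download the recipe list file; the individual recipe files named in it
    # would be fetched the same way with further retrieveFile() calls.
    return ftpClient.retrieveFile('recipes/recipe_list.txt',
                                  FileWriter('/home/pi/recipes/recipe_list.txt'))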

Hello

 

The parts arrived a week ago and I was glad to find all essential pieces inside.

 

The first step was to check the RaspberryPi_V2 and install Raspbian - 2015-02-16 at that time (I saw a new one, 2015-05-05, has come out since) - and do the usual configuration: expand the filesystem, enable the camera and configure the network for remote access.

 

This was my first contact with the RaspberryPi_V2 and I'm pleased with it. The improved CPU, the Micro-SD slot and the 4 USB ports are the most important points for me so far.

 

Wireless Network Adapter

 

I'll use the RaspberryPi headless, so I need a wireless network adapter connected via USB.

For this purpose I'm using the included WiPi, which performed very well. I have tried a handful of wireless USB adapters and I can say the WiPi is the most reliable and trouble-free so far.

It really worked out of the box, the connection is very stable, and it worked both connected directly to the RaspberryPi_V2 and through external USB hubs, powered and unpowered.

 

 

Real time clock

 

Because the RaspberryPi doesn't have a Real Time Clock (RTC), and by design my project is not always connected to a network/the internet, I need an external RTC.

In this case I use the provided PiFace Real Time Clock.

Installation and configuration were pretty easy. One thing I had to do before following the official PiFaceRTC instructions is to enable I2C from raspi-config.

Configuration steps are below:

 

- install a CR1220 battery on PiFace Real Time Clock

- install PiFace Real Time Clock module on RaspberryPi

- sudo raspi-config -> select "Advanced Options" -> Enable I2C -> Finish -> Reboot

- cd /home/pi

- with RaspberryPi connected to internet -> wget https://raw.github.com/piface/PiFaceRealTimeClock/master/installpifacerealtimeclock.sh

- chmod +x installpifacerealtimeclock.sh

- sudo ./installpifacerealtimeclock.sh

- sudo reboot

- set current date/time -> sudo date -s "1 MAY 2015 10:10:30"

- with the RaspberryPi disconnected from the network (to avoid NTP updating the time), leave the RaspberryPi powered off for a couple of minutes and then turn it back on. Now check if the date and time are correct; in my case they were

That's it, I now have a running onboard RTC.
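One extra check that can be useful (assuming the install script has registered the clock with the kernel as an RTC device, which is worth verifying on your own setup) is to read the hardware clock directly and compare it with the system time:

- sudo hwclock -r   (read the time straight from the RTC)

- date   (compare with the system time)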

One thing I don't like about this board is the way it is attached to the RaspberryPi. Sometimes the RaspberryPi pins did not make good contact with the metallized holes of the RTC board, and then the date command returned a wrong time. Bending the RaspberryPi pins here and there improved the connection, but I believe I'll have to change the way the boards are connected.

 

 

More about PiFaceRTC can be found on the following links:

http://www.element14.com/community/docs/DOC-68907/l/shim-rtc-realtime-clock-accessory-board-for-raspberry-pi#documents

http://www.piface.org.uk/assets/piface_clock/PiFaceClockguide.pdf

 

 

 

 

Input and Display

 

 

One of the important ways I send commands to and receive feedback from the RPi is the PiFace Control & Display module.

Installation and configuration were straightforward using the official instructions, no tricks involved.

I planned from the beginning to use Python for this project, and PiFaceCAD has support for both Python 2.7 and Python 3.

 

 

For testing purposes I performed following steps:

- mounted PiFaceCAD on RPi, as described in instructions

- enable SPI port -> sudo raspi-config -> select "Advanced Options" -> Enable SPI -> Finish -> Reboot

- sudo apt-get install python-pifacecad

- run test application -> python /usr/share/doc/python-pifacecad/examples/sysinfo.py -> Success. Information about IP, temperature and memory load are shown.

 

 

PiFaceCAD can be controlled with an infrared remote controller, but I will skip this step for now, as I don't need it for the moment.

 

 

After successful installation and configuration of the PiFaceCAD module, I started to write the code for the control menu. I hope to have it ready in the next couple of weeks.
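As a starting point for that menu code, a minimal sketch using the python-pifacecad library looks something like this (the message text is just a placeholder, and the actual menu logic is still to be written):

import pifacecad

cad = pifacecad.PiFaceCAD()        # open the connection to the board
cad.lcd.backlight_on()
cad.lcd.clear()
cad.lcd.write("PiFaceCAD ready")   # placeholder text on the 16x2 LCD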

So far I have no cons about this module; everything worked fine from the beginning, although it feels a little bulky as it is and will most likely undergo some surgery on the display side to lower its profile.

 

 

More about PiFaceCAD can be found at the following link:

http://www.element14.com/community/docs/DOC-68972?ICID=piFacedigitalmain-techspecs

 

 

Well, so far I have RaspberryPi_V2 + RTC + PiFaceCAD working well together. In the next post I'll add more components to this core.

 

 

May the Force be with you

Application Information
http://www.element14.com/community/community/design-challenges/sci-fi-your-pi/blog/2015/04/22/some-information-from-my-application

ChipKit Pi Vs Arduino Pro Mini
http://www.element14.com/community/community/design-challenges/sci-fi-your-pi/blog/2015/05/01/quick-update-on-the-quadcop-and-the-chipkit-pi

Quadcopter Assembled (You call that a Quadcopter?)
http://www.element14.com/community/community/design-challenges/sci-fi-your-pi/blog/2015/05/06/quadcopter-assembled

QuadCop -The Control Switch
http://www.element14.com/community/videos/16202/l/control-switch-explanation

Quad Cop with ChipKit Pi - An "Experience" with Innovation Required

http://www.element14.com/community/community/design-challenges/sci-fi-your-pi/blog/2015/05/07/quad-cop-with-chipkit-pi--an-experience-with-innovation

 

I rigged up my Raspberry Pi to my quadcopter and wrote a quick script so I can push a button to start and stop the camera. The camera has an LED on it, so it's easy to tell when it is recording.

 

I haven't flown anything RC for about 6 months, just coming out of winter, so it's not the most glamorous flight, but I wanted to see how well the camera works. There was a 30 mph gusting wind, and it's been like that all week, so the wind was a major problem during this flight. It did, however, make me realize I may need a few adjustments to my control protocol.

 

There are some skips in the video; I am not sure if that happened during recording or during the conversion process. I'll figure that out later.

 

The camera output is in raw H264 format, and you can use MP4Box (installable via apt-get) to convert it to mp4 for playback on a Windows machine and uploading to YouTube. I did the conversion on the Raspberry Pi itself.
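
For reference, the conversion itself is only a couple of commands on the Pi; MP4Box is part of the gpac package, and the filename below is just an example matching the script's naming scheme.

sudo apt-get install gpac
MP4Box -add 20150507-153000.h264 20150507-153000.mp4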

 

 

Oh, BTW, I got the GPS working without the main board too, which will be nice since I will have the ChipKit Pi installed. I would still like the main board for the accelerometer, though. Pics and vid below!

 

Edit: For clarification, I am flying this quadcopter manually; it is not flying autonomously. The autopilot is still under development. I just wanted to try out the quadcopter and the camera to ensure everything is working.

 

Here is the script I used to turn on the camera. Right now it is connected to a button, but going forward it will be connected to the RPi2 via GPIO. The RPi2 will set a pin HIGH, and that will turn the camera on; then it goes back to LOW. Setting it HIGH again will turn the camera off. The output filename is set with the date and time so each capture goes into its own file.
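
For the RPi2 side of that handshake, a short sketch of the trigger pulse could look like the following; the pin number and pulse length are assumptions, not the final wiring.

import time
import RPi.GPIO as GPIO

TRIGGER_PIN = 23  # assumption: whichever RPi2 pin ends up wired to the camera Pi's input

GPIO.setmode(GPIO.BCM)
GPIO.setup(TRIGGER_PIN, GPIO.OUT, initial=GPIO.LOW)

def toggle_camera():
    # A short HIGH pulse; each pulse toggles recording on the camera Pi.
    GPIO.output(TRIGGER_PIN, GPIO.HIGH)
    time.sleep(0.2)
    GPIO.output(TRIGGER_PIN, GPIO.LOW)

toggle_camera()   # start recording
time.sleep(10)    # record for ten seconds
toggle_camera()   # stop recording
GPIO.cleanup()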

 

 

import RPi.GPIO as GPIO
import os
import time

GPIO.setwarnings(False)
GPIO.setmode(GPIO.BCM)
# The push button (later the RPi2 trigger line) is wired to GPIO 18.
GPIO.setup(18, GPIO.IN, pull_up_down = GPIO.PUD_DOWN)
while True:
        # First rising edge: start recording to a time-stamped file.
        GPIO.wait_for_edge(18, GPIO.RISING)
        print("RecordingVideo")
        filename1 = time.strftime("%Y%m%d-%H%M%S") + '.h264'
        os.system('raspivid -t 99999999 -o /home/pi/vids/' + filename1 + ' &')
        time.sleep(1)
        # Second rising edge: stop recording by killing raspivid.
        GPIO.wait_for_edge(18, GPIO.RISING)
        print("Stopped")
        os.system('pkill raspivid')
        time.sleep(1)
GPIO.cleanup()




KIMG0161_s.jpg

KIMG0167.jpg

You might want to have a puke bag ready. The flight starts at around the 30-second mark in the video.

 

Previously:

Sci Fi Your Pi - Prince Dakkar's patent log taking chart compass

 

 

 

To realise my initial ideas for the project I have started work on designing the functionality of the device.

 

This really divides into three main function sets: indicating the user's current position, setting the route to be navigated, and indicating the route and direction of travel.

 

Indicating current position

The device will indicate the user's position on a map. The position will be indicated by the intersection of two bars that will move over the inbuilt map (probably with some sort of elaborate decoration at the intersection).

 

To make this happen, GPS data will be taken from the GPS module and then converted to set the correct distance to move across the map. Each bar will move independently, with a chain or belt at one end and some form of free-running support at the other. There will be one bar each for longitude and latitude, each having its position controlled by the Raspberry Pi driving stepper motors or servos.

 

The main challenge is in ensuring the position shown on the map is the correct position. In my initial design I was planning to use a flat, stylised map to fit in with the steampunk style. These maps are often non-linear in their projection, meaning that the bars would have to move different distances depending on the position on the Earth. They are also sometimes interrupted, which would add a further layer of complexity in managing the physical position of the bars to indicate the actual position on the map.

 

Another solution is to use a linear projection, which means the movement of the bars will translate directly into changes in longitude/latitude. These maps may be less pretty but could make implementation easier.

 

It would be possible to use a non-linearly projected map, but to make the position accurate I will need to 'do some maths' to convert the changes in GPS position into changes in position on the map. For this to work the map will need to use a known projection, and I will need to understand the conversion factor involved and how it changes as the position moves away from the equator.
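
To make the simplest case concrete, here is a rough sketch of the conversion for a linear (equirectangular) map, where latitude and longitude map directly onto bar travel; the map edges and travel distances are made-up values, and real maths would be needed for other projections.

def gps_to_bar_positions(lat, lon,
                         map_north=60.0, map_south=35.0,   # map edges in degrees (assumed)
                         map_west=-11.0, map_east=30.0,
                         travel_x_mm=400.0, travel_y_mm=300.0):
    """Convert a GPS fix into how far each bar should travel from the map origin."""
    x = (lon - map_west) / (map_east - map_west) * travel_x_mm
    y = (map_north - lat) / (map_north - map_south) * travel_y_mm
    return x, y

print(gps_to_bar_positions(50.85, 4.35))  # e.g. Brussels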

 

It could be possible to manage this by using a globe rather than a flat map, but that could prove rather difficult to use for actual navigation. This may not be a problem, as it could be designed to be mounted permanently on an adventurer's craft rather than something that is carried around. This is not how I originally planned the device, but it could look really interesting implemented this way.

 

Once this is overcome it is then a relatively simple task to get the bars to intersect at that position by moving them a set distance from the initial reference point of the map.

 

If map projection is of interest to you then the below video has a comprehensive explanation of the idea:

 

 

 

Route Setting

Knowing one's present position is only part of the challenge. It is also important to know how to get to the next location for your quest or adventure. The first step of this process is creating the ability to set the location of the destination (and the current location).

 

To fit with the overall style this needs to be suitably dramatic, so it will not have a screen or keyboard input to allow a postcode or address to be entered. Most of the adventures seen in the books and films of this genre have a limited number of destinations (the centre of the Earth could be a bit difficult to map, but most others should be OK). So at the moment I am thinking of selecting the destination from a list (using some form of scrolling wheel?).

 

For the start location I could either use the current GPS position or add to the theatrical nature of the device by having that selected in a similar manner as well.

 

To input the information into the device I intend either to connect different pins for each destination (with the other pin requirements this could leave a very small list of destinations) and use the Raspberry Pi to store the information, or simply to use the destination and current-location selections to connect different circuits to select the routes. For this second option there will need to be some way of passing the destination to the Raspberry Pi to enable some of the functionality described below.

 

Indicating the Route and Direction of Travel

Once the destination (and current location) has been selected, the gentleman (or lady) adventurer now needs to know which way to go to get there. I am intending to have two parts to this: an overall route indicator and a direction-of-travel arrow.

 

The overall route indicator will most likely be a string of LEDs from the current location to the destination. This could be managed by using a set number of strings between the set destinations and locations, or it could use a matrix of LEDs to allow more flexibility (it partly depends on how location/destination is selected). This could either be controlled using the Raspberry Pi or use the selections made for location/destination to connect different circuits of LEDs to create the route. The effect is intended to be similar to the cinematic "travel by map" sequences satirised in the Muppets clip below. It will probably be yellow dots rather than a line, but having them flash in sequence is the sort of thing I have in mind.
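
The flashing-in-sequence effect itself is easy to sketch with RPi.GPIO, assuming one GPIO pin per dot along the route; the pin numbers below are placeholders.

import time
import RPi.GPIO as GPIO

ROUTE_PINS = [5, 6, 13, 19, 26]   # placeholder: one LED per dot along the route

GPIO.setmode(GPIO.BCM)
GPIO.setup(ROUTE_PINS, GPIO.OUT, initial=GPIO.LOW)

try:
    while True:
        # Light the dots one after another, "travel by map" style, then clear and repeat.
        for pin in ROUTE_PINS:
            GPIO.output(pin, GPIO.HIGH)
            time.sleep(0.3)
        time.sleep(1.0)
        GPIO.output(ROUTE_PINS, GPIO.LOW)
except KeyboardInterrupt:
    GPIO.cleanup()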

 

 

 

 

The direction of travel arrow will indicate which direction to move from the current location to reach the target destination. The intention is to use the current GPS position and destination GPS co-ordinates to show which way to move. This is intended to be a group of LEDs under a decorative compass style cutout. The Raspberry Pi will compare the co-ordinates and then light the correct LED (or LEDs) to indicate in which direction the required destination lies.

 

compass.jpg


I would like to have the main cardinal points (N, E, S, W) to start with and add in the ordinals (NE, NW, SE, SW) if possible. This will indicate a direction "as the crow (or airship) flies", which is entirely appropriate for an adventurer who will not be constrained by such boring constructs as roads or shipping lanes.
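
As a sketch of how the Pi could choose which LED to light, the standard initial-bearing formula can be rounded to the nearest of the eight compass points; the function names here are mine.

import math

DIRECTIONS = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]

def bearing(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from the current position to the destination, in degrees."""
    lat1, lat2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(lat2)
    y = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(dlon)
    return (math.degrees(math.atan2(x, y)) + 360) % 360

def direction_led(lat1, lon1, lat2, lon2):
    """Return which of the eight compass LEDs to light."""
    index = int((bearing(lat1, lon1, lat2, lon2) + 22.5) // 45) % 8
    return DIRECTIONS[index]

print(direction_led(51.5, -0.1, 48.9, 2.4))  # London to Paris, roughly SE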

 

I intend to test each of these as a separate function, then combine them all and construct a suitable enclosure. I also intend to add extra parts to enhance the theatrical nature of the device. These will most likely take the form of large handles or cog wheels that will initiate functions or control the way the device works.

 

 


dmrobotix

PizzaPi: Quick Update

Posted by dmrobotix May 10, 2015

Hello, everyone!

 

Sorry I have not posted much recently. I would be remiss in my contestant responsibilities if I did not keep up with my updates and give a quick heads up on how things are going with PizzaPi!

 

Last week I started my 10-week full-time internship at the lab I work at, and it's been a bit of a challenge getting used to working all those hours! I'm actually spreading these 10 weeks over a 20-week period, so in a few weeks I'll be off again and I'll be more active around here.

 

Today I tried to rebuild my setup and make sure everything is still in working order. All the parts are working, so that's definitely good. I installed a web server on the B+ and I'm investigating the use of MQTT to handle the transfer of sensor data to the other Raspberry Pi. It's touted as the protocol for the Internet of Things. I've never used it before, but I'm going to use it for this project.
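
For anyone curious, publishing a reading over MQTT from Python takes only a few lines with the paho-mqtt client; the broker address and topic below are placeholders, not my actual setup.

import json
import time
import paho.mqtt.client as mqtt

BROKER = "192.168.1.50"          # placeholder: the Pi running the MQTT broker
TOPIC = "pizzapi/sensors/temp"   # placeholder topic name

client = mqtt.Client()
client.connect(BROKER, 1883, 60)
client.loop_start()

# Publish a dummy reading once per second; a real sensor read would go here.
while True:
    payload = json.dumps({"temperature": 21.5, "timestamp": time.time()})
    client.publish(TOPIC, payload)
    time.sleep(1)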

 

Here's some links for more background:

MQTT 101 - How to get started with the lightweight IoT protocol

An Open Source MQTT v3.1 Broker

 

I'll write more about how I set it up between the two Pis later. Sorry I've been silent all week! I'm still in the game, just trying to do a good job at work, too! Oh, and what I do at the lab is mostly computer science. I'm developing a web database to house information and analysis of samples. I'd like to move more into the robotics side of the lab, but I have to do some intense schmoozing this summer to make that happen!

 

More later!

Hello all! I've been busy getting everything connected and initialized for the build. I've successfully connected the Gertbot, MEMS board and Pi camera, and got them communicating through a PiRack on the B+ (as the MEMS refuses to work with the Pi 2). I have noticed that whenever the Gertbot is driving my test motor, the MEMS readings really wig out. Ultimately I think I will have both Pis running in the project - a navigator and a driver. I'll have the B+ connected to the MEMS board, Pi camera and GPS module (the latter of which is still due to arrive) to work on navigation, while the 2 will be running the motors and mechanicals. I haven't figured out how I want to approach servo control yet - maybe the ChipKit? Anyway, here's a picture of my little leaning tower of testing. Ultimately I'll mount everything securely and use ribbon cables to connect.

20150510_111457.jpg

taodude

I Ching Blog Week 2

Posted by taodude May 9, 2015

I'm finally getting to grips with using Python and the PiFaceCAD and accessing the Pi using SSH/PuTTY, but I'm having less success writing blog posts with all the fancy formatting that others are using, probably because I am writing the blogs late at night as a summary of the previous day or so's activity.  Also, this blog engine has not been programmed to autocorrect my dyslexic fingers.

 

I'm on the road this weekend and most of next week, and I haven't quite got the project to the point where I can cart it around and remote into it from my laptop.  So that means spending a bit of time writing the Python 3 modules that will do all the good stuff I need the Pi to do when I get back home.

 

Here's a summary of what has happened so far.

  1. The Python 3 code for modelling the splitting of the yarrow stalks has been written, but is not yet on GitHub.  This module is called to generate the quantum state of each hexagram line, so six invocations are required.
  2. I can use SSH and PuTTY to access the Pi without needing a monitor.  I don't need X windows, because I can do everything I need using the command line or the Python interpreter, and the output appears on the PiFaceCAD LCD display.
  3. The Python code for presenting the initial hexagram output to the PiFaceCAD is on GitHub, but I have not managed to work out how to share it, either from GitHub or through this blog.  Dropping Python 3 code into this blogging interface has some interesting effects, and the result is largely unreadable.  I also need to work out how to reduce the output to fit a 16x2 display format (currently I need it to be 17x2); a small sketch of the idea follows this list.
  4. The IR remote from my Hauppauge PVR card can now be read by the IR port on the PiFaceCAD, so I am not restricted to the PiFaceCAD switches, although these will be the primary input owing to the massive variety of remotes that the LIRC database has records for.
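
On point 3, a minimal sketch of squeezing a line of output into the 16x2 display, assuming I simply split it over the two rows and drop anything beyond 32 characters:

def fit_16x2(text):
    """Split a string over two 16-character LCD rows, dropping any overflow."""
    text = text.ljust(32)[:32]
    return text[:16], text[16:]

top, bottom = fit_16x2("Hexagram 63: After Completion")
print(top)
print(bottom)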

 

Key points for this week are:

  1. Menu structure for the PiFaceCAD LCD screen.
  2. Using interrupts to trigger transition between menus when PiFaceCAD  switches are pressed.
  3. Investigate file sharing options from GitHub or other locations in the Cloud.

 

No pix this time, sorry.

After probably having broken my Pi camera last week, I tried to revive it, but to no avail. At this point it's probably safe to call it a lost cause. I wasn't too worried about it: I have two USB webcams lying around, and I figured I might be able to just use one of those instead.

In fact, these USB webcams have a built-in microphone. In absence of the Cirrus Logic audio card, that would really come in handy!

 

2015-05-08 23.25.44.jpg

Meet Laurel and Hardy

 

The left one is a Microsoft LifeCam VX-3000, the right one is an MSI StarCam 370i. They might look like a wacky duo but both have served me well and are fully functional webcams. That being said, I can't seem to make them work on the Pi.

They are both correctly detected when I connect them.

Here's dmesg and lsusb for the MSI camera:

 

[  252.049044] usb 1-1.2: new full-speed USB device number 6 using dwc_otg

[  252.151553] usb 1-1.2: New USB device found, idVendor=0c45, idProduct=60fc

[  252.151579] usb 1-1.2: New USB device strings: Mfr=0, Product=1, SerialNumber=0

[  252.151597] usb 1-1.2: Product: USB camera

[  252.205066] gspca_main: v2.14.0 registered

[  252.211258] gspca_main: sonixj-2.14.0 probing 0c45:60fc

[  252.214266] input: sonixj as /devices/platform/bcm2708_usb/usb1/1-1/1-1.2/input/input2

[  252.215734] usbcore: registered new interface driver sonixj

[  252.302998] usbcore: registered new interface driver snd-usb-audio

pi@raspberrypi ~ $ lsusb

Bus 001 Device 006: ID 0c45:60fc Microdia PC Camera with Mic (SN9C105)


And Microsoft:

[  417.699011] usb 1-1.2: new full-speed USB device number 7 using dwc_otg

[  417.801644] usb 1-1.2: New USB device found, idVendor=045e, idProduct=00f5

[  417.801672] usb 1-1.2: New USB device strings: Mfr=0, Product=1, SerialNumber=0

[  417.801690] usb 1-1.2: Product: USB camera

[  417.803249] gspca_main: sonixj-2.14.0 probing 045e:00f5

[  417.806077] input: sonixj as /devices/platform/bcm2708_usb/usb1/1-1/1-1.2/input/input3

pi@raspberrypi ~ $ lsusb

Bus 001 Device 007: ID 045e:00f5 Microsoft Corp. LifeCam VX-3000

 

The drivers appear to be correctly loaded and Iceweasel shows that both the camera and the built-in microphone are available.

camerasharingprompt.png

why yes, I would like that

 

As soon as I try to actually use either camera though, no video or audio comes through and I see hundreds of these errors in dmesg:

[12380.380712] gspca_main: ISOC data error: [8] len=0, status=-71

[12380.380721] gspca_main: ISOC data error: [9] len=0, status=-71

[12380.380730] gspca_main: ISOC data error: [10] len=0, status=-71

[12380.380738] gspca_main: ISOC data error: [11] len=0, status=-71

[12380.380747] gspca_main: ISOC data error: [12] len=0, status=-71

 

...and these:

[12452.042127] sonixj 1-1.2:1.0: URB error -71, resubmittingsonixj 1-1.2:1.0: URB error -71, resubmitting

[12452.170150] sonixj 1-1.2:1.0: URB error -71, resubmittingsonixj 1-1.2:1.0: URB error -71, resubmitting

[12452.298154] sonixj 1-1.2:1.0: URB error -71, resubmittingsonixj 1-1.2:1.0: URB error -71, resubmitting

[12452.426173] sonixj 1-1.2:1.0: URB error -71, resubmittingsonixj 1-1.2:1.0: URB error -71, resubmitting

[12452.554196] sonixj 1-1.2:1.0: URB error -71, resubmittingsonixj 1-1.2:1.0: URB error -71, resubmitting

[12452.682204] sonixj 1-1.2:1.0: URB error -71, resubmittingsonixj 1-1.2:1.0: URB error -71, resubmitting

[12452.810219] sonixj 1-1.2:1.0: URB error -71, resubmittingsonixj 1-1.2:1.0: URB error -71, resubmitting

[12452.938235] sonixj 1-1.2:1.0: URB error -71, resubmittingsonixj 1-1.2:1.0: URB error -71, resubmitting

 

Searching the web for what that error -71 means led me to this Debian bug report. The issue appears to be very generic, and I don't think I could get any webcam to work on Raspbian right now. It's really too bad, but for the time being I'm going to give up on getting a camera to work with the Raspberry Pi.

The software solution of creating a bridge in the main device, able to manage the internal network and the external WiFi connection, convinced me that this was probably the best way to deliver all the needed features:

 

  1. All the Meditech RPIs can be accessed by logging in to the Master RPI via SSH or from the graphical desktop
  2. The internal devices can access the Master RPI via the internal LAN, sending their data to the storage database (MySQL based) via http / https and PHP
  3. The internal devices, if needed, can access the external network via the routed WiFi on the Master RPI
  4. Only one unit physically accesses the Internet
  5. The system is open to more advanced security protocols, e.g. a proxy, not implemented at the moment
  6. Every wired-Ethernet-connected RPI device can act as an independent unit
  7. The Meditech unit connects to the Internet via the Display Unit, which also acts as a mobile access point
  8. Some new feature will be ...

 

The inspiring source for this solution, after many tests and after discarding other more complex and less performant variants, came from an article on the hackhappy.org site: http://hackhappy.org/uncategorized/how-to-use-a-raspberry-pi-to-create-a-wireless-to-wired-network-bridge/. Seen in detail, the procedure is fairly simple and is covered in a few steps:

 

1. Install the needed components

 

apt-get -y install isc-dhcp-server iptables






In this case the DHCP server is not an essential element, as all the connected RPIs have static IP addresses, but it will be useful for further (possible) external units connected to the system.

 

2. Set the network interfaces configuration for nat

 

This part of the script remained untouched: edit the /etc/network/interfaces

 

auto lo eth0
  iface lo inet loopback
  iface [Device] inet static
  address [IP]
  netmask [Netmask]

  auto [Device]
  iface [Device] inet dhcp
  wpa-ssid "[SSID]"
  wpa-psk "[Password]"

  up iptables-restore < /etc/iptables.ipv4.nat




Note that the second interface is set in DHCP mode; it is the WiFi interface (usually wlan0) that will connect to the access point for the Internet connection. For an elementary level of connection security, the Display Unit of Meditech (aka the tablet) will be connected via tethering to the rest of the network, with a fixed, pre-defined AP name stored in the RPI settings. This guarantees that only that particular device, set up as a WiFi AP, can connect to the rest of the network.


3. Set the DHCP configuration

 

option domain-name "[Domain]";
  option domain-name-servers 8.8.8.8, 8.8.4.4;
  subnet [Subnet] netmask [Netmask] {
  range [IP Range Start] [IP Range End];
  option routers [IP];
  }



The only DHCP scope set up is for external access, while the internal Ethernet settings (on eth0) use static IP addresses.


After these settings you should execute the following commands (as root, so use sudo)


echo "INTERFACES=\"eth0\"" > /etc/default/isc-dhcp-server
 service isc-dhcp-server restart
 update-rc.d isc-dhcp-server enable
 echo "net.ipv4.ip_forward=1" >> /etc/sysctl.conf
 echo "1" > /proc/sys/net/ipv4/ip_forward
 iptables -t nat -A POSTROUTING -o [wlan0] -j MASQUERADE
 iptables -A FORWARD -i [wlan0] -o [eth0] -m state --state RELATED,ESTABLISHED -j ACCEPT
 iptables -A FORWARD -i [eth0] -o [wlan0] -j ACCEPT
 iptables-save > /etc/iptables.ipv4.nat
 /etc/init.d/networking restart


For those interested, the complete parametrised command set from the original article is attached to this post.

 



The reason for this documentation note is to underline the elements that should be taken into account in the project with respect to certain connection strategies, i.e. the adoption of Bluetooth for printing and (where possible) for some probes, and WiFi for connecting the diagnostic part with the display control.

 

I have tried to keep this document as short as possible, supporting the points with some attached PDFs and links to specialised sites. A general-interest document is Know your regulations before you design medical electronics (see the attached document).

 

The Bluetooth question

Bluetooth is well suited and, with a few precautions, is one of the better ways to connect small medical devices, especially probes, to the controlling parts. Its use is continuously growing and it is well accepted, as it involves very few risks for the patient: this last point deserves special attention when operating in conditions where the patient has an unknown anamnesis.

 

For a general look at the current state of the art, see the attached document Medical device connectivity from Wikipedia. In more detail, there is an interesting two-part article about the growing use of Bluetooth technology from the Bluetooth SIG (Special Interest Group), published by Bluetooth.com. Links: Part one & Part two (the documents are also attached to this post).

In the second part of the article mentioned above there is also a note about the problems related to the security of the communication, involving potential risks for patient privacy. The attached document Securing Legacy Mobile Medical Devices analyses this aspect in depth.

In Meditech I have treated this potential risk as something that must be taken into account across all the communication channels adopted in the device components.

 

BLE New Bluetooth technology

Although in some cases BLE (Bluetooth Low Energy) is not yet reliable and can't be applied because of its reduced speed, there are many cases where it can be the solution, as a genuinely good alternative to the traditional, more power-hungry Bluetooth 4.0. In the initial project definitions I considered adopting BLE, which I had already used in a health-and-fitness development project during the second half of 2014, as a real improvement option at least where it is possible. In fact it will not be present in the prototype because of the short deadline, but it will be part of the production version of the same project.

 

More about the Bluetooth adoption in the health and wellness medical electronic devices market can be found in this other article from Bluetooth.com

 

The WiFi question

Despite what is claimed on many social sites, WiFi wireless communication - again with some precautions - is one of the fastest growing technologies in medical facilities.

 

A good explanation of the low risk and of the advantages, pros and cons, can be found in the attached article Building a Reliable Wireless Medical Device Network.

 

Introduction

 

In the last post, I described the hardware part of our robot, which uses the Seeed Studio GrovePi+ since I am yet to receive my kit. In this post, I talk about the software part and making the robot move. Let's see what we can do.

 

Setting up the RPi

 

I have already mentioned that there are lots of posts on how to install Raspbian, and my tweets on the install are available at https://embeddedcode.wordpress.com/2013/07/10/the-geek-getting-started-guide-to-the-raspberry-pi/ . Let's move on to the more difficult stuff.

 

Python

 

Python is a general-purpose interpreted, interactive, object-oriented, high-level programming language that was created by Guido van Rossum in the late eighties. Python allows users to write code with clear, readable syntax, has a growing number of support libraries for various tasks, is open source and has a community based around its development. I did a related post on writing Python scripts at http://www.element14.com/community/groups/arduino/blog/2015/01/22/dynamic-living-room-lights-paho-for-python--writing-better-code

 

It usually comes preinstalled with Raspbian and you can verify it by typing

 

python --version

 

at the terminal.

 

Unlike my previous tutorials, we will work in the GUI and create a little GUI of our own. Let's set up remote access to the RPi's graphical desktop.

 

GUI over the local network.

 

VNC (Virtual Network Computing) is one way we can control and monitor a computer's desktop from another computer over a network, which in our case is going to be useful for wireless remote teleoperation of the robot and basic control of the Raspberry Pi. I am assuming that we have set up a static IP for the RPi and that it is connected to our local network. On the RPi we need to install tightvnc by running the command

 

sudo apt-get install tightvncserver




 

once it is done, we move on to starting a server by issuing the command

 

vncserver :1 -geometry 1280x800 -depth 16 -pixelformat rgb565




 

It should ask you to enter a password which will be used for remote access. If not then run the following command

 

vncpasswd




 

After the password is set, it's time to log in to the server from another computer.

 

On the Remote Machine

 

If you are running Windows or Linux, download the appropriate version of UltraVNC.

On Mac OS, use the built-in Screen Sharing app.

 

The next thing to do is connect to the RPi and start writing some python scripts.

 

Grove Pi+ or else

 

There are a number of ways you can control motors from an RPi, but I have chosen to use the Seeed Studio GrovePi+ for this particular bot. In order to use it, we need to download some ready-made scripts to test things out. Go to https://github.com/DexterInd/GrovePi and download the zip file.

 

There is a software folder which has not only Python but also Node.js, C and shell example code. You may choose to employ an Arduino, or even connect a motor driver directly, in which case you will have to write your own functions for movement. In another post I will be using the Gertboard to control some stepper/servo motors, but this time it's going to be the GrovePi+ and friends.

 

Getting started.

 

I am not providing a tutorial but instead a step by step description of what I did and usually do when I come across a new platform. This should help you understand things a bit better.

 

Enabling I2C and SPI on the RPI

 

By default, the I2C and SPI interfaces on the RPi are not enabled, so we need to make some changes.

First type the following command in your command prompt

 

sudo raspi-config




 

This will start the utility as shown below

606A2968-61C2-4C90-832E-784CE58DA90C.png

 

Go to Advanced Options -> I2C -> Yes

 

The screen will ask if you want the interface to be enabled :

Select “Yes”

Select “Ok”

 

The screen will ask if you want the module to be loaded by default : Select “Yes”

 

Repeat the same for the SPI module.

 

In addition to this, we need to edit the modules file. Execute the command

 

sudo nano /etc/modules




 

and it will open the modules file in the editor. Add the following lines to the end.

 

i2c-bcm2708
i2c-dev




 

Use Ctrl-X, Y and enter to save the file and exit.

 

Reboot. You should then have the modules enabled; to check, run the following command

 

lsmod | grep i2c_




 

This should list out i2c modules and the presence of i2c_bcm2708 in the list will indicate that all is as it should be.

 

Great, so now we have I2C and SPI all set up and we can move on to testing the motors.

 

I2C tests

 

For details on I2C refer to https://learn.sparkfun.com/tutorials/i2c. The first thing I need to do is test out the motors.

The motor driver in question looks something like the image shown below. Since we want to test the motors first, the Python script for that is as follows

 

 

#!/usr/bin/env python

# Description:    Grove Motor Drive via I2C
# Author :        Inderpreet Singh
import smbus
import time
import RPi.GPIO as GPIO


# Global Stuff Here


DRIVER_ADDR = 0x0f
MOTOR_SPEED_SET = 0x82
PWM_FREQUENCE_SET = 0x84
DIRECTION_SET = 0xaa
MOTOR_SET_A = 0xa1
MOTOR_SET_B = 0xa5
NOTHING = 0x01
ENABLE_STEPPER = 0x1a
UNENABLE_STEPPER = 0x1b
STEPERNU = 0x1c


# The Once Code Here
# Check the board revision to pick the right SMBus ID (rev 2/3 boards use bus 1)
rev = GPIO.RPI_REVISION
if rev == 2 or rev == 3:
    bus = smbus.SMBus(1)
else:
    bus = smbus.SMBus(0)


# Function Defs here
def MotorSpeedSetAB(SpeedA, SpeedB):
    bus.write_i2c_block_data(DRIVER_ADDR, MOTOR_SPEED_SET, [SpeedA, SpeedB])


def MotorPWMFrequencySet(Freq):
    bus.write_i2c_block_data(DRIVER_ADDR, PWM_FREQUENCE_SET, [Freq, NOTHING])


def MotorDirectionSet(Dir):
    bus.write_i2c_block_data(DRIVER_ADDR, DIRECTION_SET, [Dir,NOTHING])




# Things to do once: give the driver a moment, then set both motor speeds
time.sleep(1.0)
MotorSpeedSetAB(250, 250)
time.sleep(1)


# The Looping Code Here
try:
    while True:
        # Loop Things here  
        MotorDirectionSet(0b00001010)
        time.sleep(5)


        MotorDirectionSet(0b00000000)
        time.sleep(2)


        MotorDirectionSet(0b00000101)
        time.sleep(5)


        MotorDirectionSet(0b00000000)
        time.sleep(2)
except:
    print 'Something went wrong or you pressed Ctrl+C'


finally:
    print 'Cleaning up Things...'
    GPIO.cleanup()




 

This works for me and the motors move forward and then backwards as they should. I need a better battery though.

 

You can replace these functions with your own for speed and direction control.

I wanted to upload a video but have been unable to… yet.

 

Let's have some fun with this one...

 

 

A Little Tinkering

 

The world of windowing systems has shiny buttons and such, and you can use the mouse to interact with objects. We need a graphical user interface here as well, albeit a simple one. Hence we start with Tkinter.

 

Tkinter is the standard GUI library for Python. Python when combined with Tkinter provides a fast and easy way to create GUI applications.

 

This link is a lot of help. http://www.tutorialspoint.com/python/python_gui_programming.htm

 

I created a new script as follows:

 

# Description:    Grove Motor Drive via I2C
# Author :        Inderpreet Singh
import smbus
import time
import RPi.GPIO as GPIO
from Tkinter import *

# Global Stuff Here
DRIVER_ADDR = 0x0f
MOTOR_SPEED_SET = 0x82
PWM_FREQUENCE_SET = 0x84
DIRECTION_SET = 0xaa
MOTOR_SET_A = 0xa1
MOTOR_SET_B = 0xa5
NOTHING = 0x01
ENABLE_STEPPER = 0x1a
UNENABLE_STEPPER = 0x1b
STEPERNU = 0x1c


# The Once Code Here
# Check the board revision to pick the right SMBus ID (rev 2/3 boards use bus 1)
rev = GPIO.RPI_REVISION
if rev == 2 or rev == 3:
    bus = smbus.SMBus(1)
else:
    bus = smbus.SMBus(0)


# Function Defs here
def MotorSpeedSetAB(SpeedA, SpeedB):
    bus.write_i2c_block_data(DRIVER_ADDR, MOTOR_SPEED_SET, [SpeedA, SpeedB])


def MotorPWMFrequencySet(Freq):
    bus.write_i2c_block_data(DRIVER_ADDR, PWM_FREQUENCE_SET, [Freq, NOTHING])


def MotorDirectionSet(Dir):
    bus.write_i2c_block_data(DRIVER_ADDR, DIRECTION_SET, [Dir,NOTHING])


def quit():
        global GUI
        GUI.destroy()


def Forward():
        MotorSpeedSetAB(250, 250)
        MotorDirectionSet(0b00000101)


def Backward():
        MotorSpeedSetAB(250, 250)
        MotorDirectionSet(0b00001010)


def Right():
        MotorSpeedSetAB(250, 250)
        MotorDirectionSet(0b00000110)


def Left():
        MotorSpeedSetAB(250, 250)
        MotorDirectionSet(0b00001001)


def Stop():
        MotorSpeedSetAB(0, 0)
        MotorDirectionSet(0b00000000)


# Things to do once: set the motor speeds, then build the Tkinter window
time.sleep(1.0)
MotorSpeedSetAB(250, 250)
time.sleep(1)
GUI = Tk()
GUI.geometry("250x250")
GUI.title("Robot Control")


B1=Button(text="Forward", command=Forward)
B2=Button(text="Backward", command=Backward)
B3=Button(text="Right", command=Right)
B4=Button(text="Left", command=Left)
B5=Button(text="Stop", command=Stop)


B1.grid(row=0, column=1)
B4.grid(row=1, column=0)
B5.grid(row=1, column=1)
B3.grid(row=1, column=2)
B2.grid(row=2, column=1)


mainloop()




 

This code creates a window as shown below and allows easy control of our robot using a simple GUI

 

Screen Shot 2015-05-08 at 6.21.31 pm.png

 

 

So now we have a little robot control and we should be able to use VNC to connect to it remotely.

 

More on this next time… Stay tuned

augusto.diniz.l

C-3p1 - update

Posted by augusto.diniz.l May 7, 2015

So, since my kit will take a month to arrive, I'm dealing with the things that I can...

 

Shell - well, since I can't import one from anywhere (as much as I was tempted to, taxes here are a robbery - around 100%), I have a buddy here in Brazil who makes these helmets in fiberglass, and I'm getting one from him.

helmet10380050_605498039546791_1501166140017548346_o.jpg

Threepio sounds: gathering them from my personal collection and "cleaning" them with Adobe Audition...

In this first stage I will just add Threepio voices in English and Portuguese; later I will add more voices.

 

Software - since most of it can be gathered from the web, I'm focusing on a tracking system that will be Arduino-based (sound tracking). I'm still waiting for the servos (I fried the ones I had)...

 

so, for now, that's it

 

wish me luck

trenchleton

Kit Content

Posted by trenchleton May 7, 2015

Just wondering - is anyone else still waiting on part of the kit? I haven't received the RPi A+, audio card, Microstack GPS, or Microstack baseboard. Did the kit get changed, does anyone know? Beggars can't be choosers - I was just hoping to utilize a couple of those other parts.

Previous Posts

 

Application Information
http://www.element14.com/community/community/design-challenges/sci-fi-your-pi/blog/2015/04/22/some-information-from-my-application

ChipKit Pi Vs Arduino Pro Mini
http://www.element14.com/community/community/design-challenges/sci-fi-your-pi/blog/2015/05/01/quick-update-on-the-quadcop-and-the-chipkit-pi

Quadcopter Assembled (You call that a Quadcopter?)
http://www.element14.com/community/community/design-challenges/sci-fi-your-pi/blog/2015/05/06/quadcopter-assembled

QuadCop -The Control Switch
http://www.element14.com/community/videos/16202/l/control-switch-explanation

 

CHIPKIT-PI-3-800x533.jpg

 

I continue to find issues with the ChipKit Pi and work around them as I can.  Here is a brief list, and then a more in-depth explanation for those who are interested.  My goal is to show our hosts the amount of innovation it takes to use the ChipKit Pi, and also to help others who may choose to use it.

 

1)  Digital pins 4, 5, 6, 7 and analog pins A3, A4 and A5 are not present.  While there are pinouts on the CKP for the digital pins, they are connected to nothing.  SDA and SCL are present, but not in the form of A4 and A5.

 

2)  There is a bug in I2C as a slave, where the callback routine only gets 1 byte at a time.  This means if I send 3 bytes in a block, the callback will be called 3 times instead of once.  This isn't a show stopper, but it requires one to create a block protocol similar to a serial communication.

See https://github.com/chipKIT32/chipKIT32-MAX/issues/310 for the bug submission

 

3)  If your code uses Serial.println statements, you need to change to Serial1.println to see the output on the pi.

 

4)  The Servo library doesn't work for me; however, the SoftwarePWMServo library does, and it works well.  It's very similar, just with different function calls.  It's not object-oriented though.

 

5) The serial pins for the Pi are covered up by the CKP.  There is a small header on  the top of the board next to the header that connects to the pi.  Some of the GPIO pins are brought here from the pi, but they are NOT in the expected location.  You need a pinout for this.   However the Serial pins are not broken out.  So to get around this I soldered wires to the serial pins so I can connect my GPS directly to the Pi.

 

6) Number 6 isn't found yet but I am sure it will be 

 

None of these bugs are show stoppers.  However, they require innovation to work around.  I need to solder on wires, develop a block protocol for I2c, and remap all my pins in software to work around the pins I used that are not present on the CKP.

Now for the more in depth explanations.


ChipKit Pi (CKP) notes

The CKP gives us a nice breakout for the PIC32MX250F128B in a format similar to the Arduino Uno. However, because the PIC32 included on the board does not have as many pins as the ATmega328P on an Uno, some of the pins are present on the breakout but not actually connected to anything!  This had me stumped for a bit until I found the pinout located here:

 

http://wiki.kewl.org/dokuwiki/boards:chipkitpi

 

Once again I assumed this breakout was 100% Arduino compatible, and I assumed wrong.  This meant I had to do some reading, and it serves me right for not doing so up front.  That's OK, I still really like it.

 

My original pinout mapping for the Arduino Pro Mini used most of the pins. The ChipKit Pi is missing digital pins 4-7 and A4, A5.  On the Arduino, A4 and A5 are used for I2C SDA and SCL respectively.  SDA and SCL are present on the breakout and can be used to bring the CKP onto the I2C bus.  It's unclear to me whether these will actually function as analog pins as well.  A3 is nowhere to be found.

 

In my original code for the Arduino, I needed 7 digital pins to read PWM, 4 digital pins to write PWM (software based), SDA, SCL, and 2 digital pins for signaling the RPI.

 

So 13 digital pins are needed in addition to the two for I2C.  However, the CKP only has 12 digital pins even if I count A0 and A1 as digital pins.  What to do?

 

Right next to the header where the Raspberry Pi connects there is a second header with the following pinout:
                    JP5
                +---J4---+
NC              01      02 GPIO0/GPIO2
GPIO1/GPIO3     03      04 NC
GPIO4           05      06 GND
GPIO18          07      08 GPIO17
GND             09      10 GPIO21/GPIO27
GPIO23          11      12 GPIO22
GPIO24          13      14 3V3
GND             15      16 GPIO10
GPIO25          17      18 GPIO9
GPIO8           19      20 GPIO11
GPIO7           21      22 GND
                +--------+

You can see some of the GPIO pins covered up by the Raspberry Pi are available for use.  I "assume", and there is that word again, that this is a straight passthrough from the RPi.

 

Here are the items that need a digital pin:
Radio Channel 1 - Forward/Reverse
Radio Channel 2 - Left/Right
Radio Channel 3 - Climb/Dive
Radio Channel 4 - Rotate left/Right
Radio Channel 5 - Auto or manual pilot mode
Radio Channel 6 - Macro mode start/stop
Radio Channel 7 - Perform sensor sweep indicator

 

Motor ESC* 1
Motor ESC* 2
Motor ESC* 3
Motor ESC* 4


Automode Indicator - Connection to RPI to turn it to automode
Macromode Indicator - connection to RPI to tell it to start/stop recording
Sensor Sweep Indicator - connection to RPI to tell it to note a sensor sweep at these coordinates

*ESC = electronic speed controller, which, given a PWM signal, sets the speed of a brushless motor.  Technically these pins actually connect to the flight controller, not the ESCs themselves.


One solution is to read one of channels 5-7 through one of the GPIO pins on the RPFS.  Given that Raspbian is not a real-time OS, this is generally not recommended.  However, channels 5-7 are simply "off/on", so they go from 1ms to 2ms and nothing in between.  I feel that the variance the RPi will have is acceptable and will be less than 100 microseconds anyway.  This would be an issue if I needed more precise readings, but on these pins I do not.  If I go this route I will take the reading 3 times in a row and use the average, in case there is some 1ms delay for some reason.
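
A sketch of what that GPIO reading could look like, timing the HIGH portion of the pulse with RPi.GPIO edge detection and averaging three readings as described above; the pin number is a placeholder.

import time
import RPi.GPIO as GPIO

CHANNEL_PIN = 20  # placeholder: GPIO pin wired to one of radio channels 5-7

GPIO.setmode(GPIO.BCM)
GPIO.setup(CHANNEL_PIN, GPIO.IN, pull_up_down=GPIO.PUD_DOWN)

def read_pulse_ms():
    """Measure one pulse width (HIGH time) in milliseconds."""
    GPIO.wait_for_edge(CHANNEL_PIN, GPIO.RISING)
    start = time.time()
    GPIO.wait_for_edge(CHANNEL_PIN, GPIO.FALLING)
    return (time.time() - start) * 1000.0

# Around 1 ms means "off", around 2 ms means "on"; average three readings to smooth jitter.
readings = [read_pulse_ms() for _ in range(3)]
print(sum(readings) / len(readings))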

 

Another option is, instead of using 3 digital pins to signal the RPI, to write commands through I2C.  I originally avoided this approach because my original design used the Arduino Minis: with all the reading and writing of PWM signals on top of some I2C work, I felt it was being overtaxed already.  The CKP has so much power, though, that I think it will be strong enough to handle more parsing.

 

With the bug in I2C, it's going to be more work to implement additional I2C commands.  In the GitHub bug submission at https://github.com/chipKIT32/chipKIT32-MAX/issues/310 there is a workaround, but it requires blocking in the ISR.  Given that I need to update servos and read PWM input, all ISRs must run fast.  It would be better to work around the bug using non-blocking code.

 

Blocking code = When an ISR is called interrupts are disabled, so no other ISR can run.  Servos are updated via an ISR so if an ISR takes too long, the servos will start to jitter.

Introduction

The networking part has two main roles:

 

  1. Internal communication within the Meditech device, between the specialized RPI units that are part of the system and, optionally, other RPIs connected as add-ons
  2. External wireless communication with the display unit, which is also the Internet gateway of the entire system when needed

 

Why other attractive alternatives to the LAN were avoided for the internal board connections

The main issue arose while discussing with some element14 community members the choice of adopting the wired Ethernet LAN as the standard method to connect the Meditech RPI units. In the discussion Epitaxial vs 1N4002 Diodes, clem57 suggested adopting the SPI protocol for the RPI internal connections as apparently more reliable. Frankly - as can be read in my responses to the discussion mentioned above - something sounded strange to me. But I am not a great expert in the use of the SPI protocol, so I had to read up on it. Until now I have used it intensively only to manage the communication between microcontrollers and chips (e.g. analog potentiometers, shift registers and so on).

 

An alternative, detailed point of view

Then I had the opportunity to discuss this aspect of the project in more depth with violet, who gave me a wider view of her personal take on the usage of the SPI protocol in this specific context: connecting and exchanging data with a variable number of separate Raspberry Pi devices doing specialised tasks.

 

What follows is just the essential part of her idea of how to connect the devices (note that the Pi roles are only an example to explain the design):

 

So Raspberry Pi #1 is responsible for all of the user interface. If you are going to use a screen, this would control it. It's the only raspberry Pi to connect to an ethernet (Once the other two have had all of their software installed, they shouldn't ever need any updates). It also controls the printer and takes input from any controls or buttons that are on the device.

 

RPI2 communicates with RPI1 through an SPI bus. because SPI is controlled through software like python scripts, it means that if RPI#1 sends a request, then your software gets to choose what to do with it. Because of this, you can make your program only handle whatever requests that you want and probably only if it's sent with a password. The harddrive likely contains sensitive patient information so having this layer is important because you can provide security with it. See it as a firewall.

 

RPI3 communicates with RPI2 through a separate SPI bus. It collects data then sends it to RPI#2. Any processing on the data should happen here; RPI#2 doesn't have a great amount of processing work to do, so this is the best one for processing. If RPI#2 is in storage mode then the data can stream onto the harddrive. If RPI#1 is connected and has a valid password entered, RPI#2 can also stream the data to RPI#1, which decides which of its outputs are going to receive it.

[...]

 

These considerations were accompanied by a clear schematic image, reported below.

 

diagram Luci.jpg

My point of view

My intuitive sense - backed, probably, by my decent knowledge of how networks work - that adopting the SPI protocol is not the correct solution was influenced by several aspects.

  • The fact that SPI is under software control, while the LAN after setup is managed by the system, is not a disadvantage; the LAN actually seems a cleaner, more linear way to develop the software. When the Pi devices are correctly set up they can communicate following the established rules, with some dynamic variable elements, and then this is a problem that can be forgotten. If something is not as expected, the LAN settings can be modified until the system interconnects exactly as needed.
  • As I have already worked fairly intensively with SPI when developing on-board components, I am partial to this kind of bus and have always liked chips that can be controlled over SPI. But nothing more: here we have different machines that should talk to each other.
  • Adopting the SPI bus - again staying with the intuitive view - seems to me like forcing the devices to do a task with the wrong tool. As a matter of fact, it is a Serial Peripheral Interface, probably slower and less performant in a multi-machine context: that is precisely the role of the network. It seemed like a sort of step back in technology.
  • Then I considered that using SPI, however reliable and effective it is between microcontrollers and chips, to manage bunches of data between computers while a 10/100 Ethernet port sits unused would sound really bad.

 

After the discussion mentioned above, I checked whether these ideas were confirmed by documentation more reliable than my personal view.

 

The Serial Peripheral Interface or SPI-bus is a simple 4-wire serial communications interface used by many microprocessor/microcontroller peripheral chips that enables the controllers and peripheral devices to communicate each other. Even though it is developed primarily for the communication between host processor and peripherals, a connection of two processors via SPI is just as well possible.

[...]

The SPI Bus is usually used only on the PCB. There are many facts, which prevent us from using it outside the PCB area. The SPI Bus was designed to transfer data between various IC chips, at very high speeds. Due to this high-speed aspect, the bus lines cannot be too long, because their reactance increases too much, and the Bus becomes unusable. However, its possible to use the SPI Bus outside the PCB at low speeds, but this is not quite practical.


This is what I have found on almost all sites with tutorials explaining how the SPI protocol works and how it should be adopted. The text above is from this tutorial that I attach in pdf to this post for your convenience.


The most important reason to adopt the Ethernet connection

The most important reason I finally decided to adopt the Ethernet connection for the internal data flow is that, as I mentioned in previous posts, Meditech is not a fixed system: it is conceived to work with a minimal set of probes organised across multiple Pi boards, and it must be able to accept other specific probes without anything having to be changed. The best solution I see for such cases is that every unit, when present, shares the same networked architecture.


Bridged networking

The network architecture has therefore been defined as follows:

 

  • The RPI Master hosts the bridge-utils Linux utilities, with the internal network assigned to the 192.168.5.0/24 range (the 192.168.5.x range was chosen because it is a private network range that commercial routers rarely use by default)
  • The RPI Master includes a DHCP server available for extra devices connected to Meditech in the future, for any reason. These can be connected to the pre-existing architecture without changing anything.
  • The RPI Master accesses the Internet via the Wi-Pi and acts as the master router of the Meditech internal wired Ethernet.
  • The RPI Master is also the LAN gateway.
  • All the internal RPIs have a static IP in the same 192.168.5.x range, pointing to the RPI Master as their gateway (see the example configuration after this list)
  • All the Meditech RPIs have SSH enabled.
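
As an illustration, the /etc/network/interfaces entry of one of the internal units could look roughly like this; the addresses are example values within the range described above, not the real assignments.

auto eth0
iface eth0 inet static
    address 192.168.5.10
    netmask 255.255.255.0
    gateway 192.168.5.1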

 

To make sure the architecture connects only to the desired access point, the AP connection information is defined statically in the network configuration files. This can be parametrised if a different WiFi AP connection approach is needed.

 

If you are interested, take a look at the next chronological post, where the command set used to enable the router-bridge on the RPI Master is explained.

Mobile unit for on-field interventions

tricorder-tsfs-3views.jpg

The medical tricorder Meditech is a modular, specialized version of the standard Starfleet tricorder (it will not be in service on starships before the 23rd Century).

It is equipped with sensors and analysis software tailored for medical diagnostic purposes. These sensors will collect, store and organise data to be used when assessing a patient's condition, especially in emergency and critical situations on Earth and on other exoplanets.

 

eFX-Tricorder small.jpg

The medical tricorders of future centuries will be more compact and better equipped than the Meditech unit under development in 2015, the current century. The Meditech tricorder will be a fully working one, with some limitations in size and features imposed by today's technology. The Meditech device is a portable unit of about 25 cm per side, controlled by a separate handheld used to display data and organise patient information. This information can be sent to a remote central unit to help a remote operator assist with the intervention. As of today, our technology has not made very-long-distance instant communication available, so in today's scenario we assume it will be impossible to use this model to send data to any orbiting starship, if any.

Instead this device is able to use the Earth internet network to connect the on-field operating unit with a remote specialised site, e.g. hospital, ambulatory, medical center and so on.

 

One of the features of Meditech (the model is much smaller than the heavy-duty versions available on the first starship vessels) is a slide-up camera probe for fast, high-resolution images and videos for any purpose. The same probe includes frame-comparison software to check for detail changes over short periods, e.g. catching the eye's reaction to a flash of light.

 

A series of standard medical probes, some dating from the past century but with improved functionality, can track most of the patient's important vital signs (heart rate, heart rhythm, body temperature, eye reactivity, glucose level, plus non-invasive probes such as echography, ECG and others). The signals coming from the probes are collected and organised, then sent to a local display unit to facilitate a first diagnosis if the operator is qualified to make one.

Alternatively, the visualised data can be sent wirelessly via 3G / 4G or other remote networking technologies to a specialised unit able to give assistance to the medical operator acting on-site.

Most of the signals can also be monitored continuously.

 

2qx6hyo.jpg

The display unit consists of a 6-9 inch handheld that also acts as the mobile access point to the Internet. Meditech is equipped with a small printing unit managed by both the main device and the display device, depending on the task at hand.

The display unit's graphical interface is designed to create the best possible user experience.

 

As there are many different situations where Meditech can be applied as a helpful diagnostic device, the main unit is completely modular: the system can host add-on probes to manage specific diagnostic information, and it is designed to be able to host at least three extra probes (fitted alternately) that will be designed and created in the future.

 

The Meditech mobile unit can be used everywhere including in closed areas and any Earth sanitary structure.

 

The remote assistance unit

 

startrek-tricorder-article-reuterse-500x283.jpg

The Meditech unit can be paired with a remote unit installed in a medical centre, or available to specialised medical personnel, who can support the remote operator in the patient diagnosis.

As a matter of fact, the remote assistance unit is a twin Meditech device that, depending on the conditions, can optionally host the probes too. The unit includes an HDMI 15 inch (or larger) flat screen, keyboard, mouse and audio.

The remote assistance unit instead excludes the display-unit handheld (replaced by the HID devices, i.e. keyboard, mouse and monitor).

 

So Meditech is convertible depending on its usage, and no special installation, change or upgrade of any kind is needed.

This feature gives the system great flexibility, so it can work in almost any environment, known or unknown. A further evolution of the current design is to set up a unit protected against damaging environmental factors, e.g. humidity, direct sun exposure, the unavailability of a stable AC recharging power supply, and more.

 

Note 1: Images are mostly from the Star Trek wiki Memory Alpha. The images shown are not only decorative; they are also the source of inspiration for the user interface design of the device.

 

Note 2: The Meditech device, codename tricorder, will be available as a first prototype by the end of August 2015. It's a pleasure to see an idea from the future made real today.

Hello all! I am behind everyone else I realize - I will do my best to make up for this in the coming weeks! I'm a full-time student going into finals, so I've been quite busy!

 

I also thought I'd share another side project which has robbed me of my Pi time. I'm involved in the Science Outreach Club at my college. We do hands-on activities for students in our (quite rural) area. It gives them the opportunity to experience STEM in a way they might otherwise not have been able to. I spearheaded an upcoming event in robotics, and have been diligently working to build a small fleet of 8 autonomous mobile robot kits. They use ServoCity's Sprout "Runt Rover" as a frame, as it allows them to be assembled without tools. I used PICAXE boards with L293D motor controllers to control the bots, and added SRF-04 ultrasonic sensors to allow them to sense and navigate. I've spent most of my free time this last week slaving over a soldering iron to that end.

 

On to the task at hand... I have set up my RPi and am getting acquainted with it - it's actually my first time using one, and it's been fascinating. I've gone through what I've received so far of the kit, anyway, and looked at what exactly I want to hook up. As the full details of my proposal are perhaps outside the scope of the timeframe and budget of this challenge (though I intend to continue working on it after the challenge has closed), I have felt that I need to prioritize aspects of it. Specifically, I'm going to work on mobility and sensing first. That being said, I will be using the GertBot for motor control, and the Microstack GPS, Xtrinsic sensor board and Pi camera module for basic sensing. I'll be interfacing all three with the RPi 2 via a PiFace PiRack GPIO extender. I'm currently enrolled in a computer vision course taught by Prof. Peter Corke, which I believe will give me a definite foot forward in terms of image acquisition and interpretation from the Pi camera. For the frame I am considering a ServoCity "Bogie" rover, as it uses the rocker-bogie suspension that I want to implement. We'll see - it may be too small for this application.

 

I really am looking forward to spending more time on this project, and I will be sure to keep everyone posted!

Previous Posts

 

Application Information
http://www.element14.com/community/community/design-challenges/sci-fi-your-pi/blog/2015/04/22/some-information-from-my-application

ChipKit Pi Vs Arduino Pro Mini
http://www.element14.com/community/community/design-challenges/sci-fi-your-pi/blog/2015/05/01/quick-update-on-the-quadcop-and-the-chipkit-pi

Quadcopter Assembled (You call that a Quadcopter?)
http://www.element14.com/community/community/design-challenges/sci-fi-your-pi/blog/2015/05/06/quadcopter-assembled

QuadCop -The Control Switch
http://www.element14.com/community/videos/16202/l/control-switch-explanation

 

 

The Control Switch

 

I've been using the ChipKit Pi and have really enjoyed learning its abilities.  I originally thought it was an Arduino clone of some sort, but I realized quickly this is not the case.  It does have some Arduino type libraries but it quickly diverges into its own paradigm.

 

Here is a video to show what I am doing followed by a more in depth explanation below.

In this video:

  • Demonstration of the control switch
  • QuadCopter Test platform
  • How to power Raspberry Pi with a 6.6V battery
  • Overall powering of all equipment
  • Test flight plan with Raspberry Pi model B+ and Camera
  • Rambling!

 

 

 

 

My original design called for Arduino Pro Minis; however, I have decided instead to use the ChipKit Pi to replace the functionality of the Minis.  One such function is the "Control Switch".

 

From my application:

"This is a custom configured Arduino Mini that is connected to both the Radio receiver (Rx) and The Rasberry Pie Flight System (RPFS).  It is used to generate PWM signals to the flight controller either by reading the RPFS or the Rx depending on if the flight mode is manual or auto.  It is responsible for switching signals when it detects mode changes between auto and manual.  All PWM input from the Rx and PWM outputs to the flight controller are done with the control switch.  Digital output is fed back to the RPFS to indicate the modes read off the Rx.  This relieves the RPFs from having to detect pulse widths from the Rx as well have having the Rx connected to two different systems."

 

 

So at a high level, I can control the QuadCop manually, and then tell it to start flying in auto mode. This means my manual control must be overridden. Further, I can take back control at any time. This requires both reading PWM signals and generating PWM signals. Both are done differently on the ChipKit Pi than on the Arduino. However, because the PIC32 is so much faster, the Interrupt Service Routines (ISRs) run faster and have less overhead. This means a smoother signal can be generated when producing multiple signals.

 

The Control Switch is reading 7 PWM Signals and writing 5 PWM signals to control the QuadCop when in manual flying mode.

 

There are several pieces of code out there that do the above, but I wanted to write my own from scratch using C.  I prefer to understand everything that is going on under the hood, with no black-box concepts.

 

When in autoflight mode, the RPFS takes over.  The RPFS will be described more later, but at a high level it is reading the GPS, and executing the waypoint macros.  It does this by sending a "control byte" to the control switch via I2C.  The control switch reads and parses the control byte out and then moves in the desired directions.

 

There are 8 movements a quadcopter can do:

  • Forward
  • Reverse
  • Left
  • Right
  • Climb
  • Dive
  • Rotate left
  • Rotate right

 

These motions can be represented in 1 byte, each bit representing a direction.  Of course we can't request both forwards and backwards motion at the same time, and such a condition is checked when validating the control byte that is received.  The RPFS sends a control byte, a register command, and a Control Byte Check (CBC) over the I2C bus.  The CBC is simply the remainder when dividing the control byte by 17 (control byte mod 17).
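To make the scheme concrete, here is a minimal Python sketch (not the actual QuadCOP code) of how the sending side might build such a message and the control switch side might check it; the bit assigned to each movement is my own assumption for illustration.

# Minimal sketch of the control byte scheme described above.
# The bit assigned to each movement is an assumption for illustration only.
FORWARD, REVERSE, LEFT, RIGHT, CLIMB, DIVE, ROT_LEFT, ROT_RIGHT = (1 << i for i in range(8))

# Movement pairs that can never be requested at the same time
CONFLICTS = [(FORWARD, REVERSE), (LEFT, RIGHT), (CLIMB, DIVE), (ROT_LEFT, ROT_RIGHT)]

def make_message(control_byte):
    """RPFS side: validate the requested movements and compute the CBC."""
    for a, b in CONFLICTS:
        if control_byte & a and control_byte & b:
            raise ValueError("conflicting movements requested")
    cbc = control_byte % 17          # Control Byte Check: remainder of division by 17
    return control_byte, cbc

def accept_message(control_byte, cbc):
    """Control switch side: accept the byte only if the CBC matches."""
    return control_byte % 17 == cbc

if __name__ == "__main__":
    byte, cbc = make_message(FORWARD | CLIMB)
    print(byte, cbc, accept_message(byte, cbc))   # 17 0 True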

 

One thing that is missing is speed information.  Per my initial design, speed will be ignored and the quad will move at a constant slow speed.  The RPFS has a command available to increase the speed in any direction, in case a strong wind prevents the QuadCop from moving in a certain direction at the set speed.

 

The control switch will be powered by the RPFS, which is in turn powered by a 5V linear regulator attached to a 6.6V LiFe pack commonly used in radio control applications.

Previous posts for this project:

 

 

Project Update

 

Had some fun with stepper motors today! I've been setting up the Gertbot and building a first prototype for the screen lift mechanism.


The Gertbot is really easy to use; it was up and running in a matter of minutes. To find out how, check out this post: Sci Fi Your Pi: PiDesk - Guide: Stepper Motors with Gertbot

 

The first prototype of the screen lift makes use of two stepper motors and threaded rods, as described in my application. The end result will be different in the sense that the mechanism should be completely hidden and only the screen should come out. The current prototype has threaded rods all the way to the top side of the screen, which should not be the case in the final mechanism. mcb1 suggested using a car window mechanism (thanks Mark!). I'll be running that track in parallel, figuring out where to get the parts first.

 

Another thing is that the current prototype is rather slow, requiring about 20 seconds to bring the screen up. An idea to increase the speed without having to change the electronics is to use threaded rods with a bigger pitch. The rods I'm using now have a 1mm pitch. I suppose that using rods with, let's say, a 1.5mm pitch would reduce the lift time by a third, since the travel per motor revolution scales with the pitch. Does that sound right?
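As a quick sanity check: lift time scales inversely with pitch. The snippet below uses assumed numbers (the travel distance and step rate are placeholders, not measurements), but the ratio between the two results only depends on the pitch.

# Rough lift-time estimate; travel distance and step frequency are assumed values
steps_per_rev = 200      # NEMA17 stepper
step_freq = 500.0        # steps per second (assumed)
travel = 120.0           # mm of screen travel (assumed)

def lift_time(pitch_mm):
    revolutions = travel / pitch_mm
    return revolutions * steps_per_rev / step_freq

print(lift_time(1.0), lift_time(1.5))   # 48.0 s vs 32.0 s: a third less with the 1.5mm pitch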

 

In case you haven't checked out the other post, here's the lift prototype as it is now:

 

 

Until next update!

 

Introduction

 

The Gertbot is an add-on board for the Raspberry Pi, compatible with all existing models.

 

It brings power and motor control capabilities to the Raspberry Pi, but can also be used with other controllers supporting serial communication. Because the Gertbot has its own CPU dedicated to the real-time activities, the Raspberry Pi only needs to pass high-level commands such as "DC motor 1, ON" or "Stepper motor 2, 200 steps at 500Hz". The drivers come in two flavours: Python and C/C++. There is also an optional GUI, so the Gertbot can be used without writing any code.

 

To know more about Gertbot, visit: http://www.gertbot.com

 

In this post, I'll be demonstrating how to control two stepper motors using the Gertbot Python drivers.

Setting up

 

Getting everything set up is really easy and can be done in a matter of minutes.

 

The first step is to connect the Gertbot to the Raspberry Pi, but before doing so, ensure the Pi is powered off. Take the Gertbot and connect the 26-pin header, starting from pins 0 and 1. The Gertbot should slide in nicely next to the ethernet port, without touching it.

 

The result should be as follows:

photo 5.JPG

 

Next, the Gertbot Python drivers need to be installed on the Raspberry Pi. To do this, power on the Pi and using a terminal, browse to where you'd like to deploy the drivers.

 

The Gertbot Python drivers can be downloaded from the Gertbot website using the following command:

 

The output of the command should be similar to this:

 

pi@PiDesk ~ $ wget http://www.gertbot.com/gbdownload/src/gertbot_py.tgz
--2015-05-04 19:28:43-- http://www.gertbot.com/gbdownload/src/gertbot_py.tgz
Resolving www.gertbot.com (www.gertbot.com)... 93.93.130.166
Connecting to www.gertbot.com (www.gertbot.com)|93.93.130.166|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 8537 (8.3K) [application/x-gzip]
Saving to: `gertbot_py.tgz'

100%[================================================================================>] 8,537 --.-K/s in 0.01s

2015-05-04 19:28:43 (752 KB/s) - `gertbot_py.tgz' saved [8537/8537]

 

The downloaded drivers are compressed. In order to use them, the file must be extracted:

  • tar -xvzf gertbot_py.tgz


pi@PiDesk ~ $ tar -xvzf gertbot_py.tgz
gertbot.py

 

There you have it, the "gertbot.py" file. By importing this file in other Python scripts, it is possible to call certain functions that will facilitate the control of the motors.

Script

 

At the bottom of the Gertbot's download page, some example scripts are provided. Starting off from the "Simple Rover" Python script and looking at the "Stepper Rover" C code, I came up with my own simple Python script to control two stepper motors. The script allows both motors to be driven in either direction. This is for example useful to control the Z-axis of a 3D printer. It's possible to define the number of steps and the frequency at which the steps are taken.
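To give a flavour of what such a script does, here is a stripped-down sketch for a single stepper channel. The function and constant names are based on my reading of the gertbot.py driver and its examples, so treat them as assumptions and check them against the driver file you extracted.

# Stripped-down sketch: drive one stepper channel via the gertbot.py driver.
# Function/constant names (open_uart, set_mode, MODE_STEPG_OFF, freq_stepper,
# move_stepper) are taken from the driver docs/examples - verify before use.
import gertbot as gb

BOARD = 0         # board address set by the Gertbot jumpers
STEPPER_A = 0     # channels 0/1 drive the first stepper, 2/3 the second

gb.open_uart(0)                                   # open the serial link to the Gertbot
gb.set_mode(BOARD, STEPPER_A, gb.MODE_STEPG_OFF)  # grey-code stepper mode, motor off
gb.freq_stepper(BOARD, STEPPER_A, 500)            # step frequency in Hz
gb.move_stepper(BOARD, STEPPER_A, 400)            # 400 steps = 2 revolutions of a 200-step motor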

 

 

As for the hardware, I used two 200-step, bipolar, 12V NEMA17 stepper motors and connected them to the Gertbot as per the documentation. The underside of the Gertbot also indicates how to connect the motors.

 

photo 4.JPGphoto 3.JPGphoto 1.JPG

 

Demo

 

As a demo application, I've built a prototype of a lift to slide a screen in and out of place. The lift is controlled using the Python script documented earlier.

 

View all my posts on this project here:

http://www.element14.com/community/tags#/?tags=quadcop_project

 

I've got an update coming on some technical hardware; in the interim, I got my quadcopter assembled!

 

The kids wanted to be in a video so here it is, cheesy as it gets.  This is the Quadcopter I will be using for testing of the QuadCOP.  I plan to get something sleeker later in the project.

 

 

DSCN1961 (640x480).jpg

DSCN1964 (640x480).jpg

DSCN1969 (640x480).jpg

The early stages of software development often take time to produce tangible results, particularly when you are climbing the learning curve of several technologies at once with crampons and ice-axes, but no rope.

 

However, thanks to the documentation provided for the PiFaceCAD Examples — PiFace Control and Display (CAD) 2.0.7 documentation, and the documentation that comes with Python, plus a small injection of brain power by yours truly, we have a first example of the Hexagram Display Output page on the PiFaceCAD screen.  It might not look like much, but the two Hexagrams are custom bitmaps created using Python lists, and the output algorithm proves the conversion from the original (old) Hexagram, as cast by the computing algorithm, to the 'New' Hexagram created by the changing Yin and Yang lines.

TDC_2015_0325 cropped.jpg
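For anyone curious how the custom bitmaps work, here is a minimal sketch assuming the pifacecad library's LCDBitmap/store_custom_bitmap API (check it against the PiFace CAD 2.0.7 documentation linked above); the 8 five-bit rows below are an illustrative pattern, not the actual hexagram data.

# Minimal sketch of a custom LCD character with pifacecad (API names assumed
# from the PiFace CAD documentation - verify against version 2.0.7).
import pifacecad

cad = pifacecad.PiFaceCAD()

# Each row is a 5-bit value; solid rows read as Yang lines, split rows as Yin lines.
hexagram_char = pifacecad.LCDBitmap([
    0b11111,
    0b00000,
    0b11011,
    0b00000,
    0b11111,
    0b00000,
    0b11011,
    0b00000,
])

cad.lcd.store_custom_bitmap(0, hexagram_char)   # store in CGRAM slot 0 (slots 0-7)
cad.lcd.write_custom_bitmap(0)                  # draw it at the current cursor position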

Next task is to sort out the Welcome Menu page and the control/parameter passing from the compute engine to the display module, to test the random number generation.  It won't be much of an Oracle if it always gives the same answer.

 

As I have also got the IR module to recognise the remote from my Hauppauge PVR card, I intend to test both the IR remote and the PiFaceCAD buttons for menu navigation.  The PiFaceCAD Internet Radio example software should help there.

Introduction

The need is to create a reliable, fast and easy-to-modify command set (which should in fact become a Python library) to control the printer features. The small thermal ESC/POS printer supports a well-known protocol able to generate all the printouts we need.

Adopting a Bluetooth enabled printer has the great advantage that the same peripheral can be used to print from inside the Meditech device and from the Mobile Display Unit with almost no changes.

 

Searching the documentation available on the Internet, it seems that the most reliable way to manage the Bluetooth hardware layers with Python is the cross-platform lightblue library. I have tried it and it really does support any kind of Bluetooth layer. There are some issues that can be solved, but in the end I have excluded it. The reason is that the architecture should be as simple as possible: in our case we don't need the full set of Bluetooth features, only a serial printer enabled and ready to work wirelessly at short and medium range.

 

The simplest choice turned out to be setting up the serial layer over the Bluetooth interface, with the advantage that, from the point of view of the entire system, the printer remains a serial device, as it should be.

 

On the RPI side I have used a common Bluetooth dongle. As far as I can see, the same procedure will work with almost any other USB Bluetooth dongle.

 

The setup procedure

The entire setup procedure is fairly simple and can be replicated in minutes on any Raspberry PI running Raspbian. I am working with Python 3.2, so take into account that some instructions in the standard libraries - e.g. the serial library - handle data in a slightly different way than previous Python versions. I suggest adopting this version anyway, because the printing data is sent through the serial port as bytes(...), including the encoding format. By specifying UTF-8 we are sure that every character is sent as an 8-bit value to the peripheral. This is important for the printing protocol, to avoid unexpected results when sending data that includes control characters.
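A quick illustration of why the explicit encoding matters in Python 3 (the serial write expects bytes, and ESC/POS control codes must arrive as single 8-bit values); ESC @ below is the standard ESC/POS initialise command.

# Python 3: serial writes take bytes, so strings are converted explicitly with UTF-8
text = "Meditech\n"
packet = bytes(text, 'UTF-8')     # b'Meditech\n' - one byte per ASCII character
esc_init = bytes([0x1B, 0x40])    # ESC @ : ESC/POS "initialise printer" control sequence
print(packet, esc_init)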

 

Installing the components

If not yet present, install the components as shown below:

 

sudo apt-get install bluetooth bluez-utils blueman
sudo apt-get install python-serial










 

After this operation, the Linux Bluetooth components and the serial library for Python are installed on the system and we can proceed.

 

Pairing and stable connection with the printer

The next steps are shown as single command lines, but it would not be difficult to create a bash script that runs them all in a single shot.

 

hciconfig










 

Use this command to see how the Bluetooth device is recognised by the system. In my case it is handled as hci0.

 

Now the printer should be powered on and visible to the RPI Bluetooth; then launch the command

 

hcitool scan








 

After waiting a while, the printer will also appear among the identified devices. The result is something like


xx:xx:xx:xx:xx:xx    PRINTER_NAME

 

which are the device's Bluetooth address and the Bluetooth printer name. At this point we should pair the printer with the RPI using the command

 

sudo bluez-simple-agent hci0 xx:xx:xx:xx:xx:xx





 

After a few seconds the command asks for the printer's Bluetooth PIN, and then the device is paired with the RPI.

 

At this point we should make the binding between the two Bluetooth devices persistent across reboots. We can do this by editing the rfcomm.conf file

 

sudo leafpad /etc/bluetooth/rfcomm.conf




 

When the file opens in the editor (you can use the vi editor if no graphical desktop is running), add - or uncomment and change if already present - the following lines:

 

# Change rfcomm0 accordingly with your Bluetooth dongle setting
rfcomm0 {
    bind yes;
     # Replace xx:xx:xx ... with the printer Bluetooth address
    device xx:xx:xx:xx:xx:xx;
    channel 1;
    comment "Serial Bluetooth printer";
}



 

Save the file and exit. The last step is to enable the binding with the peripheral immediately, using the command

 

sudo rfcomm bind all



 

Now the printer is connected and ready to receive data from the applications.

 

Python printing example

Open the Python 3 IDE and, from the editor, enter the following test script

 

#!/usr/bin/python3

import serial

# Open the Bluetooth serial port bound to rfcomm0 in the previous step
bluetoothSerial = serial.Serial("/dev/rfcomm0", baudrate=9600)

# A fixed test string plus a line typed by the user at runtime
testS = "Raspberry-PI Project \nMeditech Printer Test"
testV = input("Extra text to print: ")

# Send the data as UTF-8 encoded bytes, as required by the printer protocol
bluetoothSerial.write(bytes(testS, 'UTF-8'))
bluetoothSerial.write(bytes(testV, 'UTF-8'))

 

Run the script and it should print a test.

Introductory note

In this document I will show the results; in a separate post I will publish the setup procedure.

 

As mentioned in a previous post, a small thermal printer will change the rules of the game by tracking the history of an emergency intervention. In unconventional situations I have personally experienced that it may be useful to have some data on paper instead of in digital format only. The other point is that some health status reports are better printed, for many reasons (to be discussed during the first tests). So Meditech will include a small thermal printer to produce short documents - on demand in some situations, and as a mandatory task in others. Below is a video showing the RPI printing a long list (the /usr/bin folder file list, for the curious) as a reliability, responsiveness and speed test.

 

As the printer, with the proper control codes, can also work in graphic mode, it will also be used to print some graphic representations of the data: available in multiple copies, in the field and immediately.

 

In the previous post Meditech: Powering the unit, michaelkellett raised the issue of electrical safety and general medical device compliance, regarding the possible risks of adopting some components, especially those related to the power supply architecture.

 

It is clear that, as Meditech will be a medical device, patient safety from any possible device-derived injury should be considered in depth. The current phase of the project, i.e. the definition of the parts and the general architecture, focuses more on the bare technical aspects than on specific medical safety compliance. This aspect will be reviewed later, probably best on the first fully working prototype. There are anyway some aspects that I always keep in mind.

 

Powering system

It is fairly obvious that this will be the most critical aspect related to potential shock damage to the patient. In the power architecture design I have already taken some precautions, such as making the charging unit removable and available only in certain conditions, while the device is not working with the health probes.

 

The prototype design is in any case highly modular, and the powering system can be redesigned independently of the rest of the electronics.

 

Medical safety standards for electronic design

This aspect can't be covered by a single general approach, as the regulatory requirements vary by country. The approach is that a generally "safe" prototype should be open to localisation following the specifications of each country.

 

A good source on medical electronic device design can be found in the article by Jerry Tower on the Electronic Design blog. The article is also attached to this post.

Printer is here!

 

IMG_20150505_245855888.jpg

Well, this is the Meditech printer (or what it will become very soon, hopefully).

 

Hardware details

 

  • Printing method: thermal, on paper rolls about 55 mm wide
  • Printing speed: 90 mm / min
  • Power supply: 12V DC
  • Communication: Bluetooth 4.0
  • Printing protocol: ESC/POS standard protocol (aka EPSON protocol)
  • Monochromatic graphic support

 

State of the art

The next step is to make it print from the Raspberry PI; for now the RPI has been set up with Bluetooth support and is able to pair with the printer, which is seen as a serial device.

Powering Meditech

As Meditech is fairly complex, several power levels should be provided. To simplify the scenario, the elements that should be powered in the base architecture are listed below:

 

  • Storage: the hard disk(s) will be low-power 2.5 inch devices; adopting some kind of static SSD is excluded due to the high price per gigabyte. Maybe in the future these prices will drop dramatically, making this kind of reliable solution affordable.
  • Processor units (at least two): we should take into account that the RPI devices are equipped with add-on electronics that consume power too, e.g. Wi-Pi, ChipKit PI and other custom interface components.
  • Probes: these need power to operate, and it is not always possible to adopt low-energy profiles, as in the case of the blood pressure probe, which needs a reliable yet small air pump.
  • Thermal printer: although not used continuously, this device's power consumption affects the system's power performance.
  • Network switch: yet another power-consuming device.

 

Power supply architecture

The most reliable solution I can see is the adoption of a multiple power-source approach:

 

  • A small ATX power supply provides power when a 110-240V AC source is available. When Meditech is powered from AC, the internal battery is charging, whether it is in use or not.
  • A car 12V power plug converter, to be used for normal work and battery charging, should be available as an alternative to the ATX power supply.
  • An 18V DC laptop-style battery should be sufficient to power all the components for a reasonable period of time (at least 2-4 hours).

 

So far, it appears that only +12V and +5V are needed to power the entire system. A dedicated power control unit should be designed, including a power level indicator, some logic, current regulators and a battery-to-mains power switch.

 

Add-on modules

As mentioned in a previous post, the Meditech architecture can host add-on modules. To avoid limiting module usage due to power constraints, especially when the system is used outdoors on battery power, every module should include an autonomous battery power system that charges when the module is connected to the main device and an external power source is available.

frellwan

Ready to start!!

Posted by frellwan May 4, 2015

Received most of the kit parts - really all of the parts that I need for my project.

 

I need to order the following cables to communicate to peripherals:

 

USB to RS232 (9-pin D-sub connector)

USB to RS422 with exposed wire ends.

 

pi_setup.JPG

 

So to recap my project:

 

Today’s manufacturers are looking to continuously improve their processes to be able to increase quality and reduce waste and cost. Part of these continuous improvement initiatives includes Overall Equipment Effectiveness (OEE) and Total Productive Maintenance (TPM) programs. Many manufacturers look to process data to help with these continuous improvement efforts. While many new machines now have data collection capabilities, many older machines do not.

 

One of the biggest issues preventing manufacturers from analyzing machine data is the limited communication channels on older PLC’s. These older PLC’s typically have a small number of slow serial communication channels that were originally meant for programmers to connect to the PLC to troubleshoot or make modifications to the code. As more digital components were introduced into the manufacturing area (digital drives, encoders, some transducers, etc.), the need for analogue operator input decreased. Dedicated HMI applications have replaced these analogue controls in favor of more precise digital signals. These HMI’s communicate with the PLC through this limited number of serial channels. This does not leave any communication channels available for the transmission of data that would be helpful in continuous improvement efforts.

 

For this project I will use the raspberry pi to act as a communication hub. It will:

 

send and receive data to/from the PLC via an RS-232 communication channel using the DF-1 protocol

send data to the Fenner M-Trim via an RS-422 communication channel using straight ASCII commands/responses

send data (received from the PLC) to an OEE server via an Ethernet connection using FTP

receive data from a recipe database via an Ethernet connection using FTP

send email/text alerts when vibration data from the PLC reaches a certain threshold

 

The PiFace Control and Display will be used to set up the Raspberry Pi communication channels.
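As a first sketch of the serial legs of this plan (not working code yet: the device paths, baud rates and the M-Trim command string below are placeholders until the adapter cables arrive), python-serial can open both channels in the same way:

# Placeholder sketch: open the two serial channels with python-serial.
# Device paths, baud rates and the command string are assumptions, not real settings.
import serial

plc = serial.Serial("/dev/ttyUSB0", baudrate=19200, timeout=1)   # RS-232 to the PLC (DF-1 framing still to be written)
mtrim = serial.Serial("/dev/ttyUSB1", baudrate=9600, timeout=1)  # RS-422 to the Fenner M-Trim (plain ASCII)

mtrim.write(b"<ASCII command>\r")   # placeholder command and terminator
print(mtrim.readline())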

{gallery} My Gallery Title

IMG_20150430_005509.jpg

PCD8544: Front of PCD8544 LCD Display. Back of Nokia 5110 LCD Display

IMG_20150502_205157.jpg

Nokia 5110/3110: Front of LCD Display

IMG_20150502_224616.jpg

Speaker Assembly: Component removal

IMG_20150502_224645.jpg

Stereo Speakers: With amplifier

IMG_20150417_113428.jpg

R2D2: Helper, Butler, and confidante

It has been busy this week, although the blog has taken some time to put together. I've been gathering a few more supplies necessary for the construction of the container for the "Picorder". While that is in progress, I've set in motion some basic tests of the device. I've been working mainly on the display functions at the moment. I've attached some more photos and video of their results. If you'll notice, the display pin-outs are very different, so determining the correct pin-out to program the Pi correctly was 'interesting'. In the meantime, I've been disassembling some more devices for parts. The speaker assembly had a damaged plug, so I decided to use its components for the Picorder. The original Tricorder has sound features, so I intend to have them here also. In my last blog I stated I would be doing some sketches for the housing unit; however, I have a couple of selections en route that I found online, so I may not need to completely construct the case. I'm tracking the code I'm using so it'll be published at a later date in its entirety along with a completed parts list. Enjoy the image gallery and video!

As the RPI master will act as a server and data collector, centred on a MySQL database, it has been clear from the start that a large storage solution had to be adopted. This can be done in two possible ways:

 

  1. Adding an external USB hard disk for storage only, leaving the entire OS on the SD card
  2. Replacing the SD card as far as possible, moving the entire RPI OS and storage onto the external HD

 

Considering the pros and cons, the decision was to find a way to move the entire system onto an external USB hard disk. The most important factor driving this choice was the software to be installed on the system: even using a 32GB microSD, I can't be sure there will be sufficient space in the future to host all the packages and components needed, especially in this development and experimentation phase, where installed components are redundant until everything is clearly defined.

 

Then there is the development environment aspect: in many cases - first of all with the ChipKit PI module - the possibility of developing some parts directly on the RPI platform is proving to be a winning solution.

Based on some information I found on the Internet, I have tried to identify all the issues and tricks needed to successfully create an RPI system with the Raspbian wheezy OS running on an external USB hard disk. The details of the installation procedure, and where attention should be paid, are described in the document I have published under the Embedded Linux group (the link is here: Raspberry PI: USB hard disk boot).

 

A copy of the document is attached to this post. The following image shows the current - experimental - solution with a 1TB HD: I have used a 3.5 inch SATA disk just because it was lying around unused.

 

IMG_20150503_214312133.jpg

 

Important note on the kind of HD to use

The HD in the image was simply the only large HD available at the moment. I have also made some attempts with other 2.5 inch HDs, discovering that these seem unreliable because the Raspberry PI USB can't power them properly. This aspect should be investigated further because - as explained on several sites - the USB ports of the RPI can supply up to 1.2A if the board is powered from a 5V 2A supply. I have used a wall-mount power supply rated at 2A but nothing changed. Maybe that is not sufficient, and as soon as possible I will try a different powering system.

Previous posts for this project:

 

 

Project Update

 

Last week I started experimenting with a component which isn't part of the kit. As described in my very first post, I plan to use the Touch Board for capacitive touch applications.

Because the board also has an onboard mp3 player, it is perfect to play certain sound effects when triggering the sensors.

 

First I set up the board and familiarised myself with how it works, and then I worked out a little demo for my futuristic desk.

 

You can find more information and some thoughts on the Touch Board here: Sci Fi Your Pi: PiDesk - Review: Bare Conductive Touch Board

 

For a little taste of how this will fit into my project, you can watch the video below:

 

 

Who can guess where I got the (temporary) sound effects from?

It is time to discuss the development strategy adopted for the entire project, whose main complexity IMHO is that it involves different technologies needing quite different approaches, especially because the final goal is to harmonise all the involved components.

In this first period I took the time to explore, as much as possible, how every component will work and how it can be integrated into the ecosystem.

 

First key points assumed as definitive

At the current date there are a certain number of fixed points that, until proven otherwise, I assume are at least among the better solutions to adopt.

 

More than one Raspberry PI: networking

I have tried to sketch out a scheme using a single RPI board, looking at it from many different points of view, but the final choice is that at least two units will be used. There are two important reasons:

  1. Very different sets of components must work together
  2. The need for a modular system that can grow with different features and options

The next immediate problem was choosing the best way to create a collaborative network between the different units. Thanks to clem57, michaelkellett and others (see the discussion Epitaxial vs 1N4002 Diodes), the initial idea ended up being the final choice. The RPI machines will be connected through a small low-power switch, creating an internal network. There will be a PI master server collecting data and handling other tasks, and a PI slave connected to a set of probes.

 

Note that the wired LAN will connect only the Meditech RPI devices, leaving some ports available for further modules and/or a connection to an external network.

 

One of the roles of the master PI is to act as a router between the wired LAN and the WiFi access, enabling access by the interface unit. Below are all the RPI master roles identified so far:

  • Access point (from the display unit)
  • Apache web server with php5
  • Main data collector
  • Hardcopy printout of some data
  • MySQL database for data storage
  • Local data storage unit (with external hard disk)
  • LAN - WiFi routing platform

 

Three main classes of probes

Meditech should include at least a set of probes for non-invasive health analysis, considered essential for almost any intervention (more details on the biomedical aspects will be discussed in separate posts providing detailed documentation). For project convenience, the probes have been grouped into three main classes.

 

1. Heart and general health sensors

Heart rate, blood pressure and ECG will be grouped together. These probes are interfaced to the slave RPI via the ChipKit PI board for first-level data acquisition and management; the data are then sent to the RPI slave platform for final processing and mathematical treatment. Collected data are delivered on demand to the RPI master, which stores them in its local database.

It is important to manage this data set separately because, in most cases, the information must be acquired continuously to monitor the state of the patient.

 

2. Non-invasive internal analysis

Hoping to have time to develop these components, the echography and other ultrasonic-based probes should be managed by the RPI master. The use of the RPI master for this task still needs to be verified experimentally, and at the moment can be considered about 80% certain. The reason for assigning this task to the already busy RPI master is that the data collection is not done continuously; the Bitscope device will be diverted from its usual role (an oscilloscope) and used for pre-processing data acquisition; as a matter of fact, it is an independent, reliable embedded device that can dramatically reduce the CPU load on the RPI master. If tests demonstrate that the process is too heavy for the RPI master, these probes will be managed separately by a third unit, also connected to the internal network.

 

3. Glucose analysis

Essential for a precise and fast check in case of a suspected diabetic crisis or diabetes-induced coma, this component is also handled by the RPI master. It is managed by dedicated hardware, and the signal evaluation and calculations don't need to be monitored continuously (it is not a very high priority process), so this task can be done on demand by the master server.

 

Hard-copy low priority tasks

The RPI master will not only collect the probe data. It will also act as a sort of black box where the history of every intervention is stored along a timeline (a set of records in the database). Some strategic information such as:

 

    • operator,
    • current location,
    • short-range movement (I mean geo-coords in the order of meters or less, too small to be managed by Google Maps),
    • chronological order of the operations,
    • general conduct profile of the operator,
    • starting and ending intervention time and
    • health status of the patient

 

will be permanently stored in the database. This information is also printed in hard-copy format on a serial thermal printer (55 mm wide) integrated into the Meditech system.

 

This feature and its usage approach are helpful for medical assistance personnel working in emergency conditions (or outside the traditional hospital structure). The intervention process should follow a strict protocol that varies by country, so this part can be customised depending on the adopted procedure.

An example that will be preset in the prototype could be the following:

 

  1. The mobile unit receives the call with a description of the location and the kind of intervention; the location is saved as the initial location on the system with date and time, together with the current unit location.
  2. The mobile unit reaches the intervention location. Meditech is already set up with the operator ID and other personal information. The trip is saved, creating a history of the speed, stop points, etc. There are many factors influencing this phase (traffic, distance, time of year, weather conditions, etc.). All the key factors are recorded during the trip.
  3. When the operator starts working at the location, probe use and other information are stored together with the analysis results. The patient health status check should also follow a procedure, and probe usage and the consequent decisions involving the use of or interaction with the Meditech unit are recorded until the operator declares the end of the intervention.

 

There are two moments when the system will automatically print a status report: when the intervention call starts (time, current location, location to reach) and when the intervention is declared closed for any reason by the operator. During the intervention the operator, based on his own knowledge and experience, can decide to obtain a hard copy of additional information on the patient's health status, i.e. the results of the health check.

 

Introduction

 

Last year, I backed a project on Kickstarter by a company called Bare Conductive: the Touch Board. The Touch Board is an Arduino-compatible board, based on the Arduino Leonardo, which adds new features such as:

  • capacitive touch chip
  • MP3 Player
  • LiPo battery charger
  • stereo output
  • microSD slot
  • ON/OFF switch

 

The Touch Board can also be set up as a MIDI device.

 

In The Box

photo (2).JPG

 

In the box of the kickstarter reward (£45, early bird £40), there was:

  • a Touch Board with 2GB microSD card
  • a quick start guide
  • a thank you note
  • an electric paint pen (10ml)
  • an electric paint jar (50ml)
  • stencils for various shapes and lines

 

Optionally, a LiPo battery and/or microUSB cable could be added for £5 and £2 respectively, making the kit even more complete.

 

Setting Up

 

The Bare Conductive website contains a lot of tutorials and projects using the Touch Board and/or electric paint. One of the tutorials is about setting up the Touch Board, starting with the Arduino IDE. The setting up tutorial can be found here.

 

Preconfigured

 

The Touch Board comes pre-installed with an audio guide that can be accessed by touching the electrodes on the board. Here's a video of the audio guide:


Arduino IDE

 

The Touch Board requires the Arduino IDE version 1.5.6 or later. This can be downloaded directly from the Arduino website. Once the correct version of the IDE is installed, a hardware plugin and some libraries need to be installed in order to be able to work with the Touch Board.

Hardware

 

The hardware plugin ensures that the Arduino IDE is able to recognise and program the Touch Board. It can be downloaded from Bare Conductive's GitHub page and needs to be put in the Arduino IDE's hardware folder. It is possible to verify the hardware plugin has been correctly installed by starting the Arduino IDE and confirming it is available in the list of boards.

Screen-Shot-2015-03-06-at-21.19.21.pngScreen-Shot-2015-03-06-at-20.49.42.png


Note: There's even a "Thank you backers" file in the repository in which I found my name. Awesome!

Libraries

 

A total of three libraries need to be installed/updated:

 

The MP3 chip and microSD card libraries are bundled together in a single zip file. Once the libraries are installed by extracting them in the Arduino IDE's libraries folder, the programming environment is ready.

Screen-Shot-2015-03-06-at-21.19.47.png

Code

 

The code for the pre-installed audio guide is available for download. Once opened in the Arduino IDE, it can be used to understand how the application works and how the board could be reprogrammed for other purposes. Because the Touch Board is recognised by the Arduino IDE, the serial monitor can be used for troubleshooting purposes. In the case of the pre-installed application, it shows which electrode has been touched and which audio track has been played.

Screen-Shot-2015-03-06-at-21.27.07.pngScreen-Shot-2015-03-06-at-21.27.42.png

 

Electrodes

photo-11.jpg

Anything conductive can be used as an electrode for the Touch Board.

 

This could be a nail, aluminium foil, electric paint or even a piece of fruit. When something new is attached to an electrode to be used as a button, the Touch Board needs to be reset. This is to ensure the Touch Board uses the new input value of the electrode as a baseline, in order to properly detect contact or proximity. The audio guide, which is programmed by default, requires the electrodes to be touched in order to play the audio file. This behaviour can be modified by loading the proximity sketch found in the "Making Distance Sensors" tutorial.

 

The difference with the audio guide program is the following:

 

// this is the touch threshold - setting it low makes it more like a proximity trigger
// default value is 40 for touch
MPR121.setTouchThreshold(8);

// this is the release threshold - must ALWAYS be smaller than the touch threshold
// default value is 20 for touch
MPR121.setReleaseThreshold(4);

 

This changes the thresholds for all electrodes. If this is only required for specific electrodes, this can be specified as well:

 

// configure touch threshold for electrode 0
MPR121.setTouchThreshold(0, 8);

// configure release threshold for electrode 5
MPR121.setReleaseThreshold(5, 4);

 

I've been experimenting with materials such as copper tape, aluminium foil and electric paint. Alligator clips can be used to connect the sensors to the Touch Board. Alternatively, the connection could be made by applying electric paint to the electrode; this method is called "cold soldering".

MP3

 

Changing the sound effects is as easy as replacing the mp3 files on the microSD card with new ones. The files are named "TRACK000" to "TRACK011" with extension ".mp3". "TRACK000" is linked to electrode "E0", "TRACK001" to electrode "E1", and so on. There is enough space on the microSD card to keep different folders with different sound effects, and swap the files depending on the project. Careful though, there are also other non-mp3 files on the microSD card required by the Touch Board. Removing those files could affect the functionality of the board. More information on replacing the audio files can be found on the Bare Conductive website, in a dedicated tutorial.
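As a small convenience, swapping a sound set can be scripted; the sketch below copies my_new_sounds/0.mp3 ... 11.mp3 onto the card as TRACK000.mp3 ... TRACK011.mp3 and leaves every other file alone (the folder name and mount point are assumptions for illustration).

# Copy a folder of replacement sounds onto the Touch Board's microSD card.
# The source folder and mount point are illustrative assumptions.
import shutil
from pathlib import Path

card = Path("/Volumes/TOUCHBOARD")   # wherever the microSD card is mounted
sounds = Path("my_new_sounds")

for i in range(12):                  # electrodes E0..E11 map to TRACK000..TRACK011
    src = sounds / "{}.mp3".format(i)
    if src.exists():
        shutil.copy(str(src), str(card / "TRACK{:03d}.mp3".format(i)))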

Project

 

I'm currently working on a project in which I'll be using the Touch Board to integrate capacitive touch controls into the surface of a desk, triggering certain actions with accompanying sound effects. A first test is demonstrated in the video below, in which hidden controls are triggered by hovering a hand over a piece of cardboard.



Conclusion

 

The Touch Board is great and easy to use. The amount of content available surrounding this board is phenomenal!

The fact that the board is 100% Arduino compatible makes things so much easier as well.

 

The price is a bit on the high side in my opinion, but I'm glad I backed the project and have this board as part of my development kits.

 

Finally, the list of backers in every Touch Board related download is a very nice touch (no pun intended) to show appreciation towards the kickstarter backers.

Hi all,

 

     This post is nothing but some of my thoughts on ideas that I have proposed for this challenge. And it will be short too.

 

     By the way, this week I received the kit from element14, and like everyone else I got very excited to see and work with it.

 

I am not going to describe the kit contents, as they are available on element14 and I don't want to duplicate them here. All I am going to give is a link where you can find the information about the kit.

Sci Fi Your Pi Design Challenge - Kit List

 

If you haven't read about my concepts, here is a snapshot of them:

 

  • My first idea is the one where Tony Stark interacts with his computers with hand gestures: he picks up a file from one computer monitor, puts it on another computer and starts interacting with it. In my project I would love to do something similar with images/pictures first. If possible, I would love to do the same thing with videos and other supported files. I feel it's a very cool project.
  • My second idea is a Surface Table: we can find this in many movies, from Mission Impossible to The Amazing Spider-Man to Iron Man once again. Those are mostly touch based, but I am looking to use Microchip's MGC3130 GestIC kit that I have from element14 to interact with displays. As I don't have such a big monitor, I would use my laptop display as the surface display.
  • My third idea is a Wrist Computer: if time permits I would love to implement a wrist computer, just like the personal assistants we have seen in movies. A Raspberry Pi + PiFace (though it's bulkier) would do this job. Also, if possible, GPS integration with this computer would enable a door unlock mechanism based on my location, and the door lock/unlock can be handled by the Gertboard.

 

IRON MAN Computer Interactions -

     To start with, I am very excited to try the IRON MAN movie computer interactions, and I am sure it will be the coolest thing to see. In the kit we got two sensor boards: the XTRINSIC SENSE BOARD from Freescale and the MICROSTACK ACCELEROMETER board. The initial task is to recognise some gestures with these sensors. I will be posting the implementation details in the next few blogs. Then, once I am able to detect some natural gestures, the next part is to map them to computer actions. There will be wireless communication between the interactive computer and the module on the wrist or in the palm.

     The task described above will be my first task in this challenge. I will post updates as I accomplish the tasks.

 

If you are still confused about what I will really be doing with the IRON MAN Computer Interaction, then wait for my video.

 

Regards,

Shrenik

kcrajesh

PiBo - a soft start

Posted by kcrajesh May 1, 2015

I dreamt of using Windows 10 on the Raspberry Pi 2 for my project. There is more to do here, but the possibilities are amazing. Microsoft announced the Windows Core yesterday and made a preview download of the core available. I was able to get it to boot on the RPi 2, write some code in Visual Studio and debug it on the board. The core still has a way to go before completion, but the version they have out is pretty good.

 

Here is an image showing the default app. The core does not have the Windows Experience; the application is responsible for the UI. The default app has a few preferences and shows the IP address.

 

pi1.jpg

 

Here is a second image showing my test application running.

 

rpi2.jpg

 

I will be attempting GPIO next. Stay tuned.

 

Introduction

 

As described in my previous post, my project has sub-modules. Each has a sci-fi purpose, and they should all come together to complete project VIRUS. OR I may divert altogether and do what I find more fun to do. In this post, I start with making a robot. I will try to write these posts in the form of tutorials so that you at home can follow along. Let's make robots!

 

Where to start

 

2Q==

In the movie Real Steel, the robot was recovered from a junkyard, but unfortunately the local junkyard here only has rotting tyres from old cars. Hence I started with scrap wood and, to be very honest, this is not my first robot BUT it WILL be my third iteration on the Raspberry Pi, and I will name it MinionBot Mark III. The purpose is to make a general-purpose robotic platform from scrap stuff and whatnot. The first thing is to make the chassis and a Bill of Materials. Roughly it will be...

- Lots of scrap wood.

- wood saw.

- Extra thumb?

- Motors

- Wheels

- Screws

- Softdrink Cans(Empty)

With all that stuff, I start my little experiment.

 

Making a Chassis

 

In order to explain the robot, I made a video which will possibly evolve into a series of videos. I already cut a wooden board into an octagonal shape with slots for the wheels, and then used the soft drink cans to make tin strips for clamps. Pretty cool, huh? Here is the video of me assembling the base.

 

 

The end result is shown in the image below. I could make a motor driver like I did in a previous challenge, but that does not seem to be what the masses want, hence I went with a ready-made solution: a Grove Pi+. This little Pi HAT is available from www.seeedstudios.com and comes with all kinds of stuff. A motor driver widget is available and all of it is just plug and play. The code to make it run is also available from Seeed and is quite simple. Once you have Humpty Dumpty ready, we move on to preparing an RPi.

 

IMG_9796.jpg

 

Setting up a Raspberry Pi

 

Now there are a gazillion tutorials on this subject and I have written some myself. If you need help setting up your Pi, navigate to https://embeddedcode.wordpress.com/2013/07/10/the-geek-getting-started-guide-to-the-raspberry-pi/ for my version of setting up an RPi. I discuss how to set up a static IP as well, which will save you a LOT of time later on. To connect to the network, I prefer using WiFi since we want THIS robot to be wireless and moving around like a small kid. For this I use the element14 WiPi, which is available from element14/Farnell.

 

The additional Hardware

 

I am still waiting for the kit to arrive, and in the meantime I am using what I have. I am using the seeedstudios Grove system of modules, and I did a small review of the starter kit which is available here (https://embeddedcode.wordpress.com/2015/02/14/seeedstudios-groove-pi-pre-review-part-1/).

 

In the next post I will be going through the software part and MY version of setting up the template for our robot! Stay Tuned!

Other posts on my project:

http://www.element14.com/community/tags#/?tags=quadcop_project

 

I am working on a real update for my project.  I have several parts connected up and nearly 1000 lines of C code written for my Arduino and my Raspberry Pi 2.  I have some things to demonstrate and some explaining to do.

 

I decided to take a tangent and work with the ChipKit Pi.  I remember a few years back I wanted a ChipKit UNO because it has a 32-bit processor that is much faster than the AVR used in the Arduino; however, I never got around to it.  I didn't recognize what element14 had sent me until I looked at it closer!  I am very excited to own one of these.

 

I have been testing some code on an Arduino nano328 to read my radio's PWM signals and communicate via I2C to my Raspberry Pi2.  Things are working great so I spent some time seeing if I can get the same code to run on the ChipKit.  The code compiled with minimal changes.  I used the MPIDE on the Raspberry pi to compile it.

 

As a quick test, I ran a simple speed test on all 3 parts involved, and here is a video outlining my results.  This is the FIRST video blog I have ever done so bear with me; they will get more polished over time.  Watch the video and read some more comments below.  I kept wanting to call the ChipKit Pi the ChipKit UNO so you will hear me pause every time I say it.

 

https://www.youtube.com/watch?v=4tPfn6qDRAs&feature=youtu.be

 

To summarize:

ChipKit Pi:  98ms

Arduino Nano: 4350ms

Raspberry Pi 2: 15ms

 

Notes

Phantom I2C objects

Serial pins are different for uploading sketches and for viewing output from Serial.print statements.

Wire.read() is Wire.receive() in the ChipKit version.  I commented the Wire parts out in the ChipKit Pi code below.

I am not certain the MPIDE is turning on compiler optimizations.  Will research.

Speed increase of the ChipKit Pi:  44.3 times faster.

The Raspberry Pi 2 is 6.5 times faster than the ChipKit Pi, and 290 times faster than the Arduino.
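The ratios quoted above follow directly from the measured loop times:

# Quick check of the speed-up figures from the measured loop times (milliseconds)
chipkit, nano, rpi2 = 98.0, 4350.0, 15.0
print(nano / chipkit)   # ChipKit Pi vs Arduino Nano   -> roughly 44x
print(chipkit / rpi2)   # Raspberry Pi 2 vs ChipKit Pi -> roughly 6.5x
print(nano / rpi2)      # Raspberry Pi 2 vs Arduino Nano -> roughly 290x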

 

Keep in mind this is just a purely simple processing test and ignore I/O  and other functionality.

 

PS:  QuadCOP parts get tomorrow, the FUN stuff including all the motors and the frame!  Should be able to do a test flight (manual, no pi etc.) this weekend.

 

 

 

Raspberry Pi Code (use time command to get results):

 

#include <stdio.h>

int main()
{
        int a;
        int b;
        b = 1;
        int l1 = 100;
        int l2 = 10000000;

        for(int i=0;i<100;i++)
        for(a = 0;a <=3000;a++)
        {
                b = b +  a;
                b = b * 2;
                b = b % 1657;
        }
        printf("Result: %d\n",b);
return 0;
}







Arduino Code: (spits out information on serial)

#include <Wire.h>
int lastAnswer;
void receiveEvent(int howMany)
{
}
void requestEvent()
{
  Wire.write(lastAnswer); // respond with message of 6 bytes
                       // as expected by master
}

void setup()
{
        Wire.begin(4);                // join i2c bus with address #4
        Wire.onReceive(receiveEvent); // register event
        Wire.onRequest(requestEvent);
        Serial.begin(9600);
}

void loop()
{
        Serial.println("starting test");
        unsigned long ms = millis();
        int a;
        int b;
        b = 1;
        int l1 = 17;
        int l2 = 3000;

        for(int i=0;i<100;i++)
        {
        for(a = 0;a <=3000;a++)
        {
                b = b +  a;
                b = b * 2;
                b = b % 1657;
        }
        }
        lastAnswer = b;
        ms = millis() - ms;
        //printf("Result: %d\n",b);
        Serial.println("test completed");
        Serial.print("Answer: ");
        Serial.println(b);
        Serial.print("Time: ");
        Serial.println(ms);
}








 

 

 

 

 

ChipKit Pi code:

int lastAnswer;
void receiveEvent(int howMany)
{
}
void requestEvent()
{
  //Wire.write(lastAnswer); // respond with message of 6 bytes
                       // as expected by master
}

void setup()
{
        //Wire.begin(4);                // join i2c bus with address #4
        //Wire.onReceive(receiveEvent); // register event
        //Wire.onRequest(requestEvent);
        Serial.begin(9600);
}

void loop()
{
        Serial.println("starting test");
        unsigned long ms = millis();
        int a;
        int b;
        b = 1;
        int l1 = 17;
        int l2 = 3000;

        for(int i=0;i<100;i++)
        {
        for(a = 0;a <=3000;a++)
        {
                b = b +  a;
                b = b * 2;
                b = b % 1657;
        }
        }
        lastAnswer = b;
        ms = millis() - ms;
        //printf("Result: %d\n",b);
        Serial.println("test completed");
        Serial.print("Answer: ");
        Serial.println(b);
        Serial.print("Time: ");
        Serial.println(ms);
}