As is often the case with hobby projects, plenty of unexpected difficulties have slowed down development of the Universal LED Animator. Luckily, all my orders from Element14 have arrived. I now have plenty of dev boards and some working code connecting them via Bluetooth - not top quality and definitely not production-ready, but it works. I can now discover all Animators (Blend boards) from a Beagle Bone Wireless, connect to them and tell them to enter config mode, i.e. to show a particular color pattern that will allow OpenCV to uniquely identify each LED strip.


New boards

I used the Element14 shopping cart to get two more Blend boards, a Beagle Bone Wireless, and some tools and components to power up my workshop and make my life easier when working on this project. I am now flooded with boards:



I already elaborated on the unlucky fact of the Blend's developers discontinuing the series in my previous post. Fortunately, the boards are still perfectly functional Arduino Leonardos. And to raise my spirits, I started playing with the Beagle Bone, which is a very powerful and interesting alternative to the Raspberry Pi. The most easily noticeable differences are fewer USB ports on the Beagle, and far more GPIOs. It ships with a full-blown Debian already on board and works out of the box - it requires no SD card, and all it takes is plugging it into your PC with the included micro USB cable. The system is already there on the built-in flash memory, and the board creates an Ethernet-over-USB connection that you can use to check out a few flashy features it presents on its web interface (including a web-based IDE called Cloud9, which I might use one day for another project), or simply to SSH into it and give it your WiFi credentials, which I did. I then unplugged it and powered it from a wall wart, and it has been sitting in its corner for about a week, kindly doing its work when asked, with very few issues.

It's also worth noting that the whole project is open source, which results in interesting related products being released, like the robotics-friendly Beagle Bone Blue or the Beagle Core intended for industrial applications. My intention is to make the Beagle Bone the command center for this project: it will run OpenCV for identifying the LED strips and will be the master device for all the BLE connections. I will probably give it more responsibilities, but those decisions will come later.


Working with the Blend


My goal was to make the Blend respond to commands from the BBW, stopping whatever LED animation it was playing at the moment and going into config mode - with the first two pixels colored red, the last two colored green, and all intermediate pixels displaying a single color provided by a Python script running on the BBW. With this color uniquely chosen for each strip and associated by the script with the MAC address of the Blend board, it is possible to identify each LED strip's position and shape in the picture that the user uploads to be analyzed by OpenCV. The red- and green-colored ends of the strip make it possible to find its beginning and end, so that the user doesn't have to guess which way a directional pattern should go (e.g. in a "marching pixels" type of animation, we don't want some strips to display pixels moving in the direction opposite to the others).
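To make the identification idea concrete, here is a minimal Python sketch of the color-to-board lookup I have in mind for the OpenCV stage. The function names are my own, and it assumes the vision step will eventually report one dominant color per detected strip:

```python
# Colors reserved for marking strip endpoints (first two / last two pixels)
RED = (255, 0, 0)
GREEN = (0, 255, 0)

def assign_colors(macs, palette):
    """Pair each discovered board's MAC address with a unique ID color."""
    if len(palette) < len(macs):
        raise ValueError("not enough unique colors for all boards")
    return dict(zip(palette, macs))

def identify_strip(detected_color, color_to_mac):
    """Look up which board produced a strip of the detected color."""
    return color_to_mac.get(detected_color)
```

With this table built once per photo session, matching a blob of color in the uploaded picture back to a physical Blend board becomes a single dictionary lookup.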


After I managed to put some nice patterns on the Blend-powered LED strips in my previous post, the next stop was blending in (uhm) the BLE functionality. I first looked at the examples provided by the RBL_nRF8001 library and found out that they are mostly tied to Android apps developed by the board's creators, so my task was to reverse-engineer them. As a very first step, I ran the provided HelloWorld sketch and moved on to the Beagle Bone to poke around the BLE interface that the example exposes. Some googling told me that the Linux utilities for messing with BLE are hcitool and gatttool.

First, I used hcitool to find the MACs of Blends around me:

# hcitool lescan
LE Scan ...
DF:67:58:1E:19:EB LED Anim
FF:D8:56:7F:C9:1B LED Anim

Knowing the MACs, I could tell gatttool to connect with one via its interactive mode:

# gatttool -t random -b FF:D8:56:7F:C9:1B -I
[FF:D8:56:7F:C9:1B][LE]> connect
Attempting to connect to FF:D8:56:7F:C9:1B
Connection successful



Hooray! It took a while to find a combination of parameters that works. Now the tough part was to get some ASCII data from the BBW to the Blend. I found a doc online mentioning the service and characteristic IDs that I should look for. I checked all available characteristics:

[FF:D8:56:7F:C9:1B][LE]> char-desc
handle: 0x0001, uuid: 00002800-0000-1000-8000-00805f9b34fb
handle: 0x0002, uuid: 00002803-0000-1000-8000-00805f9b34fb
handle: 0x0003, uuid: 00002a00-0000-1000-8000-00805f9b34fb
handle: 0x0004, uuid: 00002803-0000-1000-8000-00805f9b34fb
handle: 0x0005, uuid: 00002a01-0000-1000-8000-00805f9b34fb
handle: 0x0006, uuid: 00002803-0000-1000-8000-00805f9b34fb
handle: 0x0007, uuid: 00002a04-0000-1000-8000-00805f9b34fb
handle: 0x0008, uuid: 00002800-0000-1000-8000-00805f9b34fb
handle: 0x0009, uuid: 00002803-0000-1000-8000-00805f9b34fb
handle: 0x000a, uuid: 00002a05-0000-1000-8000-00805f9b34fb
handle: 0x000b, uuid: 00002902-0000-1000-8000-00805f9b34fb
handle: 0x000c, uuid: 00002800-0000-1000-8000-00805f9b34fb
handle: 0x000d, uuid: 00002803-0000-1000-8000-00805f9b34fb
handle: 0x000e, uuid: 713d0003-503e-4c75-ba94-3148f18d941e
handle: 0x000f, uuid: 00002803-0000-1000-8000-00805f9b34fb
handle: 0x0010, uuid: 713d0002-503e-4c75-ba94-3148f18d941e
handle: 0x0011, uuid: 00002902-0000-1000-8000-00805f9b34fb
handle: 0x0012, uuid: 00002800-0000-1000-8000-00805f9b34fb
handle: 0x0013, uuid: 00002803-0000-1000-8000-00805f9b34fb
handle: 0x0014, uuid: 00002a27-0000-1000-8000-00805f9b34fb

The characteristics with handles 0x000e and 0x0010 are RX and TX respectively. Now all it takes is to try and send the Blend a byte:

[FF:D8:56:7F:C9:1B][LE]> char-write-cmd 0x000e 65

The Blend reports all Bluetooth-related events to the serial port, so I can read its status and confirm that it got the humble ASCII 'e' I sent from my Beagle Bone:


Evt Device Started: Setup
Evt Device Started: Standby
Advertising started
Evt Connected
Evt Pipe Status
Pipe Number: 3


The next steps were: writing a basic communication protocol (one-way for the time being), developing a way to send it from the BBW to the board, and then automating this procedure with Python.


Early draft of LED Animator's communication protocol

The "protocol" will in fact be just a bunch of ASCII bytes representing commands and values. How elaborate it gets depends on how much time I have for bonus features, but the minimum is a command to enter config mode and one to change the animation sequence, as well as an RGB parameter to pass the Animators their palettes. The protocol follows an extremely simple "command + parameter + end message" syntax, using single chars to denote command and parameter types. An example follows:
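As a rough illustration of that syntax, here's how a single config-mode message could be framed in Python. The marker characters ('a' for config mode, 'r' for an RGB parameter, 'e' for end of message) and the exact framing are working assumptions of the current draft, not a final design:

```python
# Hypothetical framing for one message, per the draft
# "command + parameter + end message" syntax
CMD_CONFIG = b"a"  # command: enter config mode
PARAM_RGB = b"r"   # parameter: an RGB triplet follows
MSG_END = b"e"     # end-of-message marker

def config_message(r, g, b):
    """Build the byte sequence telling an Animator to enter config mode
    and display the given identification color."""
    return CMD_CONFIG + PARAM_RGB + bytes([r, g, b]) + MSG_END
```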

More commands are planned, and I am also considering adding more parameters, e.g. brightness and animation speed.


Coding in Python for the BBW

I found a GATT SDK for Python and decided to give it a try. After some time spent messing with the examples, I arrived at the following script that works without issues, but is pretty messy:

import gatt
import random
import time

CHAR_CFG = "a"  # command: enter config mode
CHAR_RED = "r"  # parameter: an RGB triplet follows
CHAR_END = "e"  # end-of-message marker
SERVICE_UUID = '713d0000-503e-4c75-ba94-3148f18d941e'
CHARACTERISTIC_UUID = '713d0003-503e-4c75-ba94-3148f18d941e'

def randomPalette():
    # Each channel ends up at 0, 64 or 128
    r = random.randint(0, 2)*64
    g = random.randint(0, 2)*64
    b = random.randint(0, 2)*64
    return (r, g, b)

class AnyDevice(gatt.Device):
    def services_resolved(self):
        super().services_resolved()
        self.animConfigMode(randomPalette())
        print("Config mode entered.")

    def btSend(self, data):
        # Locate the Blend's RX characteristic and write raw bytes to it
        device_service = next(
            s for s in self.services
            if s.uuid == SERVICE_UUID)
        rx_characteristic = next(
            c for c in device_service.characteristics
            if c.uuid == CHARACTERISTIC_UUID)
        rx_characteristic.write_value(data)

    def animConfigMode(self, pal):
        # command + parameter + end marker, per the draft protocol
        self.btSend(CHAR_CFG.encode() + CHAR_RED.encode()
                    + bytes(pal) + CHAR_END.encode())

class AnyDeviceManager(gatt.DeviceManager):
    def device_discovered(self, device):
        pass  # the Animators are picked out of devices() after the scan

manager = AnyDeviceManager(adapter_name='hci0')
print("Starting discovery...")
manager.start_discovery()
time.sleep(5)  # give BlueZ a moment to find the Animators
manager.stop_discovery()

animators = [d for d in manager.devices() if d.alias() == "LED Anim"]
print("Found animators:", len(animators))
print("Trying to connect...")
for dev in animators:
    device = AnyDevice(mac_address=dev.mac_address, manager=manager)
    device.connect()
manager.run()  # enter the event loop so the connections can complete


Which outputs this:

Starting discovery...
Found animators: 2
Trying to connect...
Config mode entered.
Config mode entered.


I'd love to post a video of the script in action, but I'm moving house right now and my ability to put together a decent scene is very limited - videos will come with my next blog posts, about OpenCV.


Naturally, the above code lacks a lot of critical features; among other things, it's perfectly possible that pure red or pure green will be generated, even though those colors are reserved for the LED strips' tips. That would confuse the computer vision algorithm and make it unable to notice the strip. We don't want that, and we also want a more robust BLE connection, but that's coming up later. The Arduino code running on the Blends is an even bigger mess; it will be published in an upcoming blog post dedicated to its inner workings. For now, I have the required minimum to comfortably test OpenCV, so I'll focus on that.
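One way I might patch the reserved-color hole, sketched below: keep drawing random palettes and reject any that resemble the endpoint markers. Treating "only the red channel lit" as reddish (and likewise for green) is my own heuristic, as is rejecting black, since an unlit strip would be invisible to the camera anyway:

```python
import random

def is_reserved_like(pal):
    """True if the palette could be confused with a strip-tip marker:
    only the red channel lit (reddish) or only the green channel (greenish)."""
    r, g, b = pal
    return (r > 0 and g == 0 and b == 0) or (g > 0 and r == 0 and b == 0)

def safe_random_palette():
    """Redraw random palettes until one is neither black nor marker-like."""
    while True:
        pal = tuple(random.randint(0, 2) * 64 for _ in range(3))
        if pal != (0, 0, 0) and not is_reserved_like(pal):
            return pal
```

Since only a handful of the 27 possible palettes are rejected, the retry loop terminates almost immediately in practice.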


Up next

Coming up are my first steps with the super-powerful computer vision library called OpenCV. I'll try to make the Beagle Bone analyze a manually uploaded photo of a scene featuring LED strips in config mode. If there's enough time, I'll try to add basic animation configuration features.