In this post, I go down the path of creating my own project using the Ultra96 V2 board. I have created a semi-wireless color picker for my kitchen tile backsplash based on the Ultra96, Click Mezzanine, and LSM6DSL accelerometer/gyroscope. Based on the orientation and movement of the Ultra96, the colors in my kitchen backsplash change accordingly. I have two LED strips embedded beneath the glass tile in the kitchen, controlled by a Freetronics EtherTen (an Arduino + Ethernet shield). I take readings from the LSM6DSL module on the Click board, convert them to RGB values, then send them via MQTT over my network. The EtherTen listens on two topics for these messages and displays the values accordingly.

 

Here is the "before" video just showing the tiles. See the "after" video at the far, far end of this post.

 

The Process:

 

I had a few ideas for what I'd like to do for this section of the program, and I tried to tackle multiple projects, but I was only able to complete the first one within the time period.

 

I started by expanding on the 'sensor interface' program that we worked with in various lessons to read more than just the temperature data. This alone ended up being more involved than I first figured it would be. It was all done in the SDK without the Petalinux setup (which was the ultimate goal); the idea was to start simple and then build it into a bigger project.

 

The simple part was expanding the byte arrays for the TX buffer and RX buffer sent to the accelerometer/gyro from 3 bytes to 15 bytes so that we could read all the required values.

 

    u8 cmd[3];
    u8 rx[3];
    //This became:
    u8 cmd[15];
    u8 rx[15];

 

Then I tried sending the longer request for data over the SPI port, but kept getting zeros back. See the code sample below.

        XSpiPs_SetSlaveSelect(&SpiInstance, 0x01);
        XSpiPs_PolledTransfer(&SpiInstance, cmd, rx, 15); // This didn't work (by itself)
        XSpiPs_SetSlaveSelect(&SpiInstance, 0x00);

 

The second line of the sample code above does a 'write --> read' on the SPI bus using the write buffer 'cmd' and the read buffer 'rx', with a length of 15 bytes (1 'command' byte and 14 'payload' bytes). I was thinking that this would work since the output registers for the accelerometer and gyroscope readings are adjacent to the temperature registers. (The datasheet is located here.)
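For anyone following along, here is a sketch of how that 15-byte command buffer can be set up. This is my reconstruction from the LSM6DSL datasheet (the function name and macros are mine, not from the sample code): bit 7 of the first byte selects a read, bits 6..0 are the start address, and with address auto-increment (on by default) one transfer walks OUT_TEMP_L (0x20) through OUTZ_H_XL (0x2D), i.e. 2 temperature + 6 gyro + 6 accel bytes.

```c
#include <stdint.h>
#include <string.h>

#define LSM6DSL_READ       0x80  /* bit 7 set = read transaction */
#define LSM6DSL_OUT_TEMP_L 0x20  /* first output register of the block */

/* Fill the 15-byte TX buffer: one command byte plus 14 dummy bytes
   that clock the sensor data out. */
void build_burst_read(uint8_t cmd[15])
{
    memset(cmd, 0, 15);
    cmd[0] = LSM6DSL_READ | LSM6DSL_OUT_TEMP_L; /* 0xA0 */
}
```

With this layout, rx[1]/rx[2] hold the temperature bytes, rx[3]..rx[8] the gyro, and rx[9]..rx[14] the accelerometer, which matches the indexing used further down.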

However, after some futzing around and still getting zeros back instead of real data, I realized that the zeros were coming from the gyroscope outputs at locations 0x22 and 0x23, which were the first ones I tried to use. I had to RTFM for a little while before I saw that the sample code only configured the accelerometer, not the gyroscope:

 

//started as:
    cmd[0] = (u8) 0x10;
    cmd[1] = (u8) 0xA0;
    XSpiPs_SetSlaveSelect(&SpiInstance, 0x01);
    XSpiPs_PolledTransfer(&SpiInstance, cmd, rx, 2);
    XSpiPs_SetSlaveSelect(&SpiInstance, 0x00);

// this became:
    cmd[0] = (u8) 0x10; // register address (CTRL1_XL)
    cmd[1] = (u8) 0xA0; //JO This enables the Accel (Linear Acceleration)
    cmd[2] = (u8) 0xA0; //JO Added this to enable the Gyroscope (lands in CTRL2_G via auto-increment)
    XSpiPs_SetSlaveSelect(&SpiInstance, 0x01);
    XSpiPs_PolledTransfer(&SpiInstance, cmd, rx, 3);
    XSpiPs_SetSlaveSelect(&SpiInstance, 0x00);

 

This code snip writes the value 0xA0 to register 0x10 (CTRL1_XL), which configures the accelerometer to run at 6.66 kHz as per the manual.

That meant that the gyroscope was disabled. To enable it, I wrote 0xA0 to register 0x11 (CTRL2_G) so that it would run at 6.66 kHz as well. Because the register address auto-increments on the LSM6DSL, the single three-byte write above hits both 0x10 and 0x11.

I then used a simple method to convert the 2's complement data into a real number. Note that this code is basically a straight copy/paste from the sample code, but I moved it into its own stand-alone method for re-usability.

 


s16 Convert_Value(u8 low, u8 high){
     u16 value = 0;
     s16 opValue;
     // Merge High and Low bytes into temp word
     value = ((high << 8) | (low));


     // Check for negative
     if ((value & 0x8000) == 0) //msb = 0 so not negative
     {
          opValue = value;
     } 
     else {
          // Otherwise perform the 2's complement math on the value
          opValue = (~(value - 0x01)) * -1;
     }
     //printf("low: %x High: %x\r\n", low, high);
     return opValue;
}
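As an aside, the same conversion can be written as a single cast, since the sensor bytes are already two's complement. A sketch, assuming u16/s16 correspond to the C99 fixed-width types (the function name is mine):

```c
#include <stdint.h>

/* Assembling the bytes into a uint16_t and casting to int16_t reinterprets
   the bit pattern as two's complement, which is exactly what Convert_Value
   computes by hand. */
int16_t convert_value_cast(uint8_t low, uint8_t high)
{
    return (int16_t)(((uint16_t)high << 8) | low);
}
```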

 

Once I had some actual values coming in, I had to convert them to real-world numbers. The datasheet gives the conversion factors based on the configured sensitivity range.

 

So then I take the output value from the conversion method and scale it according to the data sheet.

 

        Gyro_X = Convert_Value(rx[3], rx[4]);
        Gyro_Y = Convert_Value(rx[5], rx[6]);
        Gyro_Z = Convert_Value(rx[7], rx[8]);


        Accl_X = Convert_Value(rx[9], rx[10]);
        Accl_Y = Convert_Value(rx[11], rx[12]);
        Accl_Z = Convert_Value(rx[13], rx[14]);


        float LA = 0.061; // linear acceleration (accelerometer) sensitivity in mg/LSB (±2 g range)
        float AX = Accl_X * LA /1000;
        float AY = Accl_Y * LA /1000;
        float AZ = Accl_Z * LA /1000;


        float GSensitivity = 8.75; // gyroscope sensitivity in mdps/LSB (250 dps range)
        float gX = Gyro_X * GSensitivity /1000;
        float gY = Gyro_Y * GSensitivity /1000;
        float gZ = Gyro_Z * GSensitivity /1000;

 

 

Here is a snip showing the readings coming in, converted, then printed to the serial port. The Ultra96 was sitting still on the desk, but up on its side, which is why Z is about zero and X is around 1 G.

 

This was all done using the SDK with the hardware created in a previous lesson.

 

The next step was to bake this into a Petalinux image and use MQTT to send the data out.

 

The Next Step

Once I had the above work completed, I went to the Petalinux lessons and started modifying them. These base lessons had us read temperature data from the Click module and post it to the IBM Bluemix MQTT broker. I was /thinking/ that an easy modification would be to just redirect the MQTT connection to a local broker and do something with the data. It ended up taking longer than expected, but I did get it working.

 

I started by commenting out the MQTT portions of the code and doing just the expanded reading of the Click module (LSM6DSL accelerometer/gyro). This was more complicated than just copying and pasting what I did above, since the exact implementation is quite different in the Petalinux version. It took a few hours to make the code changes work in Petalinux.

 

The next step was getting MQTT working. The sample code uses the Paho MQTT library with a wrapper library, but the calls in the Petalinux version differ from those in the Software lessons. I really struggled to find documentation on the libraries used and had to dig through the source code manually to get things working. For instance, the wrapper's comments note:

 

* @param auth-method - Method of authentication (the only value currently supported is "token")
* @param auth-token - API key token (required if auth-method is "token")

 

I wanted to use a simple username/password combo with my existing MQTT server, but the wrapper library didn't appear to support this. Over a few evenings, I installed a new Mosquitto MQTT broker on the VM and set up TLS with server and client keys to try to comply with the library as it was presented, but I could never get it to work properly. I eventually was able to back out some of the code in the wrapper and use a normal username/password for authentication.
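For reference, the broker side of a plain username/password setup is only a couple of lines. This is a sketch of a minimal Mosquitto configuration (the file paths and the "ultra96" user name are my own examples, not from the lessons):

```shell
# Create a password file with one user (prompts for the password):
sudo mosquitto_passwd -c /etc/mosquitto/passwd ultra96

# Then point the broker at it, e.g. in /etc/mosquitto/conf.d/local.conf:
#   listener 1883
#   allow_anonymous false
#   password_file /etc/mosquitto/passwd
```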

 

At this stage, I was able to read the accelerometer and gyro and post to MQTT; so progress was moving along!

 

RGB 4 U2?

The next step was converting the value of the two sensor components into RGB values. These values would be used on each of two RGB LED strips in my kitchen. I wrote a simple method to take in the various accelerometer and gyroscope values and scale them from their source range into a range of 0-255. I also wrote in an offset but found out later that I wouldn't need it. This would allow a neutral gyroscope to be at "128" in Red, Green, Blue and go up and down from there.

 


int scale(float min, float max, float current, int offset){
    //scale the values to a range of 0 to 255 for RGB values
    //note: this formula only behaves as intended for symmetric ranges (min == -max)
    float range = max - min; //range of values
    float value = (current + range) / range - 0.5; // scale value within the range as a percent
    value = (value * 255.0) + offset; // scale to a 0-255 value and add in an offset

    if(value > 255) value = 255;
    if(value < 0) value = 0;
    return (int)value;
}

//sample calls:
        // assume acceleration from -1.00 to +1.00 G
        // assume gyro from -15.00 to +15.00 dps
        // scale the values from 0 to 255
        int r_accel = 255 - scale(-1.0, 1.0, values[4], 0);
        int g_accel = 255 - scale(-1.0, 1.0, values[5], 0);
        int b_accel = 255 - scale(-1.0, 1.0, values[6], 0);

        int r_gyro = scale(-15.0, 15.0, values[1], 0);
        int g_gyro = scale(-15.0, 15.0, values[2], 0);
        int b_gyro = scale(-15.0, 15.0, values[3], 0);
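To sanity-check the scaling, here is a self-contained copy of the method with a few worked values (assuming the symmetric min = -max ranges used in the calls above):

```c
/* Copy of scale() for checking a few values by hand.
   The formula assumes a symmetric range (min == -max). */
int scale(float min, float max, float current, int offset)
{
    float range = max - min;
    float value = (current + range) / range - 0.5; /* position in range as a fraction */
    value = (value * 255.0) + offset;
    if (value > 255) value = 255;
    if (value < 0) value = 0;
    return (int)value;
}
/* Worked examples:
   scale(-1.0, 1.0,  0.0, 0) -> 127  (a level axis sits near mid-scale)
   scale(-1.0, 1.0,  1.0, 0) -> 255
   scale(-1.0, 1.0, -1.0, 0) -> 0    */
```

So a board lying flat produces mid-scale color components, and tilting it pushes each channel toward 0 or 255.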

Post to MQTT

Here are the final calls that I make to post to MQTT.

 

 sprintf(hex1, "%02X%02X%02X", r_accel, g_accel, b_accel); //format the R, G, B values for accel. into a Hex string
 sprintf(hex2, "%02X%02X%02X", r_gyro, g_gyro, b_gyro); //same as above but for the Gyro.

//post to MQTT
rc = publishEvent(&client, "RGBhex1", "json", (unsigned char *)hex1, QOS0);
rc = publishEvent(&client, "RGBhex2", "json", (unsigned char *)hex2, QOS0);

rc = publishEvent(&client, "status", "json", (unsigned char *)json_str, QOS0);

 

This will post a message with a hex-encoded version of the accelerometer data for Red, Green, and Blue to the topic "RGBhex1" and the gyroscope data to the topic "RGBhex2". I also post a JSON string to a third topic for debugging. The original intent was to use just the JSON message, but I ran into the limited resources of the Arduino Uno: once I included the Ethernet, MQTT, FastLED, and JSON libraries, over 75% of the resources were used without any other code running. I played with options for another evening or two before I dropped the JSON string and just used the 6-character hex encoding. This got resource usage down to the ~63% range on the Uno (EtherTen).
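On the receiving side, the six-character payload just needs to be split back into three bytes. Here is a sketch of how that can look; this is my assumption of what the EtherTen code does, written in plain C, and parse_rgb_hex is a name I made up:

```c
#include <stdlib.h>
#include <string.h>

/* Parse a "RRGGBB" hex payload (as published to RGBhex1/RGBhex2) into
   its three color bytes. Returns 0 on success, -1 on a bad length. */
int parse_rgb_hex(const char *hex, unsigned char *r, unsigned char *g, unsigned char *b)
{
    if (strlen(hex) != 6) return -1;
    char buf[3] = {0};                         /* two hex chars plus NUL */
    memcpy(buf, hex, 2);     *r = (unsigned char)strtol(buf, NULL, 16);
    memcpy(buf, hex + 2, 2); *g = (unsigned char)strtol(buf, NULL, 16);
    memcpy(buf, hex + 4, 2); *b = (unsigned char)strtol(buf, NULL, 16);
    return 0;
}
```

The resulting r/g/b values can then go straight into the LED library's color type.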

 

Embedding in Petalinux

I really had planned to use a fully embedded Petalinux image, and I got pretty close. This would have been a custom kernel build with my application pre-installed. I would have still needed to add it to a start-up routine so that it comes up by default, but I never got that far. I was able to follow the Petalinux lessons 1-6, substituting my application for the one from the lessons. The issues that I ran into were numerous for someone with limited experience in this development environment. As a start, I changed the SDK to use the Production build instead of Debug. This got the build size down much smaller, as we learned in the lessons, but it came back to bite me later (read on!) just like in the lessons (see this blog post).

 

I spent multiple evenings working with the BitBake file and the Makefile to get the "includes" correct, but the compiler kept puking.

 

This is the relevant portion of the BitBake file that I used, where I had to manually list the files for the #includes mentioned in the code.

 

SRC_URI = "file://JamesMain.c \
           file://iotfclient.h \
           file://iotfclient.c \
           file://paho/MQTTClient.c \
           file://paho/MQTTClient.h \
           file://paho/MQTTPacket.h \
           file://paho/MQTTConnect.h \
           file://paho/MQTTSubscribe.h \
           file://paho/MQTTUnsubscribe.h \
           file://paho/MQTTPublish.h \
           file://paho/MQTTLinux.h \
           file://paho/MQTTFormat.h \
           file://paho/StackTrace.h \
           file://paho/MQTTLinux.c \
           file://Makefile \
           "
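For completeness, here is roughly what the rest of a recipe around that SRC_URI looks like. This is a sketch based on the stock PetaLinux app template (a real recipe also needs SUMMARY, LICENSE, and LIC_FILES_CHKSUM entries for your project):

```
S = "${WORKDIR}"

do_compile() {
    oe_runmake
}

do_install() {
    install -d ${D}${bindir}
    install -m 0755 jamesMqtt ${D}${bindir}/
}
```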

And here is the complete Makefile that I used:

 

APP = jamesMqtt

# Add any other object files to this list below
APP_OBJS = JamesMain.o
APP_OBJS += iotfclient.o
APP_OBJS += paho/MQTTClient.o

all: build

build: $(APP)

$(APP): $(APP_OBJS)
	$(CC) -I$(INC_DIR) $(LDFLAGS) -o $@ $(APP_OBJS) $(LDLIBS)

clean:
	-rm -f $(APP) *.elf *.gdb *.o

I added the -I$(INC_DIR) (include directory) flag thinking that it would automatically grab the subdirectories, but it didn't work for me; "petalinux-build" kept failing.
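In hindsight, one likely culprit (untested; purely my assumption) is that the compiler's -I flag only adds the named directory itself and never recurses, so the paho/ headers would need their own entry on the compile flags, something like:

```
# Fragment only; INC_DIR comes from the template Makefile.
# -I does not recurse, so each header directory must be listed:
CFLAGS += -I$(INC_DIR) -I$(INC_DIR)/paho
```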

 

So where did I end up?

I finally succumbed to the deadlines of time and effort and completed the project without the Petalinux embedded image. I am able to run the new code to read the Click board, post to MQTT, and have an Arduino read the messages and update the tile backsplash in our kitchen.

This post (and project) has gone on long enough (I actually tracked my hours: over 30 for this project and 80 for the total training), so here is a video of the final system working.

 

The video already!

Here is the video already! Thanks for reading this far!

 

Reflections

Did I get what I wanted out of this training? Were my projects a success? The answer is "sorta, but not really." In the original Path to Programmable, my final project was about 1000 times less than what I wanted (a line-following robot), as it was only an application to talk between the PS and PL. I have no doubt expanded on that based solely on the training provided: I am interfacing via SPI to a peripheral and posting to MQTT. This is largely based on sample code, but that is how most things get started. I encountered a lot of issues along the way, but was still able to come up with a functional design. I really, really wanted to use the FPGA side much more, but I wasn't able to get that far in this training. I was still able to use the PL and Petalinux quite a bit and got very familiar with the SDK and its remote debugging abilities.

 

As with the original Path to Programmable from last year, this course is set up a lot like one in college, with a 'lecture' portion and a 'lab' portion. I don't feel like I learned quite as much as I might have from a true course, but that may just be because of my lack of FPGA/SoC and embedded experience going into it. I have followed the other students' blogs, and they went much farther than I was able to in much shorter time periods. If I had been told it would take over 80 hours to read a sensor and post to MQTT, I would have thought that was crazy; but here I am.

 

For those interested, here is the Arduino Encoder Simulator that I hacked together for the second project of this training, which I wasn't able to get to. The idea was to create (using the FPGA) a system to look at the A/B/Z pulses coming off of an encoder and do some analysis on them: count the pulses per rev, look for dropped pulses on A versus B, and other things. This project will have to wait for next time, but I did start on it for this training, and this is the source code.

 

I'd like to send out my thanks to Element14 and Avnet/Xilinx for putting this together and taking the chance on selecting me to participate for the second year.

 

Until next time,

- James O'Gorman