Table of Contents

Wearable Gesture Control - #1 Introduction

Wearable Gesture Control - #2 Install & Setup Template Application

Wearable Gesture Control - #3 Integrating LCD & Accelerometer

Wearable Gesture Control - #4 Collecting data for Edge Impulse


See all my posts here


In this blog, we will be collecting our gesture data so that we can train our model on Edge Impulse. First, we will set up a data forwarder so that we can send the sensor readings over. Then we will collect the data and train a model on it.

This is the process of data forwarding:



PSoC code for data forwarding


If the user button is held down when the device is turned on, the firmware enters data forwarding mode. This mode consists of an infinite loop that writes the accelerometer readings (x, y, and z axes) to the serial UART console.

To summarize, we sample the sensor at 100 Hz, and this stream is the data sent to Edge Impulse.
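As a rough sketch of the loop described above: each sample is printed as one comma-separated line, and a 10 ms delay between samples gives the 100 Hz rate. Note that `read_accel` and `delay_ms` are placeholder names here, not the actual PSoC BSP calls; the real firmware is in the linked commit.

```c
#include <stdio.h>
#include <stdint.h>

/* Format one accelerometer sample as the comma-separated line that
 * the Edge Impulse data forwarder parses (one line per sample). */
static int format_sample(char *buf, size_t len, int16_t x, int16_t y, int16_t z)
{
    return snprintf(buf, len, "%d,%d,%d", x, y, z);
}

/* On the device, the forwarding loop looks roughly like this
 * (read_accel and delay_ms stand in for the board support calls):
 *
 *   for (;;) {
 *       int16_t x, y, z;
 *       read_accel(&x, &y, &z);
 *       char line[32];
 *       format_sample(line, sizeof line, x, y, z);
 *       printf("%s\r\n", line);   // sent out over the UART console
 *       delay_ms(10);             // 10 ms period => 100 Hz sampling
 *   }
 */
```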


Here is a snippet of the code:

The full changes are in my GitHub commit:


Create a new project in Edge Impulse


This is the project we will forward the data into.

Install command line tools for data forwarding

Now that we have the board sending data over serial, we need to forward it using Edge Impulse's command line tool.


The instructions are found here on their official page:


Set up the tools according to their instructions.


Plug in the board and run the edge-impulse-data-forwarder command.

In my case, it prompted for my username, password and project name.

The 3-axis data was detected, and I named the axes x, y, and z respectively.
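The reason the forwarder asks for three names is that every serial line carries three comma-separated values, one per axis. A minimal illustration of that mapping (not the forwarder's actual source):

```c
#include <stdio.h>

/* Split one forwarded serial line into its three named axes.
 * Returns 1 on a well-formed "x,y,z" line, 0 otherwise. */
static int parse_line(const char *line, int *x, int *y, int *z)
{
    return sscanf(line, "%d,%d,%d", x, y, z) == 3;
}
```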


Click on the link as shown in the command line, or visit your project devices page. On Edge Impulse, the device has been recognized successfully.


Collect the data


Under Data acquisition, I collected 20 seconds each of flick, shake, and idle.


Shake is swinging the device back and forth.


Flick is a quick turn of the wrist.


Lastly, idle is simply placing the device on the table without touching it.


Training the model

I left most of the model settings at the recommended defaults. For your reference, these are my impulse settings:

On the next page, generate the features. We can see that the three classes are well separated in the graph, so the model should have no trouble distinguishing them.

Lastly, I trained the classifier and the accuracy came out to 100%. Given the limited amount of data, the model has probably overfitted. However, I will proceed with it for the purpose of this project as a proof of concept.


Live classification of the model


Now that the model is trained, I plugged in the board and started the data forwarder again, to check whether this model can classify gestures properly. I started a testing sample and performed a flick gesture.

These are the results. You can see that the flick gesture was detected successfully in the middle section, and the sections before and after are detected as idle.
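Under the hood, live classification labels each window with whichever class receives the highest score. A small sketch of that decision; the label names match our three classes, but the score values below are made up for illustration:

```c
#include <stddef.h>

/* Return the index of the highest classifier score for one window. */
static size_t argmax(const float *scores, size_t n)
{
    size_t best = 0;
    for (size_t i = 1; i < n; i++)
        if (scores[i] > scores[best])
            best = i;
    return best;
}

/* Example (illustrative scores, not real model output):
 *   const char *labels[] = { "idle", "flick", "shake" };
 *   float scores[] = { 0.05f, 0.92f, 0.03f };
 *   labels[argmax(scores, 3)]  -> the window is labelled "flick"
 */
```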





With the model up and running, in the next blog, I will deploy the model code to the PSoC 6 board.