Happy new year!
I have returned after a long hiatus! As such, this will be a correspondingly lengthy post.
As mentioned last time, we will now discuss the steps required to get WiFi up and running, and sort out the software to be used. For the WiFi bit, everything is covered at this address, so scroll down to the section titled “Configuring the Edison” and replace the network name and password as necessary. There’s nothing more to it.
This robot is to have video streaming capabilities, as well as the ability to drive its motors forward and backward based on control signals it receives. Two distinct pieces of software are therefore needed.
The first would be for the video streaming capability. I chose to use mjpg-streamer as it’s easy to use and fairly resource-friendly. In addition, a diverse range of software supports its output stream format, from browsers to VLC. The best way to use this program would be to obtain the sources and compile it, but before doing so, there are certain package dependencies that need to be met first.
The second program would be responsible for receiving commands from the user and instructing the motors accordingly. In terms of receiving commands, since operation/control is performed over a network, there are a number of choices, depending on the desired functional topology, i.e. client-server or peer-to-peer. The former imposes some limitations depending on the protocol used between client and server, e.g. if using HTTP, then polling must be used since the communication flow is request-centric. Peer-to-peer topologies are a bit less troublesome and would be a natural choice for a project of this nature.

However, I decided to use MQTT as the control protocol, which adheres to a physical client-server topology but a flexible functional topology (depending on how it is used, it can behave as client-server or peer-to-peer). There are MQTT libraries for many of the popular programming languages; it is fast, lightweight and easy to configure, and requires minimal programming to use. In addition, using MQTT would allow me to easily change the type of controller used, e.g. I could use desktop, mobile or web applications as controllers so long as they send the right message over MQTT.

In terms of implementation, MQTT requires a server program (technically called a broker) which mediates all communications between the clients connected to it. In my physical topology, this broker would run on the Edison, and the controller program would connect to it. The listener program (which receives commands from the controller program) would also connect to the broker and perform any actions requested by the controller. There are many choices of broker, but I elected to use the open-source Mosquitto broker, which I've used before. I'd need to compile it as well (the version in the repos is always older than the latest release), which means there are more dependencies to be resolved.
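To make the control flow concrete, here's a minimal sketch (in Python) of the message handling the listener program might perform. The topic name (robot/drive) and the "direction:speed" payload format are assumptions of mine for illustration, not anything fixed by MQTT or by this project.

```python
# Sketch of how the listener might decode a control message arriving
# over MQTT. The topic ("robot/drive") and the "<direction>:<speed>"
# payload format are hypothetical choices for illustration.

def parse_command(payload):
    """Split a 'direction:speed' payload into (direction, speed)."""
    direction, _, speed = payload.partition(":")
    if direction not in ("forward", "backward", "stop"):
        raise ValueError("unknown direction: %s" % direction)
    # Speed is a percentage (0-100); a bare "stop" carries no speed.
    return direction, int(speed) if speed else 0

# A controller would publish e.g. "forward:75" to robot/drive, and the
# listener's on-message callback would hand the payload to parse_command().
```

The nice thing about this arrangement is that the controller can be anything that can publish to the broker, which is exactly why MQTT was chosen.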
In terms of actually controlling the motors, it would be necessary to leverage the Edison’s PWM and GPIO pins, the former to determine motor speed, the latter to determine the direction. Intel provided the MRAA library (pronounced “em-raa”) which is an abstraction layer for controlling board hardware such as SPI, I2C, GPIO and PWM peripherals. In the spirit of getting my hands dirty, I decided to go this way since I’d never used it before and this seemed like a good time to start. Once again, MRAA would need to be compiled from source, but dependencies would need to be satisfied first.
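The PWM/GPIO split described above can be sketched as a small mapping function: a signed speed becomes a GPIO level for direction and a PWM duty cycle for speed. The pin semantics below (1 = forward on a typical H-bridge direction input) are an assumption for illustration; the real code would push these values out through MRAA calls.

```python
# Sketch of the PWM/GPIO split: one GPIO pin sets the motor direction,
# the PWM duty cycle sets the speed. The 1 = forward convention is a
# hypothetical H-bridge wiring choice, not something MRAA dictates.

def motor_outputs(speed):
    """Map a signed speed in [-100, 100] to (direction_pin, duty_cycle).

    direction_pin: 1 = forward, 0 = backward
    duty_cycle:    PWM duty as a fraction in [0.0, 1.0]
    """
    if not -100 <= speed <= 100:
        raise ValueError("speed out of range")
    direction = 1 if speed >= 0 else 0
    duty = abs(speed) / 100.0
    return direction, duty
```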
So let’s get started.
INSTALLING BUILD TOOLS
At the barest minimum, you'd need the following packages to build basically anything at all: make, build-essential, gcc and cmake. Therefore, this process began with a "sudo apt-get update", followed by "sudo apt-get install make build-essential gcc cmake git". The first command completed without a hitch. The second gave the following error:
Weird. I figured I didn't need the bcm43440-bt package because I wasn't really using Bluetooth and had already inserted the necessary Bluetooth files into the image. The same went for u-boot-fw-utils, since I'd already finished with u-boot and could easily interact with it at boot time. So I followed the recommendation, which was to remove the two packages. Once that was done, I was able to install the build tools without further trouble. This is how it went down.
Next, it was onto compiling mjpg-streamer and the other programs, which will be covered in the next half.
Next, the mjpg-streamer sources are required. They can be found here. I downloaded them as an archive, unpacked them, and transferred the files to the Edison using WinSCP. Then the dependencies needed sorting out: libjpeg-dev and libv4l-dev, installed with "sudo apt-get install libjpeg-dev libv4l-dev". Once that completes, a small fix needs to be made: apparently a symbolic link to one of the Video4Linux (V4L) headers is needed to get mjpg-streamer to compile. The link can be created using the following command:
“sudo ln -s /usr/include/linux/videodev2.h /usr/include/linux/videodev.h”
Once that is done, we can go ahead and compile mjpg-streamer. Within the mjpg-streamer source tree (i.e. the folder containing the plugins, scripts and www folders), execute the "make" command. For some reason, the compilation failed partway through. The error wasn't really clear to me, but the failure appeared to occur while compiling something related to GSPCA, which is not used here. Here's an image showing the error.
Despite that, the main mjpg-streamer binary, the UVC input plugin (which is what is used to capture video from USB video devices like webcams) and the HTTP output plugin (which is what is used to provide a browser-viewable MJPG stream) were successfully compiled. I plugged a webcam in, and executed the following command (from within the mjpg-streamer source tree):
./mjpg_streamer -i "./input_uvc.so -f 30 -r 640x480" -o "./output_http.so -w ./www"
This provides a webcam stream at 30 FPS with a resolution of 640x480, accessible via the Edison's IP address on port 8080 (the HTTP output plugin's default).
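For anyone scripting against the stream rather than using a browser, the HTTP output plugin serves its video at the ?action=stream endpoint and single frames at ?action=snapshot. A tiny helper for building those URLs:

```python
# Helper that builds the URLs served by mjpg-streamer's output_http
# plugin: ?action=stream for the MJPG stream, ?action=snapshot for a
# single JPEG frame. Port 8080 is the plugin's default.

def mjpg_url(host, action="stream", port=8080):
    if action not in ("stream", "snapshot"):
        raise ValueError("unknown action: %s" % action)
    return "http://%s:%d/?action=%s" % (host, port, action)
```

For example, mjpg_url("192.168.1.10") yields the stream URL you could hand straight to VLC.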
Next, I navigated to the Edison's IP address on port 8080 by entering http://<edison-ip>:8080 and was greeted by the mjpg-streamer interface. Clicking the Stream option shows the live webcam stream.
Step one, done.
Next, we need to compile the Mosquitto MQTT broker, which will permit the use of the MQTT protocol as described previously. To do this, we have to get the dependencies, some of which have to be compiled from source themselves first. The installable dependencies are openssl, libssl-dev, libc-ares-dev and uuid-dev, so a "sudo apt-get install openssl libssl-dev libc-ares-dev uuid-dev" should do the trick. Frankly, you can compile Mosquitto without any hassle at this point. However, I wanted it to have Websockets support (for reasons that will be revealed later), so I additionally needed libwebsockets, which I had to compile from source. Luckily, it was a straightforward process and is documented here. I had to get the libwebsockets sources using git, though (for some reason direct downloads weren't working), so that's something to watch out for. Once the compilation completes and you've run a "make install", we can move on to the final step.
In the Mosquitto source tree, there's a file called config.mk. With your favorite editor (nano works great), locate the line that reads WITH_WEBSOCKETS:=no and change that no to a yes. Once that's done, save and run "make" from within the source tree. When that completes, you should have the mosquitto broker binary inside a folder called src in the main source tree. To do a test run, type ./mosquitto from within the src folder. If everything went well, you should see messages similar to those below:
This may not be particularly useful at the moment, so a Ctrl-C will end the process. Mosquitto is a broker, so in practice you probably want to run it at boot or in the background/as a daemon. We’ll talk more about it later.
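For reference, once Mosquitto is built with Websockets support, turning it on is just a few lines in the configuration file passed via "mosquitto -c". The port numbers below are only typical choices, not requirements:

```
# Standard MQTT listener
listener 1883

# Additional listener speaking MQTT over Websockets
listener 9001
protocol websockets
```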
Step Two, complete.
For the final bit, we will need to set up the MRAA library, which will permit us to write code to control the Edison's hardware. We'll need to install the dependencies first – there's only one this time, and it's called swig3.0. Once that's done, we can pull the latest MRAA sources from git using the command:

"git clone https://github.com/intel-iot-devkit/mraa.git"
Once we have the sources, we can follow the steps as documented here to complete the build. Since Python and Node.js support weren't needed, I opted to disable them in the cmake step. When the build completed, I navigated to the build/examples directory within the MRAA source tree and ran the blink_onboard example (note that root privileges are needed for this). An LED on the Edison board began to blink, and with that, step three was complete.
The next post will begin to talk about the hardware components.