Robots with Wheels

Another step towards the completion of RoboBobLet is to add a camera system. It would be possible to add a simple digital camera to RoboBobLet and use the Arduino to do some basic image processing, but that would take more time than is available. An alternative is the Pixy camera system (Pixy (CMUcam5) – Charmed Labs ), which uses an on-board DSP to do all the heavy image processing. The default Pixy can easily be set to recognise blocks of colour and to provide this information directly to an Arduino (Hooking up Pixy to a Microcontroller (like an Arduino) - CMUcam5 Pixy - CMUcam: Open Source Programmable Embedded Color …). The Pixy camera is supplied with a ribbon cable that connects it directly to the ICSP connector on the Nano (and, I think, other Arduinos) and uses the SPI interface. After downloading the Pixy library ZIP file and adding it to the Arduino IDE, a number of example sketches become available; I used the Hello_World one. It reports the number of identified colour blocks, the (x, y) coordinates of their centres, and their widths and heights. An example of the data obtained is shown below.
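To give an idea of the shape of this data, here is a small C++ sketch of the block structure the Pixy library hands to the Arduino. The field names follow the Pixy Arduino library's Block structure (signature, x, y, width, height); the numeric values and the helper function are purely illustrative, not measurements from RoboBobLet.

```cpp
#include <cstdint>
#include <cstdio>

// Mirrors the Block structure used by the Pixy Arduino library:
// a trained colour signature plus the bounding box of the detection.
struct Block {
    uint16_t signature; // trained colour signature (1-7)
    uint16_t x;         // centre x of the bounding box (pixels)
    uint16_t y;         // centre y of the bounding box (pixels)
    uint16_t width;     // bounding box width (pixels)
    uint16_t height;    // bounding box height (pixels)
};

// Print one block in a Hello_World-like format.
void printBlock(const Block& b) {
    std::printf("sig: %u x: %u y: %u width: %u height: %u\n",
                b.signature, b.x, b.y, b.width, b.height);
}

// Count how many detected blocks carry a given signature -- useful
// when only one trained signature (the blue bricks) is of interest.
int countSignature(const Block* blocks, int n, uint16_t sig) {
    int count = 0;
    for (int i = 0; i < n; ++i)
        if (blocks[i].signature == sig) ++count;
    return count;
}
```

On the Nano itself the equivalent data arrives via the Pixy library over SPI; this standalone version just shows the fields being worked with.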

 

 

Once every second all the recognised block data is transferred to the Nano, with the Hello_World programme simply displaying it on the IDE Serial Monitor. The figure above shows three blocks being identified, all with sig (signature) 2. Signatures have to be trained beforehand, and I had trained the Pixy to recognise small blue wooden blocks measuring 30 x 60 mm. The figure below shows the three blocks being recognised.

 

 

Although it does not look like it, these three blocks were all placed at the same distance of 14 cm from the camera. The two outer blocks look slightly further away, but this is a feature of the camera lens and the lighting used. The whole system is shown in the video below. I have added a white front piece to the front of the Pixy camera, produced using my 3D printer and TinkerCAD, to which I will later fix the laser rangefinder that is already on RoboBobLet. Hopefully the two sensors will work together.

 

 

The Pixy camera provides a USB interface as well as the SPI (and other serial) interfaces, and this works with the PixyMon programme supplied, which allows the image being collected by the Pixy camera to be viewed. Unfortunately the USB interface to PixyMon and the SPI interface to the Nano cannot be used at the same time, so it is not possible to be completely certain about which objects are being recognised. I used blue bricks as blue is not a common colour in most environments, but you will notice that a pink ruler is placed next to the blocks, because the camera was picking up the blue strip on my laptop and causing problems. Differing lighting conditions also cause great problems, as is the way with most camera systems.

 

When thinking about how to use the data from the Pixy camera, it seemed it might be possible to determine the range to the blue objects from the width (or height) of their images. This will only work if the detected blue object really is one of the bricks. The aspect ratio (height to width) of a brick should be consistent at all distances, so it might be possible to use it to confirm that a detected object is a brick. Once that is done, the width (or height) of the blue brick in the image can be used to determine distance. The screen dump below shows an (Excel Online) spreadsheet of the aspect ratio of the blue brick at different distances.

 

 

The distances range from 14 cm to 53 cm at intervals of 3 cm. The graph shows a fairly consistent aspect ratio between 2.0 and 2.2; the actual aspect ratio is 2.0. It was necessary to insert the (0,0) point in order to obtain a nice graph. The online version of Excel does not seem to provide many options for formatting graphs, so this was the best I could achieve. It would seem that there is reasonable confidence in assuming that any blue object detected by the Pixy camera with an aspect ratio in the region of 2.0-2.2 is one of the blue bricks. Then, by using the width (or height, or possibly both), it should be possible to use the information from the Pixy camera to provide some information about the distance. This is illustrated in the following screen dump, with both the width and height plotted against distance.
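The aspect-ratio test can be sketched as a small filter function. The 2.0 to 2.2 band comes from the measurements in the graph; treating it as a hard accept/reject band (rather than, say, a tolerance around 2.0) is my choice for illustration.

```cpp
// Returns true if a detected block's height-to-width aspect ratio
// falls in the band measured for the blue bricks (about 2.0 to 2.2).
bool looksLikeBrick(unsigned width, unsigned height) {
    if (width == 0) return false;          // avoid division by zero
    double aspect = static_cast<double>(height) / width;
    return aspect >= 2.0 && aspect <= 2.2; // measured band for the bricks
}
```

A block 45 pixels wide and 92 pixels high (aspect ratio about 2.04) would pass this test; a square 45 x 45 pixel detection would be rejected as not being a brick.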

 

 

There is clearly a good correlation between distance and object width and height. All that is needed now is to amend the programme in the Nano to make use of this; a lookup table might be the easiest approach. As the laser rangefinder will be fixed directly onto the Pixy camera, it should also be possible to use its distance information to confirm that from the camera.
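A lookup-table approach might look like the sketch below, with linear interpolation between calibration points. The table values here are invented placeholders, not the measured calibration data; in practice they would be replaced with the widths recorded at 3 cm intervals between 14 cm and 53 cm.

```cpp
#include <cstddef>

// One calibration entry: block width in pixels at a known distance.
struct CalPoint {
    int width_px; // measured block width (pixels)
    int dist_cm;  // known distance (cm)
};

// Illustrative values only -- real entries would come from calibrating
// the Pixy at known distances. Width shrinks as distance grows, so the
// table is sorted by descending width.
static const CalPoint kTable[] = {
    {90, 14}, {60, 20}, {45, 26}, {36, 32}, {30, 38}, {25, 44}, {22, 53},
};
static const std::size_t kTableSize = sizeof(kTable) / sizeof(kTable[0]);

// Estimate distance from block width by linear interpolation between
// the two nearest table entries; clamp outside the calibrated range.
double distanceFromWidth(int width_px) {
    if (width_px >= kTable[0].width_px) return kTable[0].dist_cm;
    for (std::size_t i = 1; i < kTableSize; ++i) {
        if (width_px >= kTable[i].width_px) {
            const CalPoint& a = kTable[i - 1];
            const CalPoint& b = kTable[i];
            double t = double(a.width_px - width_px) /
                       double(a.width_px - b.width_px);
            return a.dist_cm + t * (b.dist_cm - a.dist_cm);
        }
    }
    return kTable[kTableSize - 1].dist_cm; // beyond far end of table
}
```

The same structure would work on the Nano, and the laser rangefinder reading could then be compared against the returned estimate as a sanity check.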

 

The next step in RoboBobLet's evolution is to integrate the Pixy camera into the chassis and get all the sensors working together. There is still time to do this before the end of this challenge (hopefully).