Robots with Wheels



The journey of RoboBobLet in the Robots with Wheels challenge has come to an end for me, as there is no more time left. I added a VL53L0X laser rangefinder to the 3D printed front piece fitted to the front of the Pixy camera and connected everything to the Nano. Apart from one head-scratching moment, when one of the mounting pillars for the stripboard was shorting two of the SPI signals (which prevented the Pixy camera from collecting data), this final assembly went together quite easily. I would like to say this was the result of good design, but I think it was more likely just a coincidence. Below is a photograph showing all the connections at the back of the Pixy camera. You cannot see what is connected to what, but I thought I would include this image to give some idea of the wiring complexity of this final assembly.

Showing the Wiring Complexity of the Completed RoboBobLet Assembly


The next image shows the front of the Pixy camera. The white part is the 3D printed front piece that fixes to the mounting holes in the Pixy camera. The lens of the Pixy camera is the squarish black plastic object at the top, and below that is the VL53L0X laser rangefinder sensor. I mounted them in this manner so that I could obtain range data for whatever the Pixy camera was pointing at.

Showing the Pixy Camera with Laser Rangefinder


The Pixy camera was trained to detect blue objects. I chose blue because I had a number of children's wooden toy bricks painted blue, so I knew they would all be the same colour, shape and size, which simplified the programming requirements. It is always good to start as simply as possible and work towards realism. The existing programme was amended so that, while scanning the laser rangefinder through 180 degrees, it also checked the image from the Pixy camera and displayed the X coordinate of the first detected blue object. The X coordinate is in camera pixels and has not yet been related to the scan angle or the RoboBobLet chassis orientation. I made sure that there would only ever be a single blue object to detect, which avoided all the problems that could occur with multiple objects. Again, multiple blue objects are something that could be considered later. If no blue object is detected then an X coordinate value of 0 is output to identify this situation. The screen dump below shows a plot of laser rangefinder distance and blue object X coordinate, both plotted against the scan angle.



The top blue plot is the laser rangefinder data and shows the typical data obtained from scanning a single object. There are some anomalous single-reading ranges showing as downward spikes, much as seen previously. I did not make any attempt to filter these out as time had run out. Below, in orange, is the plot of the X coordinate value, which effectively appears as a ramp. This is because, as the camera scans, it detects the blue object at some point, so the X coordinate rises above zero; the edge of the blue object then progresses in a mostly linear fashion across the camera frame until the blue object disappears out of view on the other side of the image. The blue object was placed at a distance of approximately 300 mm directly in front of the RoboBobLet chassis, and this is shown correctly in the data. The spike in the camera X coordinate is probably due to lighting effects, or perhaps the blue simply was not detected at that specific moment. Getting the Pixy camera to provide stable results was a challenge, as it seems to take at least four consecutive sets of readings in quick succession before the image data stabilises.


Finally, below is a video showing RoboBobLet using the Pixy camera to avoid a blue object. I did not use the laser rangefinder data, as I just didn't have enough time left to write the programme. The video shows that RoboBobLet detects the blue object and then turns away from it. When the object is out of view, the mobile robot continues to move forwards. Unfortunately RoboBobLet suffered an injury a few days ago when the table he was on collapsed and he was crushed by a larger robot. Regretfully this damaged two of the continuous rotation servo motors, so they are now erratic in operation. This means that RoboBobLet can randomly change direction at any time if the damaged motors stop operating correctly. If you listen carefully you can hear the gears crunching when RoboBobLet is moving forwards. I will need to get some more continuous rotation servo motors to fix this problem.




Well, that's the end of my contribution to the Robots with Wheels challenge. I have managed to achieve almost everything I set out to achieve, apart from the 9-degrees-of-freedom sensor (although I did integrate that into a different robot; see the NotRobobBobLet post) and some 3D printed body shell parts (although I did 3D print the camera front piece, so that almost counts). There is still a great deal of work left to do with RoboBobLet, but this is mainly in the software, making the best use of all the data potentially available from the sensors in order to obtain useful responses in real-world environments. Still, there is always next week, and the week after, then the next week .....


Thanks to Element 14 for providing the opportunity to publish what I have done. Maybe I'll do another design challenge sometime, but not straight away!