Again from Japan come new advances in robotics, all in an effort to fuel the new robot uprising – I mean – research. This tiny robot isn't groundbreaking, but it could open up new avenues for using less processing power to walk, run, and somersault. While other bipedal robots balance and control their movement with computationally intensive Zero Moment Point (ZMP)-based methods, this robot uses high-speed cameras.


The robot was created at the University of Tokyo's Ishikawa Watanabe Laboratory by the same researchers behind the rock-paper-scissors robot, Janken. They use a sensor fusion/vision feedback method they call ACHIRES - Actively Coordinated High-speed Image-processing Running Experiment System.

Shot of this little bipedal robot running. (via the University of Tokyo Ishikawa Watanabe Laboratory)


While ACHIRES isn't the catchiest name, it could dramatically decrease the cost and processing power needed to keep bipedal robots from toppling over, which would be hilarious to watch but not so great for robotics entrepreneurs. The robot demonstrating the concept has a leg length of only 5.5 inches and can run for only about 10 seconds at a time. The tiny runner is also supposed to be able to somersault, but since this feature isn't demonstrated in the video, perhaps it is less reliable at the moment. Its top speed is a mere 2.6 mph, but I suppose that's impressive if your legs are only 5.5 inches long.


ACHIRES uses high-speed cameras to capture the robot's posture. That footage is then used to adjust the robot's movement in real time, keeping it balanced and controlling its gait. In the event of a somersault, the vision system can also predict how the robot should land and prepare it to touch down gracefully on its two feet. It is a nifty and simple idea that could catch on; however, there are some gaping holes in this technology. For one thing, if the entire robot needs to be filmed at all times, then the robot is confined to one particular area. In this case, the robot can only run in place. Still, the potential savings in on-robot complexity could lead companies to start refining this technology for adoption in the future.
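To get a feel for the camera-in-the-loop idea, here is a minimal toy sketch, not the lab's actual ACHIRES code. It assumes a drastically simplified one-dimensional model: an external "camera" observes only the robot's lean angle, and a PD controller (with made-up gains) commands a corrective torque every frame, the way ACHIRES offloads sensing from the robot to outside cameras.

```python
# Toy illustration of external-camera balance feedback.
# All dynamics, gains, and function names here are invented for the sketch.

def camera_measurement(true_angle):
    """Stand-in for high-speed image processing: return the observed lean angle (rad)."""
    return true_angle  # a real system would extract pose from each camera frame

def run_balance_loop(angle=0.3, velocity=0.0, dt=0.001, steps=2000,
                     kp=40.0, kd=8.0):
    """Drive the torso lean toward upright using per-frame camera feedback."""
    for _ in range(steps):
        observed = camera_measurement(angle)        # "film" the robot
        torque = -kp * observed - kd * velocity     # PD correction from vision
        accel = 9.81 * angle + torque               # crude inverted-pendulum dynamics
        velocity += accel * dt                      # semi-implicit Euler step
        angle += velocity * dt
    return angle

print(abs(run_balance_loop()) < 0.01)  # lean angle settles near upright
```

The point of the sketch is the division of labor: everything the controller knows comes from the camera measurement, so the robot itself needs no onboard balance sensing, which is exactly the trade-off discussed above.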




See more news at: