Wallet sensor in its "final form" (via Hackaday)


Caleb Craft was searching the web when he noticed references to a small group of people who can pickpocket just about anybody. Being the clever hacker he is, he decided to integrate an alarm into a wallet that would sound off if someone ever tried to swipe it. Solutions to this problem already exist, but as Caleb points out, most of the available wallet alarms will not go off until light hits the device. If a pickpocket swipes your wallet and does not open it right away, the alarm is useless. This is where Caleb used his ingenuity to create a better solution.


After finding a small magnetic window alarm, Caleb quickly went to work on how to incorporate it into a wallet. The original alarm consists of a buzzer and a magnetic reed switch; the reed switch opens or closes in response to an applied magnetic field. Caleb decided that all he needed to change was the thickness of the design. Leaving the casing around the buzzer, which acts as an amplifier for it, he de-soldered the reed switch and moved it to the side of the unit. This slimmed the design down just enough to make it unnoticeable while the wallet sits in your back pocket. The switch was placed toward the spine of the wallet, where it is easy for the user to reach. His hack uses hot glue to secure the alarm to the wallet and a small magnet that gets secured in the back pocket.
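The alarm's behavior boils down to a tiny state machine: the magnet fixed in the pocket holds the reed switch in its resting state, and once the wallet leaves the field the buzzer sounds. A minimal sketch of that logic (illustrative only, not Caleb's actual circuit; the latching behavior is an assumption based on the alarm sounding continuously once removed):

```python
class WalletAlarm:
    """Hedged sketch of the wallet alarm's logic: the pocket magnet
    holds the reed switch closed, and losing the field latches the
    buzzer on."""

    def __init__(self):
        self.buzzing = False

    def update(self, magnet_present: bool) -> bool:
        # Once the wallet leaves the magnetic field, keep the buzzer
        # latched on even if a magnet later comes back into range.
        if not magnet_present:
            self.buzzing = True
        return self.buzzing
```

In the wallet this is exactly why a pickpocket can't silence it by holding the wallet still: the trigger is losing the field, not opening the wallet.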


Caleb's design worked as planned and fit comfortably in his pocket without any unusual bulkiness. When the wallet is removed, the alarm goes off continuously and would probably scare any thief into dropping the alarming wallet. Caleb states that the only improvement he would suggest is a stronger magnet, so the wallet would not need to be aligned perfectly in the pocket. If any thief decides to swipe Caleb's wallet, they can be sure they are in for an unusual surprise.




See more news at:


We have seen the Microsoft Kinect used for many things besides gaming. However, the latest use of the Kinect, which comes from a Japanese clothing store, United Arrows, may make for one of the creepiest advertising schemes yet. Using the Kinect along with a specialized motor system and 16 strings, the store's mannequin mimics potential customers' movements from outside the store. The idea is to get people looking at the store's products, which the mannequin wears.


The Marionettebot, as it is called, is capable of a wide variety of movements. For example, the mannequin can jump, walk, and move its hands and feet. Located in the heart of Tokyo, the display has not failed to draw the attention of people strolling by. Many stop, dance, wave, and try all sorts of different movements to see how well the mannequin follows along.
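The string control can be pictured as a simple geometric mapping: each winch pays out enough string to span the distance from its anchor in the ceiling rig down to the Kinect-tracked joint it is attached to. A hedged sketch of that mapping (the anchor layout and units here are invented for illustration; the real rig's kinematics are surely more involved):

```python
import math

def winch_targets(anchors, joints):
    """Length each string must be paid out to: the straight-line
    distance from its ceiling anchor (x, y, z) to the attachment
    point on the mannequin, both in meters."""
    return [math.dist(a, j) for a, j in zip(anchors, joints)]
```

With 16 anchor/joint pairs, one call per Kinect frame would yield the 16 motor targets that make the mannequin shadow the person outside.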


Japan has a unique culture; what seems unusual to the rest of the world may be an everyday affair there. Not too long ago, researchers from Keio University created plants that interact with their environment. The plants were fitted with sensors and programmed to display emotions: sensing movements made by humans, the plants expressed their “emotions” through strings attached to their branches and servos. It seems the Japanese have a knack for making things move. I wouldn't be surprised to see this tech used in future marionette stage shows. With Kinect 2.0 on the horizon, and the Leap Motion controller, even finger tracking is an option - opening up even more control in puppetry.



See more news at:


Atlas, a Boston Dynamics creation (via DARPA)


As robotic technology advances, more and more civil emergency agencies, first responders and military units are incorporating robots into their respective arsenals. Doctors will soon be able to use remote imaging robots in situations where they cannot get to the patient. Firefighters will soon be able to quickly image the inside of a structure and find those trapped in burning buildings using a robotic ROV, and factories will be able to use robots that can help in industrial or hazardous situations when the need arises. Of course, when we come to military applications, the immediate stigma of ‘Terminator-style’ robots bent only on destruction arises. Nothing could be further from the truth. Sure, there are drones and remotely operated robots equipped with weapons platforms, but these allow soldiers to stay out of harm’s way when the situation warrants. The vast majority of military contractors are actually looking to develop robotic platforms whose primary function is to save lives. For the better part of 20 years or more, one of these agencies, DARPA, has been developing a variety of robots capable of providing multiple life-saving services on the battlefield, and it has made significant advances over those years.


The defense agency recently unveiled its Atlas humanoid robot at Boston Dynamics headquarters in Massachusetts. The robot is the result of DARPA’s Virtual Robotics Challenge/DARPA Robotics Challenge (VRC/DRC), designed to develop advanced robots to assist humans through natural and man-made disasters. Like all robots, the frame is basically only a shell that houses the advanced algorithmic brain controlling the hulking mass. That ‘brain’ was written and developed by several teams through the VRC, while teams at Boston Dynamics were responsible for the construction of the Atlas. Besides Boston Dynamics, several other agencies and institutes were involved in the DRC, including NASA, Carnegie Mellon University, SCHAFT Inc. and Virginia Tech - all had a hand in developing the robot in one form or another. The Atlas itself is an agile anthropomorphic robot outfitted with an onboard computer, 28 hydraulic actuators for movement (arms, legs, torso and head), a hydraulic pump complete with cooling system, and two fully functional hands built by Sandia National Labs and iRobot. The robot’s head is packed with stereo sensors for accurate data acquisition and LIDAR (which gauges distance by analyzing reflected laser light). Atlas is powered by an onboard battery system to move the massive 330 lb, 6-foot-2-inch body, which is controlled by a remote human operator. When it comes to movement, the Atlas is no slouch: it is capable of climbing over obstacles using its hands and feet, dynamic walking and even calisthenics! The teams responsible for the robot’s construction and development have until December of this year (2013) to make any refinements before they head off to compete in the DARPA Robotics Challenge Trials at the Homestead Miami Speedway in Florida.
Atlas is set to go up against NASA’s Valkyrie and RoboSimian robots along with several other entries from various institutions, including one from the University of California (Santa Barbara). The challenge will determine a winner capable of working in real-world scenarios involving simulated disasters.


Atlas is giving Boston Dynamics' other project, PETMAN, a run for its money.




See more news at:


Tired of cruising from place to place on your typical 4-wheeled transportation machine? Have you already seen all that the sky has to offer high above on your private jet? Or maybe your private yacht isn’t quite meeting your adventurous expectations anymore. Don’t know what else to spend loads of your money on? Well then I believe Spymaster’s new underwater mode of transportation is for you.


At the recent Harrods Technology 2.0 showcase, Spymaster made a dazzling announcement of a brand-new private submarine, now available for order, that is sure to spark the interest of wealthy adventurers around the world. Don’t be fooled by the model image of Spymaster’s Orcasub: it has already been announced that the underwater vehicle will be available at an entry price of $2 million USD.


The full-scale, two-seater Orcasub is 22 feet long, weighs 4 tons, and is capable of dropping to a depth of 2,000 feet below sea level. Its controls are much like those of a plane: two floor pedals and a joystick let drivers control thrust, lift and drag, and even perform flight-like maneuvers such as banks and curves. The Orcasub also provides 80 hours of life support and 60,000 lumens of LED headlight power to illuminate the dark, deep-sea waters - all powered via an on-board battery.


And, of course, if you have some “extra” money lying around, Spymaster will also be making a $9.32-million version that can dive to 6,000 feet below sea level. That may not be deep enough to pay the Titanic a visit, but it sure shows the depth one is willing to explore into one's wallet to experience what the Orcasub has to offer. With every purchase, Spymaster will cover five days’ worth of Orcasub training to help buyers acclimate themselves to their new toy.


Well, it seems we can build just about anything nowadays given the right amount of money. It might be time to start pushing those resources toward even more life-enhancing technologies, as the Orcasub shows us just a smidge of what we are capable of. For now, we know what’s going to be on the holiday wish-list of many rich folks worldwide.



See more news at:


Volvo’s new Pedestrian and Cyclist Detection System monitors the distance between pedestrians and vehicles while also tracking their movement path. (via Volvo)


Motor vehicle accidents make up a large share of the preventable injuries and fatalities that occur in heavily populated urban environments around the world. With recent advances in remote sensing technologies, the ability to prevent such incidents is at an all-time high. For example, Volvo, which first introduced a pedestrian detection system in its 2010 lineup of vehicles, has just announced an upgrade that includes cyclist detection with full auto-brake.


The 2013 Geneva Motor Show attracts a huge audience of car enthusiasts to witness the unveiling of vehicles equipped with state-of-the-art technologies. Volvo’s new accident prevention system, aside from other enhanced bells and whistles, will be part of a new vehicle lineup focused on quality and attention to detail to meet customer needs.


The upgraded pedestrian and cyclist detection system uses two forms of in-vehicle environmental awareness: a radar unit integrated into the vehicle’s grille scans the area ahead, while a camera mounted by the rear-view mirror identifies the types of objects in front of the car. The two technologies work in synergy with a central control module that directs the vehicle to take immediate preventive measures if need be.


By continuously monitoring the traffic situation around the vehicle, the severity of a pedestrian or cyclist injury can be reduced - or the collision avoided altogether - via a full auto-brake system response. For example, if a cyclist riding ahead were to quickly swerve in front of the Volvo, the advanced sensor system would immediately detect the distance between the car and the cyclist, track the cyclist's moving path, and apply a full-brake command. The system also applies to vehicles driving ahead in the same lane.
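The trigger decision essentially compares the measured gap against the car's stopping distance. A back-of-the-envelope sketch of that comparison (the deceleration and system-latency figures are assumptions for illustration, not Volvo's calibration):

```python
def should_auto_brake(gap_m: float, speed_mps: float,
                      decel_mps2: float = 8.0, latency_s: float = 0.2) -> bool:
    """Fire the full-brake command when the tracked pedestrian or
    cyclist sits inside the distance needed to stop: the distance
    covered during system latency plus the braking run v^2 / (2a)."""
    stopping_distance = speed_mps * latency_s + speed_mps ** 2 / (2 * decel_mps2)
    return gap_m <= stopping_distance
```

At 14 m/s (about 50 km/h) with these assumed numbers, the stopping distance works out to roughly 15 m, so a cyclist cutting in 10 m ahead would trigger braking while one 30 m ahead would not.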


Though likely not one of the flashiest in-vehicle enhancements found at the Geneva Motor Show, Volvo’s innovative new system is well constructed to help diminish very preventable on-road accidents worldwide. Volvo’s cyclist and pedestrian detection system will come standard in its V40, S60, V60, XC60, V70, XC70 and S80 models starting in mid-May of this year (2013).



See more news at:


Squito, toss this camera for eyes on a dangerous location (via Serveball)


Photographers know how difficult it can be to get those perfect shots - more so when trying to capture images during extreme conditions such as fires, foreign insurgent uprisings or ejecting out of a fighter plane traveling at over 400 knots (ok, maybe not so much that last one, but it could happen). Some photographers go to such great lengths to capture images that they’ve been known to construct intricate rail systems or house their expensive gear in protective casings, not only to capture those shots but to limit the damage their equipment incurs doing so. While some photographers and videographers have used some rather crazy ideas to garner a one-of-a-kind shot, few have thought to actually throw the camera in the air to grab an image or video clip. That used to be considered just plain insane - until now, as Serveball has developed a camera system that is actually designed to be thrown. Sure, it may not be the first (the military has small robotic camera systems that can get into some pretty tight spaces), but it’s loaded with features that would make any camera enthusiast take a second look.


Their camera, known as the Squito Throwable 360° Panoramic Camera, is outfitted with a series of three miniature cameras encircling the casing, allowing it to take panoramic images and video while ‘in flight’. The sphere is also equipped with embedded positioning and orientation sensors (IMUs) along with a micro-controller and image processor that let the camera take the images from each lens and stitch them together into a panoramic view. The onboard sensors also stabilize the images and video and maintain a positive lock on the subject while the camera is spiraling through the air. Both the images and video are sent wirelessly to the user’s mobile device or desktop for immediate viewing. The Squito, about the size of a tennis ball, is designed for use in recreational sports, aerial point-of-view shots and fly-through video applications (unfortunately, it can’t be used as a baseball). The company has another version of the throwable camera, known as the Darkball, for more extreme operations such as SAR, tactical military operations and first-responder applications. This model's camera system also includes near-infrared and thermal imaging that allow it to be used in total darkness or in a foggy or smoke-laden atmosphere, such as a burning building or a bog on a creepy cool autumn evening. The company states that both the Darkball and Squito can survive repeated high-g impacts without damaging the camera’s internals. There’s no word yet on when the cameras will be released or how much each will cost, but you can expect them to be available in the near future once their development phase is over.
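With three lenses spaced evenly around the ball, keeping a lock on the subject while the ball spins reduces, at its simplest, to picking whichever lens currently faces the subject from the IMU's yaw reading. A toy sketch of that lens selection (purely illustrative; the real stabilization also de-rotates and blends frames):

```python
def facing_lens(yaw_deg: float, subject_bearing_deg: float,
                n_lenses: int = 3) -> int:
    """Index of the lens whose optical axis points closest to the
    subject, given the ball's current yaw from its orientation
    sensors. Lenses are assumed evenly spaced (120 deg apart for 3)."""
    spacing = 360.0 / n_lenses
    relative = (subject_bearing_deg - yaw_deg) % 360.0
    return int(round(relative / spacing)) % n_lenses
```

Run per frame, this keeps the subject in whichever lens's field of view is best aligned, which is the first step before stitching and stabilization.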




See more news at:


"oh, small helicopter... why can't we fly away?" Micro-helicopter, Nano-Falcon, wows audiences at the Tokyo Toy Show (via CCP)


Everyone has an inner child no matter how old one becomes. Some things we grow up playing with as children are just too fun to ever really stop playing with. Other things we never had the chance to experience while growing up are readily available today thanks to technology - for instance, toy helicopters and drones. Just a few years ago, toy helicopters could cost upwards of hundreds of dollars. Only true hobbyists and enthusiasts were able to purchase or build them for recreational use. However, thanks to the steady advance of technology, the prices of these aircraft have dropped dramatically and their availability has increased substantially. Some now cost in the tens of dollars.


A new type of remote-controlled helicopter was revealed just recently at the Tokyo Toy Show. The toy show occurs annually, and everything from action figures to LEGO sets to video games is on display for the world to see. One thing that caught many people's eyes was a tiny toy helicopter. The helicopter, which the creators claim to be the smallest ever made, was created mostly using parts from old smartphones. The chopper measures in at 6.5 cm long, weighs only 11 grams, has a range of 15 feet and goes by the name Nano-Falcon.



Nano-Falcon, sure to be a stocking stuffer this holiday season. (via CCP)


The creators say it was made specifically to target Japan's adult population. Naoki Nakagawa, head of sales at maker CCP, said, “Japan's aging population made us think of developing a toy targeting adults. Ten or twenty years ago, helicopter toys could cost a lot of money. Those who couldn't afford it at the time can now make their childhood dream come true at a reasonable price.” The Nano-Falcon is selling for 4,700 yen in Japan, equivalent to about $47.00 in the U.S.


Another drone, a step up from the Nano-Falcon, is Parrot's AR.Drone. The French company has built its drones for serious augmented-reality fun. The quadcopters are designed to be controlled through mobile devices such as tablets and smartphones. The first drone was revealed at CES in 2010. In addition, the apps that support the drones include different operating modes: users can joy-ride in AR.FreeFlight, race other drones in AR.Race, or interact with other drone owners in combat simulations.



The AR.Drone 2.0 "black box" is capable of recording two hours of HD video footage; the drone itself is controlled through smartphones or tablets. (via AR Drone)


The drones themselves run off an on-board Linux computer, which communicates with the user's device over Wi-Fi. A 15-Watt brushless motor keeps the drone airborne while an ultrasonic altimeter provides vertical stability. The whole system is powered by an 11.1-Volt lithium-polymer battery, which can handle up to 12 minutes of flight while also powering the dual cameras the drones feature. Each drone carries a vertical camera and a front-facing camera. The front camera is a VGA sensor with a 93-degree lens, while the vertical camera features a 64-degree lens and can record video at up to 60 frames per second.
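Vertical stability from an ultrasonic altimeter typically comes down to a feedback loop that nudges motor thrust toward a target height. A hedged proportional-control sketch (the gain and hover point are invented for illustration; Parrot's actual controller is certainly more sophisticated):

```python
def altitude_hold_thrust(target_m: float, measured_m: float,
                         hover: float = 0.5, kp: float = 0.3) -> float:
    """Normalized thrust command in [0, 1]: an assumed hover thrust
    plus a proportional correction on the altimeter error."""
    thrust = hover + kp * (target_m - measured_m)
    # Clamp to the physically meaningful range.
    return max(0.0, min(1.0, thrust))
```

Called every time the ultrasonic sensor reports a new height, this kind of loop is what keeps a quadcopter hovering level without constant stick input.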


Although the drones made their first appearance at CES in 2010, they did not stop there. Each year since, Parrot has been back at the annual event featuring not new drones, but hardware and software upgrades for its customers. Each small upgrade enhances the user's ability to control the drone and makes for a better product overall. For example, the camera quality was increased, and the on-board sensors were either upgraded or given optimized software. The latest upgrade is what is called the “Black Box.” It is essentially a flight recorder featuring 4 GB of storage dedicated to tracking the drone, and it also adds desktop support: users can set a series of pre-planned GPS waypoints and the drone will navigate the path. Along with a 50 percent increase in battery life, these toys aren't just for kids anymore.


The drones are opening up all types of new ideas and ways for people to see the world. The AR.Drone has been used in multiple research experiments and even has its own open-source application programming interface for game development. In general, drones are becoming more commonplace with the public. Many university students are now starting societies for aerospace and aeronautic studies, some of them branches of IEEE student societies. During the Occupy Wall Street protest in 2011, a modified AR.Drone was used by Tim Pool to provide live video of police activity. Not only was this a demonstration of technology's progression, but also of how accessible and readily available drones are becoming. Drone technology is only in its infancy; we can expect to see much more of it in the years to come.



See more news at:

Cabe Atwell

Kinect Virtual Smash

Posted by Cabe Atwell Jul 3, 2013


(Left) A face getting squished by Keio University's rolling pin. (Right) It does not roll like a regular kitchen tool, but houses a series of rollers and sensors to track what the user wants to do. (via Keio University)


Have you ever felt like smashing someone’s face? Here is yet another application for Microsoft’s Kinect camera that may virtually satisfy that kind of daydream.


A team of researchers led by Yasuaki Kakehi from Keio University used the shape and image information collected by the Kinect to create a virtual image that can be felt, squished and distorted with the help of a specialized mechanical roller.


The setup looks like this: a Kinect camera is positioned above the object to be “rolled,” then the image appears on a horizontal screen upon which the specialized robotic roller virtually deforms and squishes it.


The Kinect can detect the convex or concave shape of an object, which is then converted into the appropriate roller resistance to provide the user haptic feedback. To create this virtual touch sensation, five sets of rollers are housed inside what looks like an old-fashioned wooden rolling pin.


As the apparatus moves over the image on the flat screen, cranks move each set of rollers down to touch the screen at precise locations and pressures to match the image and shape. The friction of the rollers on the screen creates the illusion of material resistance.
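The crank commands can be thought of as a direct map from the virtual surface height under each roller set to how far (and thus how firmly) that set presses against the screen. A hedged sketch of that mapping (the travel range and linear scaling are invented; the real system maps Kinect depth data to calibrated pressures):

```python
def crank_depths(height_profile, max_height: float,
                 travel_mm: float = 10.0):
    """Crank travel for each of the five roller sets: taller virtual
    material under a roller means a deeper, firmer press, producing
    more friction and thus more felt resistance."""
    return [travel_mm * h / max_height for h in height_profile]
```

As the user "flattens" the virtual dough, the height profile shrinks, the presses get lighter, and the felt resistance drops - matching the real-dough behavior described below.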


The virtual haptic feeling of resistance changes as the image is distorted, just as a piece of dough provides less resistance as it flattens out. The image itself also becomes stretched out, and after the user is done modifying the image, it can be printed out as a souvenir.


This system could also be used for deforming stationary or non-deformable bodies while replacing them with symbols or characters. Games that make use of the haptic feedback could also be developed. Nevertheless, I can think of a couple stress relieving applications myself.


Via Diginfo:



See more news at:


Technology is changing and shaping our culture in so many ways that it should not come as a surprise to see robots playing music. Music is a large part of all cultures; it has been with us throughout history, from the good times to the bad, whether people are gathering to celebrate or marching off to war. And in today's younger society, techno and electronic music is becoming ever more popular. So, to add a little more technology to that mix, a couple of scientists from Tokyo have built robots to play their music for them.



The scientists, Yoichiro Kawaguchi and Naofumi Yonetsuka, are from the University of Tokyo and designed and created the robots themselves. The band consists of three robot members: Ashura, the drummer, who according to the scientists is equivalent to four people playing drums at the same time; Cosmo, the keyboard player; and Mach, the guitarist. The band is called Z-Machines, and it recently played its first show at a club in Tokyo to an audience of about 100 people. The guitarist's design gives it the equivalent of 78 fingers playing with 12 picks; if that wasn't enough, the drummer wields 21 sticks and the keyboard player shoots lasers from his eyes.


Z-Machines is not the first robot band, either. In America, Compressorhead was introduced to the world last year. The heavy-metal robots also form a three-member band, but rather than playing electronic dance music, they stick to metal and rock such as Pantera and AC/DC. It took the robots some time to come together, though. In 2007, Stickboy, the drummer, was built with four arms, two legs and a full metal head. Two years later, in 2009, Fingers, the guitar player, joined, featuring 78 fingers. Finally, in 2012, Bones, the bassist, was created; he is known as the youngest player in the band. They also keep up a pretty amusing website. A video of Compressorhead is below:



Robots are amazing creations, even outperforming our own musical skill. It seems that we may be able to create them to do anything we can think of. Nevertheless, their performance and “talent” is limited to what we make it. Robots lack one important thing when it comes to music: creativity. Only humans can create what we refer to as enjoyable music. Rhythm and beats are things we know when we hear them, and only we can visualize and create them. That is, until the singularity comes.



See more news at:


(Left) Controller parts (Right) Lap pad (via Caleb Kraft & Hackaday)


Gaming, computer access and use of digital devices can help any person learn and have fun. The ways in which they are used are not always as simple as we take them to be. Disabilities have a wide range of ways they can manifest which could make the use of digital systems painful, awkward, inefficient and even impossible.


Caleb Kraft does not call himself a hacker. He says his skills involve spreading information and connecting people. However, when he learned about the highly inflated costs attached to simple computer switches intended to make computers more accessible for people with disabilities, he decided to put his hacker hat on and create something to help a boy named Thomas, who has muscular dystrophy. While Thomas’s condition is not very rare, the way the disorder affects each person can be very different, so one controller is not necessarily suited for all. Recognizing that making digital controllers for people with disabilities can boil down to a per-case basis, Kraft created an open-source project called The Controller Project to bring people who want to help together with people who need it.



The controller setup used by Thomas to play Minecraft (via Caleb Kraft)


Caleb’s controller for Thomas was specifically designed to help him play Minecraft, his favorite computer game. At the moment, Thomas can still use a regular video game controller, but his abilities will diminish in time. In anticipation of his condition worsening, Thomas can take apart all three components of his custom controller and Velcro them to a lap pad in whatever arrangement is comfortable for him.


These three components are tied together by an Arduino-compatible Teensy board. Kraft created a D-pad, a hacked PSP joystick, and a button pad that Thomas can use to play Minecraft. The components simply emulate keyboard keys - A, S, D and W for directions; E, Q and the space bar for accessing game inventories and functions - as well as mouse clicks and movements, which means no software installation is necessary.
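The heart of that approach is just a translation table: each physical input maps to a stock Minecraft keyboard or mouse event, which is exactly why the host PC needs no drivers. A sketch of such a table (the button names here are hypothetical labels, not Kraft's):

```python
# Hypothetical mapping from physical inputs to emulated HID events.
BUTTON_MAP = {
    "dpad_up": "w", "dpad_left": "a", "dpad_down": "s", "dpad_right": "d",
    "inventory": "e", "drop": "q", "jump": "space",
    "pad_mine": "left_click", "pad_place": "right_click",
}

def events_for(pressed):
    """Keyboard/mouse events to emit for the set of held buttons;
    unknown inputs are ignored."""
    return sorted(BUTTON_MAP[b] for b in pressed if b in BUTTON_MAP)
```

On the Teensy itself, the equivalent firmware would emit these as real USB HID keystrokes each loop iteration, so the PC sees an ordinary keyboard and mouse.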


Kraft created the plastic housings for the controllers using a 3D printer donated by Lulzbot for prototyping the project. He used super-cheap 6 mm momentary pressure switches, some of which he modified to be pressed with levers, lowering the force necessary to depress them from 60 grams to 15 grams.


With a strong desire to provide these services to people with special needs, The Controller Project is collecting very interesting tools and making them part of the public domain. A project called BlinkTalk allows users to string words together using nothing but the blinking of their eyes. Another accessibility tool uses a TV IR remote in place of a mouse for controlling the on-screen cursor. The low-pressure switches Kraft created are also sold on the site for just $20. Of course, the parts lists and source code for Thomas’s controller are there too, and the community is encouraged to improve upon them.


People interested in helping can find links to other organizations where they can be put in contact with people looking for custom controllers. People can share custom requests, and controllers can be donated, but Kraft points out that charging for labor is fair and often results in a product that is still much cheaper than commercially available units.


The basic Arduino code for setting up the keyboard emulator can be found here. When the free market can’t help, hackers and the community will do the job just fine. See also the AbleGamers Foundation for more info on how to help people with disabilities enjoy some gaming...



See more news at:


Leap Motion controller (via Leap Motion)


The Leap Motion interface is just a couple of weeks away from sitting on your desk. Fortunately, unlike other secretive Natural User Interface (NUI) manufacturers, the makers of the Leap released all necessary information to developers months ago and have distributed around 12,000 units to them. This way, come launch day, there will already be tons of added functions and applications to use with the Leap.


Developer groups have hacked the Leap to use it as a gesture interface to control quadcopters and to create augmented-reality experiences by connecting it to smartphones. Google has also jumped into the Leap hacking game: in celebration of Earth Day, Google Earth announced it now supports the Leap so users can explore the world with natural hand gestures.


Stanford developer, electrical engineer and leader of the Human-Machine Technologies Organization, Bryan Brown, is taking Leap hacks one step further by developing a “middleware” application that allows all sorts of NUI tech - the Kinect, the Creative Gesture Camera, the Leap Motion and others - to interface with off-the-shelf control hardware. Brown believes that since most of our natural communication is done without physical touch, voice and gesture interfaces will find their natural place in electronics.


The middleware app is called NUILogix, and it does exactly what the name says: provide the logic necessary to interface NUI tech to machine control hardware using the Modbus/TCP and OPC-UA protocols. The control hardware, in turn, drives the machinery.


A few applications of NUILogix have already been demonstrated. Brown used the middleware and a Leap to control an OWI 535 robotic arm via the app’s on-screen controls. Using simple gestures, he could move the arm up and down and side to side, and even pick up balls and put them in cups.


The gestures used to control the arm were not entirely intuitive because they were used to adjust on-screen controls. However, another of Brown’s hacks does make use of intuitive gestures to control a toy RC boat. Holding your hand above the Leap pad and tilting it down makes the boat speed up, as if you were pressing an invisible gas pedal. Tilting the hand up causes the boat to decelerate, and tilting to the right or left turns the boat in that direction.
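That gas-pedal feel can be captured with a simple mapping from hand tilt to throttle and steering commands. A hedged sketch (the 45° full-scale tilt is an assumption for illustration, not Brown's calibration):

```python
def boat_throttle(pitch_deg: float, full_tilt_deg: float = 45.0) -> float:
    """Tilting the hand down (negative pitch) opens the throttle;
    tilting up closes it. Result clamped to [0, 1]."""
    return max(0.0, min(1.0, -pitch_deg / full_tilt_deg))

def boat_steering(roll_deg: float, full_tilt_deg: float = 45.0) -> float:
    """Rolling the hand right steers right (+), left steers left (-),
    clamped to [-1, 1]."""
    return max(-1.0, min(1.0, roll_deg / full_tilt_deg))
```

Feeding the Leap's per-frame hand pitch and roll through two functions like these is enough to make the boat respond the way an invisible pedal and wheel would.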


Gesture and touchless interfaces could have many uses, like keeping surgeons or nurses from having to touch certain tools. People with disabilities could use them to control machines and hardware without having to make intricate, forceful or meticulous finger movements. The Leap has raised so much excitement that HP has announced it will integrate it into certain devices in the future.


NUILogix is being developed by Brown and others at the Human-Machine Technologies Organization as part of the Intel Perceptual Computing Challenge. The Leap Motion is currently taking pre-orders for its July 22nd release, and it will also be a Best Buy exclusive starting July 28th (2013).




See more news at:


QUADEYE military IR headset (via Elbit Systems)


Infrared optical devices can be extremely expensive, depending on the application. This is especially true for science and military applications such as counter-terrorism, where SOF units require the ultimate in IR technology, such as Elbit Systems’ QUADEYE. These night-vision four-tube goggles provide a wider-angle view than conventional NVGs but cost a whopping $65,000+ per pair. The same holds true for IR camera systems (such as FLIR), which can cost in excess of $150,000 when used on platforms such as naval ships or reconnaissance aircraft. Handheld IR cameras, on the other hand, have been steadily declining in cost, and a quality imager can be found for a few hundred dollars (depending on the model).



Infragram camera prototype concept and a photographic example (via PublicLab)


However, what if you are an agricultural spy on a strict budget and haven’t the skills to make your own? If that’s the case, then perhaps Public Lab has the answer with its Infragram near-infrared camera. Public Lab is a group of individuals who develop open-source tools (hardware and software) for grassroots environmental exploration, and the camera was originally developed to monitor the damage done to the surrounding wetlands by the BP oil spill in the Gulf of Mexico (back in April 2010).


Its members designed the Infragram camera to diagnose plant health using the IR spectrum. Plants absorb most of the visible light we humans can see for photosynthesis (growing) and reflect light in the near-IR spectrum. That IR reflection can be viewed as a way to monitor plant health: the more reflection, the healthier the plant. The camera works by taking two separate images, one a normal print and the other near-IR. Both images are then combined into a false-color composite, which shows the differences in IR reflection and therefore the overall health of the plant. The Infragram system makes use of an infrablue filter that blocks a regular digital camera’s red channel (think RGB, like any monitor or TV), so that channel records near-IR instead. The image is saved to the camera’s on-board storage, then uploaded and processed online, where the blue and infrared channels are combined to produce the IR image.


Public Lab looked to Kickstarter to fund the development of the Infragram system for cheap cameras and webcams, and it has surpassed its goal of $30,000 US with over $52,000 pledged in just one week. Infragram is set to come in three flavors, the first being a simple DIY filter pack: essentially a single sheet of infrablue filter that you install yourself (with instructions, of course).
The second option is an Infragram webcam circuit board that you can combine with a Raspberry Pi or Arduino, depending on your project. The third option is a basic point-and-shoot camera (bought in bulk) outfitted with the infrablue filter. The Lab hopes to retail the Infragram camera at a price point of about $35 or less, but those looking to get their hands on one now can pledge $95 or more for an experimental unit, while a pledge of $10 gets you just the filter and $35 nets you the webcam circuit board.
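The per-pixel math behind that false-color comparison is essentially a normalized difference vegetation index (NDVI): with an infrablue filter, the red channel records near-IR while the blue channel records visible light, and the index is (NIR − VIS)/(NIR + VIS). The sketch below is a minimal plain-Python illustration of that formula; the channel assignments are as Public Lab describes, but the function itself is not Public Lab's processing code.

```python
def ndvi(nir, vis):
    """Normalized difference vegetation index for one pixel.

    nir: near-IR intensity (the red channel of an infrablue image)
    vis: visible-light intensity (the blue channel)
    Returns a value in [-1, 1]; healthy vegetation reflects strongly
    in near-IR, so higher values suggest a healthier plant.
    """
    if nir + vis == 0:
        return 0.0  # avoid dividing by zero on fully dark pixels
    return (nir - vis) / (nir + vis)

# A leafy pixel: strong near-IR reflection, little visible reflection
print(ndvi(200, 50))  # -> 0.6
```

Running this over every pixel and coloring by the result yields the kind of false-color plant-health map shown in Public Lab's example photographs.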





(Left) Cheetah-bot on its own showing a cute, familiar, cat-like stance. (Right) Shown for size comparison (via EPFL)


EPFL, the Swiss Federal Institute of Technology in Lausanne, has been busy of late designing several impressive biomechanical robots that intend to increase human understanding of the natural world. A previous post covered a salamander robot that mimicked the locomotive control system of a salamander using electronic controls to replace the signals normally sent up and down the animal’s spine. This time ‘round, the school’s BioRobotics laboratory enlists the help of a feline in its latest biomechanics work.


The International Journal of Robotics Research recently published the lab’s study of a “cheetah-cub robot.” This fast and stable mechanical creature brilliantly mimics the natural movements of a housecat, with particular attention paid to its leg design.


The reason the lab chose a cat for their design was simple - they didn’t have enough lab space to build a whole cheetah. To keep the research focused on the biomechanics of the undisturbed natural world, the team eliminated dogs from the equation due to the heavy influence of human breeding on their evolution.


EPFL’s CatBot is approximately 8 inches long and 6 inches tall and is made of readily available, off-the-shelf components. At this stage of the research, operation is still performed via wire - though in the future, remote operation will allow a wide range of outdoor applications. Its legs in particular mimic the movements and morphology of a housecat quite well: tendons are accounted for by a spring-loaded pantograph leg design, and actuators on each leg are used in place of muscles.


As the fastest small quadruped robot of its kind, the cheetah-cub bot scurries along at 1.4 m/s, equivalent to about seven body lengths per second. Even at its higher speeds, the leg design’s auto-stabilization features allow the robot to traverse obstacles up to 20% of its leg length.


Future applications will include rough terrain search and rescue operations that are generally troublesome for track and/or wheel driven robots. Studies have already shown the feline bot’s ability to manage tough terrains, and its spring loaded leg design hopes to one day enable quick and agile exploratory missions in unfriendly environments including natural disaster sites.


All in all, EPFL’s recent work on the cheetah-cub bot continues the trend of biomimicry and biomechanically inspired technology. As Auke Ijspeert, director of the institute’s BioRob Lab, explains, the long-term goal is in “studying and using the principles of the animal kingdom to develop new solutions for use in robots...” In other words - when in doubt, look to nature for the answer.


Wrap this bot in fake fur and everyone will love it.






(via eyeSight)


It is surprising that, with all the devices and electronics that have built-in camera systems, nobody thought sooner to develop a way to incorporate some kind of gestural control to go along with them. That was the case until now, as eyeSight Technologies has recently developed specialized software that allows users to interact with apps using only a device’s included camera. Essentially, the software turns the camera into a full 3D-tracking sensor that lets users navigate and interact with whatever the software is installed on, including TVs, PCs and mobile devices.


Details as to how the software was developed are sketchy at best; however, it appears that eyeSight designed the new software around its ‘finger-tracking’ solution, which allows users to navigate devices using only a fingertip at an astounding range of 2 to 16 feet from the camera. Unlike the finger-tracking technology, the updated software does not require manufacturers to embed eyeSight technology into their hardware in order to work.


In a video released on YouTube, a user navigates Google Maps on a camera-equipped laptop, using swiping hand gestures to turn the globe in any direction. Once the user’s selected location was found, she simply made a fist and moved it closer to the device’s camera to zoom in (pulling the fist away zooms the camera out). Additionally, clicking on objects such as files and folders can be done just by touching two fingers together, allowing for a more natural approach to navigation and negating the need for peripherals such as a mouse. eyeSight’s gestural technology has become popular enough for companies such as AMD and Lenovo to incorporate it into current-generation products such as AMD’s APU series of processors and Lenovo’s line of Ultrabooks.
The company states that its eyeSight software can also be used with stereoscopic camera systems as well as IR illuminators. Does that mean it could be combined with Microsoft’s first-generation Kinect for increased functionality in gaming or other projects? eyeSight’s hand/finger tracking technology is available now in SDK form for developers, with no word yet on a full product release.
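The interaction model eyeSight demonstrates - distinct gestures each mapped to a navigation action - amounts to a dispatch table from recognized gestures to handlers. The sketch below is hypothetical: the gesture names and the dispatcher structure are assumptions for illustration, since the real SDK's event names are not described in the article.

```python
# Hypothetical gesture dispatcher -- the actual eyeSight SDK event names
# and callback interface are not documented in the article.
def make_dispatcher():
    actions = []  # record of actions triggered, for demonstration

    handlers = {
        "swipe_left":  lambda: actions.append("rotate map left"),
        "swipe_right": lambda: actions.append("rotate map right"),
        "fist_toward": lambda: actions.append("zoom in"),
        "fist_away":   lambda: actions.append("zoom out"),
        "pinch":       lambda: actions.append("click"),   # two fingers touch
    }

    def dispatch(gesture):
        handler = handlers.get(gesture)
        if handler:
            handler()          # unknown gestures are simply ignored
        return actions

    return dispatch

dispatch = make_dispatcher()
dispatch("fist_toward")
print(dispatch("pinch"))  # -> ['zoom in', 'click']
```

Keeping the gesture-to-action mapping in one table is what lets the same recognizer drive very different front ends - a map viewer, a file browser, a TV menu - by swapping the handlers.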






RoboRoach "models" (via Backyard Brains)


Sometimes science can be a little gruesome, especially when it concerns progress in areas such as the medical field. Other times it can be a bit on the strange side, which is the case with Backyard Brains’ current endeavor. The neuroscience grad-student team has developed a way to control cockroaches using a smartphone.


To actually control the roach, the team designed a specialized PCB (known as a backpack) that sends electrical impulses to the roach to steer its direction of movement. Roaches house neurons in their antennae that are sensitive to both touch and smell and use them for navigation. When an antenna contacts an obstacle such as a wall, it sends an electrical signal to the brain that informs the invertebrate of the object, allowing it to change direction. Backyard Brains takes advantage of those impulses by performing minor surgery and implanting electrodes (three in total) that connect to the backpack: one inside each of the roach’s antennae, plus a third inserted into the bug’s exoskeleton at the thorax, which is also used in controlling the roach’s movement.


Controlling the roach is done through Backyard Brains’ RoboRoach app (available for both iPhone and Android devices), which sends directional commands wirelessly over Bluetooth to the backpack, which in turn sends an electrical ‘spike’ to the appropriate antenna to steer the roach. According to the company’s website, the surgery (starting with anesthetizing the roach) takes around 45 minutes, and after a few hours of recovery, the cyborg roach is ready to go.


Control lasts for only a few minutes before the roach adapts and overcomes the artificial impulses; however, the bug will ‘forget’ its adaptation after about 20 minutes of down-time in its cage. After a few days (2 to 7) the roach adapts completely to the electrical impulses and control is no longer possible, after which the electrodes and backpack can safely be removed with no lasting side effects.


Backyard Brains is looking for funding on Kickstarter to develop a new streamlined PCB backpack and has so far received over $7,000 US toward a goal of $10,000. Those pledging $100 or more will receive the RoboRoach kit, which includes three electrode sets, the backpack and a 1632 RoboRoach battery. The company also sells South American roaches by the box (for $24) but states that the kit can be used on crickets as well.


While the RoboRoach control kit may sound like fun to most users, others frown on the whole idea of surgically altered insects. PETA (believe it or not) released a statement calling the surgical process ‘retrogressive and morally dubious,’ even though the kit was designed for educational purposes in the neuroscience field. The question remains, however, whether actual medical progress can be made by controlling roaches wirelessly.
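The ‘spike’ the backpack delivers is, in practice, a short train of voltage pulses to one antenna electrode. The sketch below only computes pulse timing from a frequency and duration; the specific numbers are illustrative assumptions, since the article gives no stimulation parameters (Backyard Brains lets users tune them in the app).

```python
def pulse_train(frequency_hz, duration_s):
    """Compute onset times (seconds) for a stimulation pulse train.

    frequency_hz and duration_s are illustrative parameters only --
    the values the RoboRoach backpack actually uses are user-tunable
    and not stated in the article.
    """
    period = 1.0 / frequency_hz                       # time between pulses
    n_pulses = int(round(duration_s * frequency_hz))  # pulses that fit
    return [round(i * period, 4) for i in range(n_pulses)]

# A 50 Hz train lasting 0.1 s gives 5 pulses spaced 20 ms apart
print(pulse_train(50, 0.1))  # -> [0.0, 0.02, 0.04, 0.06, 0.08]
```

Because the antenna neurons habituate, repeating an identical train is exactly what the roach learns to ignore after a few minutes - which is why control fades until the bug has had its down-time.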

Like a hero from a sci-fi film, all I want is for the RoboRoach to escape its overlord’s control.






Think this is cruel? It’s only the surface. Check out more insect morbidity below:

The insect surveillance vehicle, infesting in the near future

Tweet controlled insect - Twitter Roach - an art exhibit

Energy scavenging from insects

Moth Learns to Drive Robot
