The team designed a 3D-printed glove with a SLIMS sensor near each finger, powered by a lithium battery and equipped with Bluetooth. (Image Credit: Cornell University)
Researchers at Cornell University have developed fiber-optic sensors that use inexpensive LEDs and dyes to form a stretchable “skin” capable of detecting pressure, bending, and strain. The sensor could find applications in soft robotic systems and give augmented reality users the ability to feel the sensations that mammals rely on to navigate the real world. By providing a sense of touch, the sensor could also measure forces in physical therapy and sports medicine.
By imitating the skin’s tactile sensation, machines could physically feel and explore their environment. This could be utilized in surgical robots that feel their way through a body and in robotic prostheses that provide the wearer with a sense of touch. It could also offer new types of data for physical therapy, such as the forces a patient applies while exercising. This type of therapy may involve motion-tracking to observe a patient’s movement; however, there hasn’t been a reliable way to measure the forces involved.
The researchers found inspiration from silica-based distributed fiber optic sensors capable of detecting minor wavelength shifts to identify multiple properties, including changes in humidity, temperature, and strain.
One downside is that silica fibers are incompatible with soft, stretchable electronics. Instead, the team created a stretchable lightguide for multimodal sensing (SLIMS): a long tube containing a pair of polyurethane elastomeric cores. One core is transparent; the other is filled with absorbing dyes at several locations and connects to an LED. Both cores are coupled to a red-green-blue sensor chip, which registers geometric changes in the light’s optical path.
The SLIMS sensors can detect pressure, bending, and strain, and pinpoint the exact location and magnitude of each deformation. (Image Credit: Cornell University)
The dual-core design boosts the number of outputs the sensor can use to distinguish a range of deformations, such as pressure, bending, or elongation. Deformations are registered through the dyes, which act as spatial encoders. The researchers paired the technology with a mathematical model that decouples the distinct deformations and identifies their exact location and magnitude. Because a SLIMS sensor can function with tiny, lower-resolution optoelectronics, it is inexpensive, simple to manufacture, and easy to incorporate into systems. The newly developed sensor could, for example, be installed in a robotic hand to detect slippage.
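The decoupling idea can be illustrated with a toy linear model. This is a hypothetical simplification for intuition, not the team's published model: assume each dyed region attenuates the red, green, and blue channels with a distinct, known signature, so the per-channel intensity drops measured at the RGB chip can be inverted to recover which region deformed and by how much.

```python
import numpy as np

# Toy spatial-encoding model (assumed, for illustration only): row i of A is
# how strongly a deformation at dyed region i absorbs the R, G, B channels.
# A deformation vector x (magnitude per region) produces intensity drops
# y = A @ x at the sensor chip; least squares inverts this to recover x.
A = np.array([
    [0.9, 0.1, 0.2],   # region 0 absorbs mostly red
    [0.1, 0.8, 0.3],   # region 1 absorbs mostly green
    [0.2, 0.2, 0.7],   # region 2 absorbs mostly blue
]).T                   # columns: one signature per region

x_true = np.array([0.0, 0.5, 0.0])   # press on region 1 with magnitude 0.5
y = A @ x_true                        # simulated per-channel intensity drops

# Decouple: solve y = A x for x, then read off location and magnitude
x_est, *_ = np.linalg.lstsq(A, y, rcond=None)
region = int(np.argmax(x_est))
print(f"deformation at region {region}, magnitude {x_est[region]:.2f}")
# → deformation at region 1, magnitude 0.50
```

With more dyed regions than color channels, a single RGB reading is no longer enough on its own, which is one reason a multi-output (dual-core) design helps: it adds independent measurements to keep the inversion well-posed.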
The team also created a 3D-printed glove, powered by a lithium battery, with a SLIMS sensor near each finger. It’s also outfitted with Bluetooth, allowing it to transmit data to software that reconstructs the glove’s movements and deformations in real time.
“Right now, sensing is done mostly by vision,” Rob Shepherd, associate professor of mechanical and aerospace engineering in the College of Engineering, said. “We hardly ever measure touch in real life. This skin is a way to allow ourselves and machines to measure tactile interactions in a way that we now currently use the cameras in our phones. It’s using vision to measure touch. This is the most convenient and practical way to do it in a scalable way.”
The team is exploring how the SLIMS sensors can be used to enhance virtual and augmented reality experiences.
“VR and AR immersion is based on motion capture. Touch is barely there at all,” Shepherd said. “Let’s say you want to have an augmented reality simulation that teaches you how to fix your car or change a tire. If you had a glove or something that could measure pressure, as well as motion, that augmented reality visualization could say, ‘Turn and then stop, so you don’t overtighten your lug nuts.’ There’s nothing out there that does that right now, but this is an avenue to do it.”
Have a story tip? Message me at: http://twitter.com/Cabe_Atwell