
A stretchable skin sensor gives robots a human sense of touch

Wednesday, 18 November 2020 08:37

A 3D-printed glove lined with stretchable fiber-optic sensors that use light to detect a range of deformations in real time. (Photo: Cornell U.)

 

It's no exaggeration to say that stretchable sensors could change the way soft robots function and feel. In fact, they will be able to feel quite a bit.

 

Researchers at Cornell University have created a fiber-optic sensor that combines inexpensive LEDs and dyes, resulting in a stretchable "skin" that detects deformations such as pressure, bending and strain. This sensor could give soft robotic systems - and anyone using augmented reality technology - the ability to feel the same rich tactile sensations that mammals rely on to navigate the natural world.

 

The researchers, led by Rob Shepherd, associate professor of mechanical and aerospace engineering, are working to commercialize the technology for physical therapy and sports medicine.

 

Their work was published in the journal Science. The co-authors of the article were PhD students Hedan Bai and Shuo Li.

 

Drawing inspiration from silica-based distributed fiber-optic sensors, Bai developed a stretchable lightguide for multimodal sensing (SLIMS). This long tube contains a pair of elastomeric polyurethane cores. One core is transparent; the other is filled with absorbing dyes at multiple locations and connects to an LED. Each core is coupled to a red-green-blue sensor chip to register geometric changes in the optical path of light.
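
To make that idea concrete, here is a minimal Python sketch, an illustration under assumptions rather than the published SLIMS design, of how dimming in individual color channels of the dyed core, compared against the transparent core, could indicate which dyed region has been deformed. The region-to-channel mapping, the 5% threshold and all function names are hypothetical.

    # Hypothetical sketch (not the Cornell implementation): attribute a drop in
    # one color channel of the dyed core, relative to the clear core, to a
    # particular dyed region along the lightguide.
    import numpy as np

    # Assumed layout: three dyed regions, each absorbing mainly one channel
    # (region 0 -> red, region 1 -> green, region 2 -> blue).
    REGION_CHANNELS = {0: 0, 1: 1, 2: 2}

    def channel_attenuation(dyed_rgb, clear_rgb):
        """Fraction of light lost in the dyed core relative to the clear core."""
        dyed = np.asarray(dyed_rgb, dtype=float)
        clear = np.asarray(clear_rgb, dtype=float)
        return 1.0 - dyed / np.clip(clear, 1e-9, None)

    def locate_deformation(dyed_rgb, clear_rgb, threshold=0.05):
        """Return the dyed regions whose channel dims past the threshold."""
        loss = channel_attenuation(dyed_rgb, clear_rgb)
        return [region for region, ch in REGION_CHANNELS.items() if loss[ch] > threshold]

    clear = [0.90, 0.88, 0.91]   # intensities read from the transparent core
    dyed = [0.80, 0.87, 0.90]    # red channel dimmed -> region 0 is deformed
    print(locate_deformation(dyed, clear))   # prints [0]

In this toy version the transparent core acts as the reference, so a drop in one channel can be attributed to a particular dyed region rather than to overall light loss.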

 

The researchers designed a 3D-printed glove with a SLIMS sensor on each finger. The glove is powered by a lithium battery and equipped with Bluetooth so that it can transmit data to software, designed by Bai, that reconstructs the glove's movements and deformations in real time.
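
The data path just described, per-finger readings streamed over Bluetooth to software that rebuilds the glove's state, might look roughly like the sketch below. The packet layout, the calibration constants and the read_packets() stand-in for the Bluetooth link are assumptions for illustration, not details of Bai's software.

    # Hypothetical sketch of the glove's data path: per-finger RGB readings
    # arrive over a wireless link and are turned into a crude bend estimate.
    import struct
    from typing import Iterator

    FINGERS = 5
    PACKET_FMT = "<" + "fff" * FINGERS        # assumed layout: three RGB floats per finger

    def read_packets() -> Iterator[bytes]:
        """Stand-in for the Bluetooth stream: yields two synthetic packets."""
        relaxed = [0.9, 0.9, 0.9] * FINGERS
        index_bent = list(relaxed)
        index_bent[3] = 0.6                   # dim the red channel on finger 1
        for frame in (relaxed, index_bent):
            yield struct.pack(PACKET_FMT, *frame)

    def bend_estimate(rgb, baseline=0.9):
        """Map attenuation of the most-dimmed channel to a 0..1 bend value."""
        loss = max(0.0, (baseline - min(rgb)) / baseline)
        return round(min(1.0, loss / 0.5), 2)  # assumed: full bend at 50% light loss

    for packet in read_packets():
        values = struct.unpack(PACKET_FMT, packet)
        fingers = [values[i * 3:(i + 1) * 3] for i in range(FINGERS)]
        print([bend_estimate(rgb) for rgb in fingers])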

 

"Right now, detection is mostly done through vision," Shepherd said. "We hardly measure touch in real life. This skin is a way of allowing ourselves and machines to measure touch interactions in the way that we now use our phone cameras. It's using vision to measure touch. This is the most convenient and practical way to do it in a scalable way."

 

Bai and Shepherd are working with Cornell's Center for Technology Licensing to patent the technology, with an eye toward applications in physical therapy and sports medicine. Both fields have taken advantage of motion-tracking technology but have so far lacked the ability to capture force interactions.

 

Researchers are also investigating the ways that SLIMS sensors can power augmented and virtual reality experiences.

 

"The immersion in VR and AR is all about motion capture. The touch is hardly there," Shepherd said. "Suppose you want to have an augmented reality simulation that teaches you how to fix your car or change a tire. If you had a glove or something that could measure pressure, as well as movement, that augmented reality display might say, 'Turn and Then stop, so you don't over-tighten the nuts. 'There's nothing out there that does that right now, but this is one way to do it." (Source: NCYT Amazings)

 

 

Source: Redacción, N., 2020. Un Sensor Cutáneo Estirable Proporciona A Los Robots Esta Sensación Humana. [online] Noticias de la Ciencia y la Tecnología (Amazings® / NCYT®). Available at: <https://noticiasdelaciencia.com/art/40183/un-sensor-cutaneo-estirable-proporciona-a-los-robots-esta-sensacion-humana> [Accessed 17 November 2020].


At Ineltec we offer tailor-made solutions to perform any kind of test, analysing every detail so we can provide a tailored answer that is also the most efficient and affordable solution for our client. If you need more information, don't hesitate to contact us.

 

If you have suggestions about our blog or need information about our equipment, do not hesitate to contact us.

Commercial, Marketing and Communication Department


Tel: (+34) 938.605.100