Optical lace


This Light-based Nervous System Helps Robots 'Feel'

#artificialintelligence

Last night, way past midnight, I stumbled onto my porch blindly grasping for my keys after a hellish day of international travel. Lights were low, I was half-asleep, yet my hand grabbed the keychain, found the lock, and opened the door. Thanks to the intricate wiring between our brain and millions of sensors dotted on--and inside--our skin, we know exactly where our hand is in space and what it's touching without needing visual confirmation. But this combined sense of the internal and the external is completely lost to robots, which generally rely on computer vision or surface mechanosensors to track their movements and their interaction with the outside world. What if, instead, we could give robots an artificial nervous system?


Nerve-like mesh could give robots a sense of touch more delicate than SKIN on the human back

Daily Mail - Science & tech

A synthetic mesh could give robots a sense of touch as delicate as the skin on our backs, researchers have claimed. The material forms a linked sensory network similar to that of a biological nervous system -- one that could help robots feel their interactions with the environment. The lattice is made of flexible polyurethane containing stretchable optical fibres with sensors that can detect how the fibres are being deformed. The device -- a sort of stretchable optical lace -- was developed by roboticists Patricia Xu and Rob Shepherd of Cornell University and colleagues. 'We want to have a way to measure stresses and strains for highly deformable objects, and we want to do it using the hardware itself, not vision,' said Professor Shepherd.


Nerve-like 'optical lace' gives robots a human touch

#artificialintelligence

The stretchable optical lace material was developed by Ph.D. student Patricia Xu through the Organics Robotics Lab at Cornell University. "We want to have a way to measure stresses and strains for highly deformable objects, and we want to do it using the hardware itself, not vision," said lab director Rob Shepherd, associate professor of mechanical and aerospace engineering and the paper's senior author. "A good way to think about it is from a biological perspective. A blind person can still feel because they have sensors in their fingers that deform when their finger deforms. Robots don't have that right now." Shepherd's lab previously created sensory foams that used optical fibers to detect such deformations.
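The articles describe the core sensing idea: when a stretchable optical fibre deforms, the light it transmits attenuates, and that loss can be read back as strain. As a rough illustration only, the toy model below sketches that mapping with a made-up linear loss coefficient; the function name, signature, and `sensitivity` parameter are hypothetical and are not taken from the Cornell work.

```python
# Toy model (hypothetical, not the authors' method): estimate strain in a
# stretchable optical fibre from the drop in transmitted light intensity,
# assuming a simple linear relationship between elongation and optical loss.

def estimate_strain(i_rest: float, i_measured: float,
                    sensitivity: float = 2.0) -> float:
    """Return an approximate strain (fractional elongation).

    i_rest      -- transmitted intensity with the fibre at rest
    i_measured  -- transmitted intensity under deformation
    sensitivity -- assumed loss-per-unit-strain coefficient (illustrative)
    """
    if i_measured > i_rest:
        raise ValueError("measured intensity should not exceed the rest value")
    fractional_loss = (i_rest - i_measured) / i_rest
    return fractional_loss / sensitivity

# Under these assumptions, a 10% drop in transmitted light maps to 5% strain.
print(estimate_strain(1.0, 0.9))
```

In a real optical lace, many such fibres form a lattice, so the robot can localise where a deformation occurred as well as how large it is; this sketch covers only a single fibre.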