Soft Robots Integrated with Optical Lace Have Increased Sensory Ability

Soft robots could be made even softer with the help of a novel stretchable optical lace. The artificial material forms a linked sensory network akin to a biological nervous system, allowing robots to sense how they interact with their environment and adjust their actions accordingly.

A flexible, porous lattice structure is threaded with stretchable optical fibers containing more than a dozen mechanosensors and illuminated by an LED. When the lattice is pressed, the sensors pinpoint changes in the photon flow. (Image credit: Cornell University)

The artificial material was developed by PhD student Patricia Xu in the Organic Robotics Lab. The lab’s paper, “Optical Lace for Synthetic Afferent Neural Networks,” was published in Science Robotics on September 11, 2019.

We want to have a way to measure stresses and strains for highly deformable objects, and we want to do it using the hardware itself, not vision. A good way to think about it is from a biological perspective. A person can still feel their environment with their eyes closed, because they have sensors in their fingers that deform when their finger deforms. Robots can’t do that right now.

Rob Shepherd, Lab Director and Associate Professor, Sibley School of Mechanical and Aerospace Engineering, Cornell University

Shepherd is the study’s senior author. His lab had previously produced sensory foams that used optical fibers to detect such deformations. For the optical lace project, Xu used a flexible, porous lattice structure 3D-printed from polyurethane.

She threaded the core of this structure with stretchable optical fibers containing more than a dozen mechanosensors, then illuminated the fibers with an LED. When Xu pressed the lattice at different points, the sensors registered the resulting variations in photon flow, pinpointing where the deformation occurred.

“When the structure deforms, you have contact between the input line and the output lines, and the light jumps into these output loops in the structure, so you can tell where the contact is happening,” Xu stated. “The intensity of this determines the intensity of the deformation itself.”
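The localization principle Xu describes is simple enough to sketch in code. The snippet below is a minimal Python illustration, not the authors’ implementation: the output loop that receives the most coupled light marks the contact point, and the received intensity scales with the depth of the press. The loop count and calibration constant are assumptions made for the example.

```python
# Minimal sketch (not the authors' code) of the localization principle:
# a press couples light from the input fiber into nearby output loops,
# so the loop receiving the most light marks the contact point, and its
# intensity scales with the depth of the press. The loop count and the
# calibration constant below are illustrative assumptions.

def locate_press(output_intensities, intensity_per_mm=0.12):
    """Return (loop_index, estimated_depth_mm) from output-loop readings."""
    peak = max(range(len(output_intensities)),
               key=lambda i: output_intensities[i])
    return peak, output_intensities[peak] / intensity_per_mm

# Example: twelve output loops, with a press centered above loop 7.
readings = [0.0] * 12
readings[6], readings[7], readings[8] = 0.30, 0.90, 0.25  # coupled light
print(locate_press(readings))  # -> (7, 7.5)
```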

Shepherd said the optical lace would be used not as a skin coating for robots, but more like the flesh itself. Robots built with the material would be radically different from the cold, callous cyborgs of dystopian science fiction, and would be well suited to manufacturing and health care, particularly beginning-of-life and end-of-life care, he said.

There’s a lot of people getting older, and there’s not as many young people to take care of them. So the idea that robots could help take care of the elderly is very real.

Rob Shepherd

Shepherd added, “The robot would need to know its own shape in order to touch and hold and assist elderly people without damaging them. The same is true if you’re using a robot to assist in manufacturing. If they can feel what they’re touching, then that will improve their accuracy.”

Although the new optical lace is not as sensitive as, say, a human fingertip, which is densely packed with nerve receptors, the material is more responsive to touch than the skin of the human back. It can also be washed, which suggests another application: Shepherd’s lab has founded a startup company to market Xu’s sensors in garments that can track a person’s shape and movements for augmented reality training.

The researchers also plan to explore whether machine learning could identify more intricate deformations, such as twisting and bending.

I make models to calculate where the structure is being touched, and how much is being touched. But in the future, when you have 30 times more sensors and they’re spread randomly throughout, that will be a lot harder to do. Machine learning will do it much faster.

Patricia Xu, PhD Student, Organic Robotics Lab, Cornell University
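As a rough illustration of that direction, the sketch below (not the published method) trains a small neural-network regressor to invert the sensing problem: readings from many scattered sensors are mapped back to a simulated press’s position and depth. The sensor layout, Gaussian response model, and network settings are all assumptions made for the example.

```python
# Hedged sketch of the machine-learning direction Xu describes, not the
# published method: train a regressor that maps readings from many
# scattered sensors back to a press's position and depth. The sensor
# count, synthetic data model, and network settings are all assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n_sensors, n_samples = 30, 2000
sensor_pos = rng.uniform(0.0, 1.0, n_sensors)    # sensors scattered along [0, 1]

# Simulate presses: a press at position x with depth d excites nearby
# sensors with a Gaussian falloff, plus readout noise.
presses = rng.uniform(0.0, 1.0, (n_samples, 2))  # columns: position, depth
X = presses[:, 1:2] * np.exp(-((sensor_pos - presses[:, :1]) ** 2) / 0.005)
X += rng.normal(0.0, 0.01, X.shape)

# Learn the inverse map: sensor readings -> (position, depth).
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=1000, random_state=0)
model.fit(X, presses)
print(model.predict(X[:1]), presses[0])  # sanity check on one training sample
```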

Even as robots become more aware of how they interact with their environment, humans retain a clear advantage over them.

“We don’t know our own bodies as well as some robots know parts of their limbs and joints and fingers,” Shepherd said. “But we can still perform better than robots generally because we can deal with uncertainty through mechanical design, as well as neural architecture.”

The paper was co-authored by postdoctoral researcher Anand Kumar Mishra, PhD students Lillia Bai and Cameron Aubin, and Letizia Zullo from the Center for Synaptic Neuroscience and Technology, Istituto Italiano di Tecnologia.

The study was supported by grants from the Air Force Office of Scientific Research and the National Institutes of Health.
