
SonicSense: A New System for Human-Like Robotic Interaction

New research from Duke University, set to be presented at the Conference on Robot Learning (CoRL 2024) in Munich, Germany, from November 6-9, 2024, introduces a system called SonicSense. This system enables robots to interact with their environment in ways that were previously only accessible to humans.

The ability to feel acoustic vibrations through tactile interactions gives this robotic hand a human-like sense of touch to better perceive the world. Image Credit: Pratt School of Engineering

Imagine sitting in a dark movie theater and wondering how much soda is left in your large cup. Instead of removing the cap to check, you pick it up and give it a slight shake, listening for the rattle of the ice. This helps you gauge if it is time for a refill. Afterward, you might tap the armrest, notice the hollow echo, and conclude that it is made of plastic rather than wood.

Without realizing it, you use acoustic vibrations to understand the world around you. Now, researchers are close to giving robots this same capability, complementing their growing range of sensory skills.

Robots today mostly rely on vision to interpret the world. We wanted to create a solution that could work with complex and diverse objects found on a daily basis, giving robots a much richer ability to ‘feel’ and understand the world.

Jiaxun Liu, Study Lead Author and MEMS Ph.D. Student, Duke University

SonicSense features a robotic hand with four fingers, each embedded with a contact microphone at the tip. These sensors pick up and record vibrations when the robot taps, grasps, or shakes an object. Since the microphones are in direct contact with the object, the system can filter out background noise.

By extracting frequency features from these signals, SonicSense combines its prior knowledge with advanced AI algorithms to identify the material composition and three-dimensional shape of the object. If the object is unfamiliar, the robot may need up to 20 interactions to make a determination, but if the object is already in its database, it can correctly identify it in as few as four.
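To make the idea concrete, here is a minimal, hypothetical sketch of that kind of pipeline: a tap or shake recording is reduced to per-band frequency energies and matched against stored signatures with a nearest-neighbor lookup. The function names, the 64-band split, and the toy database are illustrative assumptions, not the team's actual implementation.

```python
# Hypothetical sketch (not the authors' code): reduce a contact-microphone
# recording to frequency features and compare it against known objects.
import numpy as np

def frequency_features(signal, n_bins=64):
    """Summarize a tap/shake recording as average energy in n_bins frequency bands."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2           # power spectrum of the recording
    bands = np.array_split(spectrum, n_bins)               # group neighboring frequencies
    return np.array([band.mean() for band in bands])       # one energy value per band

def identify(features, database):
    """Return the stored object whose feature vector is closest to this recording."""
    names = list(database.keys())
    distances = [np.linalg.norm(features - database[name]) for name in names]
    return names[int(np.argmin(distances))]

# Example: match a one-second recording at 16 kHz against two stored objects.
sample_rate = 16000
recording = np.random.randn(sample_rate)                    # stand-in for real microphone data
database = {
    "plastic cup": np.random.rand(64),
    "ceramic mug": np.random.rand(64),
}
print(identify(frequency_features(recording), database))
```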

SonicSense gives robots a new way to hear and feel, much like humans, which can transform how current robots perceive and interact with objects. While vision is essential, sound adds layers of information that can reveal things the eye might miss.

Boyuan Chen, Assistant Professor, Department of Mechanical Engineering and Materials Science, Duke University

Chen and his team demonstrated several capabilities enabled by SonicSense in their study. For example, the system can count the number of dice inside a box and determine their shape simply by shaking or turning the box. It can also determine how much liquid a bottle contains by performing the same procedure.

Additionally, SonicSense can tap the exterior of an object to create a 3D reconstruction of its shape and identify its material—mimicking how humans examine objects in the dark. While this approach has been explored before, SonicSense surpasses previous attempts by using touch-based microphones to block out background noise, employing four fingers instead of one, and integrating advanced AI techniques.
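As a rough illustration of the tap-based reconstruction idea, the sketch below simply collects the fingertip's 3D position at each contact into a point cloud that approximates the object's surface. The class name and example coordinates are invented for illustration; SonicSense's actual reconstruction and material identification are considerably more sophisticated.

```python
# Hypothetical sketch (not the authors' method): accumulate fingertip positions
# recorded at each tap into a point cloud that approximates the object's surface.
import numpy as np

class TapReconstruction:
    def __init__(self):
        self.points = []                      # 3D contact points gathered so far

    def add_tap(self, fingertip_position):
        """Store the 3D position of the fingertip at the moment of contact."""
        self.points.append(np.asarray(fingertip_position, dtype=float))

    def point_cloud(self):
        """Return all contact points as an (N, 3) array, a rough surface model."""
        return np.vstack(self.points) if self.points else np.empty((0, 3))

# Example: three taps on different sides of an object.
recon = TapReconstruction()
for position in [(0.10, 0.02, 0.05), (0.08, -0.03, 0.06), (0.11, 0.00, 0.09)]:
    recon.add_tap(position)
print(recon.point_cloud().shape)   # (3, 3)
```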

This combination allows SonicSense to handle objects with complex geometries, reflective or transparent surfaces, and materials that typically challenge vision-based systems.

Liu added, “While most datasets are collected in controlled lab settings or with human intervention, we needed our robot to interact with objects independently in an open lab environment. It is difficult to replicate that level of complexity in simulations. This gap between controlled and real-world data is critical, and SonicSense bridges that by enabling robots to interact directly with the diverse, messy realities of the physical world.”

SonicSense provides a robust foundation for teaching robots to perceive objects in dynamic, unstructured environments. One of its notable advantages is its cost-effectiveness—construction costs are kept at just over $200 by using commercially available components, such as contact microphones typically used by musicians to record guitar sound, alongside 3D printing techniques.

The team’s next objective is to enhance the system’s ability to interact with a wider range of objects. By integrating object-tracking algorithms, SonicSense will help robots manage cluttered, dynamic environments, potentially surpassing human adaptability in certain tasks.

Additionally, the design of the robotic hand itself offers another avenue for improving the system's overall performance.

Chen concluded, “This is only the beginning. In the future, we envision SonicSense being used in more advanced robotic hands with dexterous manipulation skills, allowing robots to perform tasks that require a nuanced sense of touch. We are excited to explore how this technology can be further developed to integrate multiple sensory modalities, such as pressure and temperature, for even more complex interactions.”

The study was supported by the Army Research Laboratory STRONG program (W911NF2320182, W911NF2220113), DARPA’s FoundSci program (HR00112490372), and TIAMAT (HR00112490419).


SonicSense: Object perception from in-hand acoustic vibration. Video Credit: Pratt School of Engineering
