Intelligent Robots Get a Grip on Garbage Sorting

Researchers from Tsinghua University have developed an intelligent robot tactile system to sort garbage. This work has been published in Applied Physics Reviews by AIP Publishing.

Successful garbage sorting by an intelligent robot tactile system. Image Credit: Qian Mao and Rong Zhu

Modern intelligent robots can accurately recognize many objects through vision and touch. They can also identify objects they have handled in the past by using machine learning algorithms and tactile data collected by sensors.

However, these senses are easily confused when presented with objects that are similar in size and shape, or that the robot has never encountered. Robot perception is further limited by background noise and by objects of the same type that differ in size and shape.

Humans possess many types of touch sensing, one of which is thermal feeling. This enables us to perceive hot and cold, feel the wind blowing, and distinguish between different types of matter, like metal and wood, because of the various cooling sensations produced. The researchers hoped to imitate this capacity by creating a robotic tactile sensing technique that included thermal sensations for more reliable and accurate object detection.

We propose utilizing spatiotemporal tactile sensing during hand grasping to extend the robotic function and ability to simultaneously perceive multi-attributes of the grasped object, including thermal conductivity, thermal diffusivity, surface roughness, contact pressure, and temperature.

Rong Zhu, Study Author, Tsinghua University
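
For a concrete picture of what "multi-attribute" sensing means here, the short Python sketch below (not from the paper) shows one way the five grasp attributes named in the quote could be bundled into a feature vector for a downstream classifier; the class name, field names, and units are illustrative assumptions.

# A minimal sketch (not the authors' code) of how the five grasp attributes
# named in the quote might be packaged for a classifier. All names and the
# units in comments are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class TactileReading:
    """One spatiotemporal sample recorded while the hand grasps an object."""
    thermal_conductivity: float  # assumed W/(m*K), inferred from the heating response
    thermal_diffusivity: float   # assumed m^2/s, how quickly heat spreads into the object
    surface_roughness: float     # dimensionless texture measure from contact signals
    contact_pressure: float      # assumed kPa at the sensor surface
    temperature: float           # assumed degrees Celsius of the grasped object


def feature_vector(reading: TactileReading) -> list[float]:
    """Flatten one reading into a fixed-order feature vector."""
    return [
        reading.thermal_conductivity,
        reading.thermal_diffusivity,
        reading.surface_roughness,
        reading.contact_pressure,
        reading.temperature,
    ]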

The group developed a layered sensor with material detection at the surface, a heat-sensitive porous middle layer, and pressure sensing at the bottom. The sensor works with an efficient cascade classification algorithm that filters out object types step by step, starting with easily separated categories such as empty cartons and progressing to finer distinctions such as orange peels or cloth.
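
The sketch below illustrates, under stated assumptions, what such a coarse-to-fine cascade could look like in code. The stage order, thresholds, and labels are hypothetical; this is not the authors' published algorithm.

# A minimal sketch of a coarse-to-fine cascade classifier like the one
# described above. Stage order, thresholds, and labels are assumptions.
from typing import Callable, Optional

# Each stage inspects the tactile features and either returns a label
# (classification stops) or None (the sample falls through to a finer stage).
Stage = Callable[[dict], Optional[str]]

def stage_empty_carton(f: dict) -> Optional[str]:
    # Hollow, light objects: very low contact pressure and poor heat conduction.
    if f["contact_pressure"] < 5.0 and f["thermal_conductivity"] < 0.1:
        return "empty carton"
    return None

def stage_orange_peel(f: dict) -> Optional[str]:
    # Moist, textured surface: rougher and more conductive than dry paper or cloth.
    if f["surface_roughness"] > 0.6 and f["thermal_conductivity"] > 0.3:
        return "orange peel"
    return None

def stage_cloth(f: dict) -> Optional[str]:
    # Soft insulator: noticeable texture, but heat spreads into it very slowly.
    if f["surface_roughness"] > 0.4 and f["thermal_diffusivity"] < 1e-7:
        return "cloth"
    return None

CASCADE: list[Stage] = [stage_empty_carton, stage_orange_peel, stage_cloth]

def classify(features: dict) -> str:
    # Run the stages from coarse to fine; the first stage that matches decides.
    for stage in CASCADE:
        label = stage(features)
        if label is not None:
            return label
    return "unknown"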

The group created an intelligent, tactile robot system to sort trash. The robot collected a variety of typical trash items, such as empty cartons, leftover bread, plastic bags, bottles, napkins, sponges, orange peels, and expired medications, and divided them into separate bins for hazardous waste, food scraps, recyclables, and other waste. The system recognized trash objects it had never seen before, with a classification accuracy of 98.85%. This effective trash-sorting behavior has a wide range of applications in smart-living technologies and could significantly reduce human labor in real-world settings.
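
As a purely illustrative sketch, the mapping below assigns the items named above to the four bins; the item-to-bin pairings are assumptions for illustration, not data from the study.

# Illustrative only: one plausible mapping from recognized items to the four
# bins named above. The item-to-bin pairings are assumptions, not study data.
BIN_FOR_ITEM = {
    "expired medication": "hazardous waste",
    "leftover bread": "food scraps",
    "orange peel": "food scraps",
    "empty carton": "recyclables",
    "bottle": "recyclables",
    "plastic bag": "other waste",
    "napkin": "other waste",
    "sponge": "other waste",
}

def route_to_bin(item: str) -> str:
    # Unrecognized items default to the general "other waste" bin.
    return BIN_FOR_ITEM.get(item, "other waste")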

Improving autonomous implementation and robotic embodied intelligence will be the main goals of this field’s future research.

In addition, by combining this sensor with brain-computer interface technology, tactile information collected by the sensor could be converted into neural signals acceptable to the human brain, re-empowering tactile perception capabilities for people with hand disabilities.

 Rong Zhu, Study Author, Tsinghua University

Journal Reference:

Mao, Q., & Zhu, R. (2024). Enhanced robotic tactile perception with spatiotemporal sensing and logical reasoning for robust object recognition. Applied Physics Reviews. https://doi.org/10.1063/5.0176343
