
Innovative Robotic Hand for Amputees Combines User and Automation Controls

Researchers at EPFL are developing new methods for improved control of robotic hands, in particular for amputees, that combine individual finger control with automation for enhanced grasping and manipulation.

Artoni Fiorenzo tries out shared control with the robotic arm. (Image credit: 2019 EPFL/Alain Herzog)

This interdisciplinary proof of concept, bridging robotics and neuroengineering, was successfully tested on three amputees and seven healthy study participants. The results have been published in the latest issue of Nature Machine Intelligence.

The new technology combines concepts from two distinct fields. Implementing them together for robotic hand control had never been achieved before, and the work contributes to the emerging field of shared control in neuroprosthetics.

One concept, from neuroengineering, involves decoding intended finger movements from muscular activity on the amputee’s stump to drive individual fingers of the prosthetic hand, something that had not been achieved before. The other, from robotics, enables the robotic hand to help take hold of objects and maintain contact with them for robust grasping.

When you hold an object in your hand, and it starts to slip, you only have a couple of milliseconds to react. The robotic hand has the ability to react within 400 milliseconds. Equipped with pressure sensors all along the fingers, it can react and stabilize the object before the brain can actually perceive that the object is slipping.

Aude Billard, Head of Learning Algorithms and Systems Laboratory, EPFL
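Billard’s description amounts to a fast reflex loop running on the hand itself. Purely as an illustration, the sketch below shows how such a slip-detection reflex could be structured; the sensor and actuator interfaces, the threshold value, and the force increment are all assumptions, not details of EPFL’s controller.

```python
SLIP_THRESHOLD = 0.15  # assumed drop in fingertip force (N) that signals a slip

def reflex_step(previous_forces, finger_sensors, finger_actuators):
    """One cycle of a hypothetical slip-detection reflex.

    Compares each fingertip's current pressure reading with the previous one;
    a sudden drop suggests the object is sliding, so the corresponding finger
    tightens its grip before the user could consciously react.
    Returns the current readings for use in the next cycle.
    """
    current_forces = [sensor.read() for sensor in finger_sensors]
    for finger, (prev, cur) in enumerate(zip(previous_forces, current_forces)):
        if prev - cur > SLIP_THRESHOLD:
            # Object appears to be slipping at this fingertip: re-stabilize it.
            finger_actuators[finger].increase_force(delta=0.5)
    return current_forces
```

Running a cycle like this at, say, 100 Hz would leave ample margin inside the 400-millisecond reaction window described above.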

How Shared Control Works

The algorithm first learns how to decode user intention and translate it into finger movements of the prosthetic hand. To train the machine-learning algorithm, the amputee performs a series of hand movements. Sensors placed on the amputee’s stump detect muscular activity, and the algorithm learns which hand movements correspond to which patterns of muscular activity.

Once the user’s intended finger movements are learned, this information can be used to control each finger of the prosthetic hand individually.
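The article does not say which learning method the team used. Purely as an illustration, the sketch below trains a generic classifier (scikit-learn’s RandomForestClassifier) to map features of windowed muscle (EMG) signals to gesture labels and then decodes new windows; the window length, feature choices, and model are assumptions rather than EPFL’s actual pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

WINDOW = 200  # samples per EMG window (assumed value)

def extract_features(emg_window):
    """Per-channel root-mean-square and mean absolute value of one EMG window."""
    rms = np.sqrt(np.mean(emg_window ** 2, axis=0))
    mav = np.mean(np.abs(emg_window), axis=0)
    return np.concatenate([rms, mav])

def train_decoder(emg_windows, gesture_labels):
    """Fit a classifier mapping muscle-activity patterns to hand gestures.

    emg_windows:    list of arrays, each shaped (WINDOW, n_channels),
                    recorded while the user performed known gestures
    gesture_labels: one integer gesture id per window
    """
    X = np.stack([extract_features(w) for w in emg_windows])
    y = np.asarray(gesture_labels)
    model = RandomForestClassifier(n_estimators=200, random_state=0)
    model.fit(X, y)
    return model

def decode_finger_command(model, emg_window):
    """Translate a new EMG window into the gesture to send to the hand."""
    return model.predict(extract_features(emg_window).reshape(1, -1))[0]
```

A real decoder would likely output proportional, per-finger commands rather than discrete gesture labels; the sketch only illustrates the training-then-decoding structure described above.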

“Because muscle signals can be noisy, we need a machine learning algorithm that extracts meaningful activity from those muscles and interprets them into movements,” said Katie Zhuang, first author of the publication.

The researchers then designed the algorithm so that robotic automation takes over when the user attempts to grip an object. The algorithm tells the prosthetic hand to close its fingers as soon as an object makes contact with the sensors on its surface.
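As a rough sketch of how that hand-over from user intent to automatic closure might be sequenced (the hand interface, method names, and contact threshold below are assumptions, not EPFL’s implementation):

```python
CONTACT_THRESHOLD = 0.05  # assumed minimum fingertip force (N) counted as contact

def shared_control_step(decoded_gesture, hand):
    """One control cycle of a hypothetical shared-control scheme.

    decoded_gesture: the finger movement inferred from the user's muscle
                     activity (e.g. by a decoder like the one sketched above)
    hand:            a hypothetical prosthetic-hand interface whose fingers
                     expose contact_force(), plus apply_gesture() and
                     close_all_fingers() commands
    """
    if any(f.contact_force() > CONTACT_THRESHOLD for f in hand.fingers):
        # An object has touched the sensors: automation takes over and
        # closes the fingers to secure the grasp.
        hand.close_all_fingers()
    else:
        # No contact yet: keep following the user's decoded intention.
        hand.apply_gesture(decoded_gesture)
```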

This automatic grasping was adapted from an earlier study on robotic arms designed to infer the shape of objects and grasp them based on tactile information alone, without the aid of visual signals.

Several challenges must still be addressed before the algorithm can be incorporated into a commercially available prosthetic hand for amputees. For the time being, the algorithm is still being tested on a robot provided by an external party.

Our shared approach to control robotic hands could be used in several neuroprosthetic applications such as bionic hand prostheses and brain-to-machine interfaces, increasing the clinical impact and usability of these devices.

Silvestro Micera, Bertarelli Foundation Chair in Translational Neuroengineering, EPFL

Micera is also a professor of bioelectronics at Scuola Superiore Sant’Anna in Italy.

A smart artificial hand for amputees merges user and robotic control

(Video credit: EPFL)
