Ray Zhao’s Breakthrough in AI-Powered Prosthetic Technology

XYZ Media, a media and marketing company at the intersection of education and technology, proudly announces the latest feature in its “Next Generation of Innovators” series, which showcases young innovators addressing global challenges. Ray Zhao, a student researcher at Mission San Jose High School in Fremont, CA, is being recognized for his work on NeuroLimbAI, a prosthetic arm powered by a sophisticated AI-driven EEG control system that introduces haptic feedback to recreate a realistic sense of touch and motion.

NeuroLimbAI tackles some of the most critical issues in modern prosthetics: limited control, lack of sensory feedback, and high costs. Ray’s AI-driven EEG-based approach leverages a non-invasive brain-computer interface that translates users' brain signals into movement commands for the prosthetic, allowing intuitive, precise control. With machine learning, NeuroLimbAI interprets EEG data to guide hand movements with impressive accuracy—achieving over 96% precision in test scenarios. This high accuracy enables real-time responses that feel more natural to the user.
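The article does not disclose NeuroLimbAI's actual decoding model, but the core idea of a non-invasive BCI, mapping windows of EEG features to movement commands, can be illustrated with a deliberately minimal sketch. Everything here (channel count, command labels, the nearest-centroid rule) is hypothetical and stands in for the far richer model the project actually uses:

```python
import numpy as np

# Hypothetical sketch: classify windowed EEG feature vectors into
# movement commands ("open", "close", "rest") with a nearest-centroid
# rule. Real BCI decoders are far more sophisticated; this only
# illustrates the signal-to-command mapping.

rng = np.random.default_rng(0)
COMMANDS = ["open", "close", "rest"]

# Simulate training data: 3 classes of 14-channel EEG band-power features.
centers = rng.normal(size=(3, 14))
X_train = np.concatenate([c + 0.1 * rng.normal(size=(50, 14)) for c in centers])
y_train = np.repeat(np.arange(3), 50)

# "Train": compute one centroid per movement class.
centroids = np.stack([X_train[y_train == k].mean(axis=0) for k in range(3)])

def decode(window: np.ndarray) -> str:
    """Map one EEG feature window to the nearest command centroid."""
    dists = np.linalg.norm(centroids - window, axis=1)
    return COMMANDS[int(np.argmin(dists))]

# A window drawn near the "close" cluster decodes as "close".
probe = centers[1] + 0.05 * rng.normal(size=14)
print(decode(probe))
```

In a real-time system, `decode` would run on each incoming feature window and its output would drive the prosthetic's actuators.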

The system’s electrotactile and vibrotactile feedback mechanisms also simulate realistic touch and pressure sensations, helping users regain spatial awareness and reduce phantom limb pain. The NeuroLimbAI prosthetic has demonstrated its effectiveness in essential tasks such as picking up and releasing objects, restoring touch sensitivity, and adjusting grip strength based on feedback. The model’s ability to filter out noise in raw EEG data was a significant breakthrough, enhancing its reliability and real-world performance.
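The article does not describe NeuroLimbAI's filtering pipeline, but one common first step in cleaning raw EEG is band-pass filtering to the motor-rhythm range, discarding slow drift and mains interference. A minimal FFT-masking sketch, with an assumed sampling rate and pass band:

```python
import numpy as np

# Hypothetical sketch of one common EEG clean-up step: band-pass
# filtering raw samples to 8-30 Hz (motor-imagery rhythms) by zeroing
# FFT bins outside that range. The sampling rate is assumed.

FS = 256  # samples per second (assumed)

def bandpass(signal: np.ndarray, lo: float = 8.0, hi: float = 30.0) -> np.ndarray:
    """Zero out frequency components outside [lo, hi] Hz."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / FS)
    spectrum[(freqs < lo) | (freqs > hi)] = 0.0
    return np.fft.irfft(spectrum, n=signal.size)

# 1 s of a 12 Hz "mu rhythm" buried in 50 Hz mains noise and DC drift.
t = np.arange(FS) / FS
raw = np.sin(2 * np.pi * 12 * t) + 0.8 * np.sin(2 * np.pi * 50 * t) + 0.5
clean = bandpass(raw)
# The 12 Hz component survives; the 50 Hz noise and DC offset are removed.
```

Production systems typically use proper IIR/FIR filters rather than FFT masking, but the goal, isolating the informative band before classification, is the same.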

“The hardest part of the project was translating brain signals into prosthetic arm movement,” said Ray Zhao. “Most previous research trained on static, pre-labeled data and applied machine learning algorithms that did not take the sequence of the signal into consideration. To overcome this, I leveraged the power of the transformer model with positional encoding to convert the brain signal into rich contextual embeddings before feeding them into the classification model. I also train and run inference on the model using live data via EMOTIV’s API.”
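The positional encoding Ray mentions is how a transformer injects sample order into otherwise order-blind embeddings. The standard sinusoidal form can be sketched as follows; the sequence length and embedding width here are illustrative, not taken from the project:

```python
import numpy as np

# Standard sinusoidal positional encoding:
#   PE[pos, 2i]   = sin(pos / 10000^(2i/d_model))
#   PE[pos, 2i+1] = cos(pos / 10000^(2i/d_model))
# Dimensions below are illustrative, not NeuroLimbAI's actual settings.

def positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Return a (seq_len, d_model) matrix of sinusoidal position codes."""
    pos = np.arange(seq_len)[:, None]            # (seq_len, 1)
    i = np.arange(d_model // 2)[None, :]         # (1, d_model/2)
    angles = pos / np.power(10000.0, 2 * i / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)                 # even dims: sine
    pe[:, 1::2] = np.cos(angles)                 # odd dims: cosine
    return pe

# Adding the encoding to a batch of EEG-sample embeddings lets the
# transformer distinguish earlier samples from later ones.
embeddings = np.random.default_rng(1).normal(size=(128, 64))
encoded = embeddings + positional_encoding(128, 64)
```

Because each position gets a unique, smoothly varying code, the downstream classifier can exploit the temporal structure of the EEG sequence rather than treating samples as an unordered set.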

The NeuroLimbAI prosthetic, designed with origami-inspired elements, mirrors the flexibility of the human hand. Its machine learning model continues to learn and adapt, promising to bring natural, accessible prosthetic solutions to users in need. Ray expressed his hopes for the project’s future: “I hope for a future where AI-based prosthetics feel like a natural extension of ourselves, restoring both movement and a sense of independence. With NeuroLimbAI, I’m working to bridge the mind and the machine, allowing for further development of prosthetics that respond as seamlessly as a natural human arm. I hope that with my research, I can redefine human capability and give people back a part of themselves.”

“Ray’s NeuroLimbAI prosthetic arm demonstrates just how much our young scientists can innovate in finding accessible healthcare solutions,” said Jordan Hayes, Director of Communications at XYZ Media. “The combination of advanced sensory feedback and affordability is a powerful example of how accessible technology can really make a difference in the future.”

“Next Generation of Innovators” highlights the achievements of young researchers turning innovative ideas into impactful solutions. Through this platform, XYZ Media seeks to inspire future leaders to create meaningful change.
