Researchers are ready to advance their tests of a novel brain-computer interface (BCI) from animals to human subjects, and the Defense Advanced Research Projects Agency just granted them more than $6 million over the next three years to get those human clinical trials under way.
A bionic prosthetic arm that is controlled by its operator's thoughts and even restores some skin sensation to the amputee went on display Thursday at a major US science conference.
More than 50 amputees worldwide, many of them military veterans whose limbs were lost in combat, have received such devices since they were first developed by US doctor Todd Kuiken in 2002.
The arm uses a technique called Targeted Muscle Reinnervation (TMR), which works by re-routing brain signals from nerves that were severed in the injury to muscles that are intact and working.
Glen Lehman, a retired US military sergeant who lost his arm in Iraq, was scheduled to demonstrate the latest technology at the annual conference of the American Association for the Advancement of Science.
"More than 20 years ago, I came to an understanding that current prostheses really fell short in their ability to enhance function and movement for amputees," said Kuiken in a statement released ahead of the presentation.
"There was a significant unmet need to improve the lives of amputees, and I wanted to develop a technology that would help," he said. A series of other efforts to test and improve on this mind-reading robotics, known as brain-computer interfaces, were also to be showcased at the conference.
Ongoing research out of the Johns Hopkins Applied Physics Laboratory and the University of Pittsburgh has already demonstrated that the team's tiny 10-by-10 array of electrodes, implanted in a monkey's brain, can record the activity of individual neurons, which is then decoded to guide a robotic arm through simple tasks such as turning doorknobs and eating marshmallows.
"Our animal studies have shown that we can interpret the messages the brain sends to make a simple robotic arm reach for an object and turn a mechanical wrist," says Andrew Schwartz, a professor of neurobiology at the Pitt School of Medicine and a senior investigator on the project. "The next step is to see not only if we can make these techniques work for people, but also if we can make the movements more complex."
Expected to launch later this year, the study will test two different types of electrodes in human participants, and the team is already brainstorming more sophisticated approaches to the technology, including a telemetry system that would enable wireless control of prosthetic arms with sensory components.
For several years, Schwartz has worked on algorithms that translate the brain's electrical activity into physical movements: he first showed that monkeys could use thought alone to move cursors on computer screens, and in 2008 demonstrated that they could also control a simple robotic arm.
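The decoding step Schwartz describes boils down to mapping recorded firing rates onto movement commands. The sketch below is not the team's actual algorithm; it is a minimal, population-vector-style linear decoder of the sort long associated with this line of research. The cosine-tuning model, array sizes and gain term are illustrative assumptions, and the calibration data here is random stand-in data rather than real recordings.

```python
# Minimal population-vector-style decoder (illustrative only).
# Assumes each recorded unit has a "preferred direction" and that its firing
# rate varies roughly with the cosine of the angle between that direction and
# the intended movement -- the classic tuning model behind early reach decoders.

import numpy as np

def fit_preferred_directions(rates, velocities):
    """Least-squares fit of a cosine-tuning model.

    rates      : (T, n_units) binned firing rates during calibration
    velocities : (T, 2) observed or instructed cursor/arm velocities
    Returns an (n_units, 2) matrix of per-unit direction weights.
    """
    X = np.hstack([velocities, np.ones((velocities.shape[0], 1))])  # add baseline term
    coeffs, *_ = np.linalg.lstsq(X, rates, rcond=None)              # shape (3, n_units)
    return coeffs[:2].T                                             # drop the baseline row

def decode_velocity(rate_bin, pref_dirs, gain=1.0):
    """Map one bin of firing rates to a 2-D velocity command."""
    modulation = rate_bin - rate_bin.mean()      # remove the shared baseline rate
    direction = modulation @ pref_dirs           # weighted vector sum over units
    norm = np.linalg.norm(direction)
    return gain * direction / norm if norm > 0 else np.zeros(2)

# Toy usage with synthetic calibration data (a stand-in for real recordings).
rng = np.random.default_rng(0)
true_dirs = rng.normal(size=(96, 2))             # roughly a 10x10 array of units
vel = rng.normal(size=(500, 2))
rates = vel @ true_dirs.T + 5 + rng.normal(scale=0.5, size=(500, 96))

pref = fit_preferred_directions(rates, vel)
print(decode_velocity(rates[0], pref))           # decoded velocity for one time bin
```

In a working system the calibration pass would use neural activity recorded while the animal or patient attempts or observes movements, and the decoder would run continuously on short time bins to drive the arm in real time.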
"We are now ready to begin testing BCI technology in the patients who might benefit from it the most, namely those who have lost the ability to move their upper limbs due to a spinal cord injury," says Michael Boninger, director of the UPMC Rehabilitation Institute and a senior scientist on the project. "Our ultimate aim is to develop technologies that can give patients with physical disabilities control of assistive devices that will help restore their independence."
This will not be the first human clinical trial of a BCI. In 2004, quadriplegic Matthew Nagle (who was paralyzed from the neck down after being stabbed) became the first person to use a BCI to control a computer mouse cursor. His BrainGate Neural Interface System was eventually removed, and Nagle died in 2007, but before his death he told PBS: "I can't put it into words. It's just--I use my brain. I just thought it. I said, 'Cursor go up to the top right.' And it did, and now I can control it all over the screen."
But while the technology may seem stunning, it is anything but easy work for the patients.
According to Jose del R. Millan and his team at the Ecole Polytechnique Federale de Lausanne in Switzerland, in a "typical brain-computer interface (BCI) set-up" users send mental messages of either left, right, or no-command. "But it turns out that no-command is very taxing to maintain and requires extreme concentration. After about an hour, most users are spent. Not much help if you need to manoeuvre that wheelchair through an airport," his team said in a statement.
So researchers are now working on methods that let the machine itself interpret a user's brain signals and read their intent.
Users are asked to read or speak aloud while issuing as many left, right or no-command thoughts as possible. The system learns to sift through the fray and work out when a command has actually been delivered. The result makes multitasking a reality while also letting users catch a break.
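The article does not spell out Millan's pipeline, but the recipe it describes, collecting labelled examples of "left", "right" and "no-command" activity and training a classifier to spot genuine commands amid everything else the user is doing, can be sketched roughly as follows. The band-power features, the linear discriminant classifier and the confidence threshold are assumptions made for illustration, not the lab's actual design.

```python
# Rough sketch of the "learn to sift through the fray" idea: label short EEG
# windows as LEFT, RIGHT, or NO_COMMAND, train a simple classifier on
# band-power features, and only act when the classifier is confident.
# Feature choice, classifier, and threshold are illustrative assumptions.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

LEFT, RIGHT, NO_COMMAND = 0, 1, 2

def band_power_features(window, fs=256):
    """Average 8-30 Hz power per channel for one EEG window.

    window : (n_channels, n_samples) array of raw EEG.
    """
    spectrum = np.abs(np.fft.rfft(window, axis=1)) ** 2
    freqs = np.fft.rfftfreq(window.shape[1], d=1.0 / fs)
    band = (freqs >= 8) & (freqs <= 30)   # mu/beta rhythms, common in motor-imagery BCIs
    return spectrum[:, band].mean(axis=1)

# Toy calibration set: 300 windows of 16-channel EEG recorded while the user
# produces left/right imagery or simply reads aloud (no-command). Random data
# stands in for real recordings here.
rng = np.random.default_rng(1)
windows = rng.normal(size=(300, 16, 512))
labels = rng.integers(0, 3, size=300)

X = np.array([band_power_features(w) for w in windows])
clf = LinearDiscriminantAnalysis().fit(X, labels)

def interpret(window, threshold=0.8):
    """Return LEFT/RIGHT only when the classifier is confident; otherwise stay idle."""
    probs = clf.predict_proba(band_power_features(window)[None, :])[0]
    best = int(np.argmax(probs))
    return best if (best != NO_COMMAND and probs[best] >= threshold) else NO_COMMAND

print(interpret(windows[0]))   # 0, 1, or 2 for this toy data
```

The confidence threshold is what lets the user "catch a break": unless the classifier is sure it has seen a deliberate command, the system defaults to doing nothing, so ordinary mental activity such as reading or talking does not move the wheelchair or cursor.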