“Index… ring… pinky… index… middle…”
Nathan Copeland is telling a researcher which of his fingers he feels a touch on. But the researcher is touching a robotic hand, not Copeland's own, which hasn't felt a thing in over a decade.
In this “proof of principle” experiment, a man whose spinal injury removed all sensation from his limbs was able to “feel” pressure on several robotic digits connected directly to his brain. It’s a long way from a cybernetic hand, but it opens up the possibility of providing one to even more of the people who need it.
Now, two caveats: first, this isn’t the first time a robotic hand has sent sensations to a user’s brain; that’s been happening for a while, and depends on how you define those things. Second, it’s important to note that as cool as this all sounds, it’s still unbelievably crude compared with the subtlety and intricacy of the nervous system — we’re nowhere near that level of control, or even understanding of it.
That said, this is still important research because it skips a step many other prosthetics rely on: the peripheral nervous system. If you need to send signals from a replacement hand, you can often plug in further up the arm, tapping in where those signals would have gone anyway. But with a spinal injury, those signals never reach the brain, so the approach doesn’t work.
What Robert Gaunt and his team at the University of Pittsburgh have done is essentially plug the robotic arm directly into the brain, bypassing the intermediary nerves and spinal cord altogether.
Copeland was in an accident 12 years ago that left him quadriplegic. But 16 years of operating his limbs means he remembers what it feels like when his hand is touched — and that means his brain remembers, too.
So the researchers had Copeland concentrate on the feeling of having different fingers touched, and tracked the brain activity associated with that feeling. Then they surgically implanted four sets of fingertip-sized microelectrode arrays into Copeland’s sensory cortex, where those feelings were centered.
Over the next few months the team repeatedly stimulated those areas of the brain, finding the patterns and locations that produced the sensation of being touched on the index finger, ring finger, and so on. Eventually Copeland was hooked up to a robotic hand, each finger of which corresponded to the circuit in his brain.
He got 85 percent right at first, then later nearly 100 percent. This is highly validating, although everyone involved will tell you how early this is.
“The ultimate goal is to create a system which moves and feels just like a natural arm would,” Gaunt said in a University of Pittsburgh news release. “We have a long way to go to get there, but this is a great start.”
For one thing, the sensation needs to be evened out — “sometimes it feels electrical and sometimes it’s pressure, but for the most part, I can tell most of the fingers with definite precision,” Copeland said. The many gradations and types of touch are still miles off.
This is also a one-way street: no data is being passed from the brain to the arm. Control methods would rely on completely different neural circuitry, in the motor cortex; it’s a whole different field of research. But this kind of feedback, going straight from the prosthetic to the brain, is important for intuitive controls that allow a user to grip and manipulate things in a natural way.
The team’s work is published in the journal Science Translational Medicine. It was funded through DARPA, the US Department of Veterans Affairs, and several other grants.
Featured Image: University of Pittsburgh
Complete story at source: TechCrunch