
Building a Better Bionic Arm by Teaching the Brain a New Signal


Jason Koger, a father of three from Owensboro, Kentucky, uses a pair of high-tech bionic hands.  Both his arms were amputated below the elbow after an ATV accident seven years ago. (Danny Beeler/Heart of the City Design)

Researchers have made dramatic improvements in artificial limbs in the last few years. Bionic hands, for example, are now dexterous enough to pick up a quarter.

But for prosthetics to move more naturally, they need both good engineering and a way to communicate with the brain. That’s something scientists at UC San Francisco are working on, and their success hinges on the brain’s ability to make new connections.

Jason Koger lost both his arms in an ATV accident seven years ago. A father of three from Owensboro, Kentucky, Koger doesn’t mind showing off the high-tech bionic hands he wears now. With them, he’s able to open soda cans and work gadgets that have pressure-sensitive touchscreens – but a few things are missing.

“I have a three year old, a little boy,” Koger says, “and I can hold his hand, and we’re fixing to cross the street. But then all of the sudden I kind of look around and I look back and he’s not even holding my hand, and I thought he still was.”

It’s not just touch that’s absent. Koger has a hard time sensing where his bionic hands are when he can’t see them. If he wants to thread a belt through the loops in the back of his blue jeans, for example, he has to do it before putting them on. The bionic hands lack a sense called proprioception.


The Body’s Sense of Spatial Awareness

Though Koger owns a pair of high-tech bionic hands, he sometimes opts for old-fashioned hooks, like when he’s planning to work outside or when he expects rain. (Danny Beeler/Heart of the City Design)
Koger tosses a pitch to his daughter while wearing his hooks. (Danny Beeler/Heart of the City Design)

The human body has a built-in sense for where its various parts are in space, explains Philip Sabes, a professor and neurophysiologist at UC San Francisco.

“There are sensors that live in your muscles, in your joints, even in your tendons,” Sabes says, “that tell your brain about where your body is.”

This is why you can touch your nose even with your eyes closed; proprioception tells you where your nose is with respect to your hand.

As you move, proprioception gives the brain feedback, and this feedback lets you refine your movement, moment by moment. But amputees controlling a prosthetic have to do without this sense. Generally, they rely on sight instead, which isn’t as fast.

“Visual feedback is much slower to get to the parts of the brain that control movement than proprioception is,” Sabes says.

Sabes’ research is backed by the Department of Defense research agency, DARPA. There are hundreds of thousands of amputees in the U.S., including some 1,500 veterans of the wars in Iraq and Afghanistan.

Sabes’ Experiment

Sabes is researching how an artificial limb might send useful feedback to the brain of an amputee. He asked: if the signal were artificial, could the brain figure out what it meant and put it to use?

Sabes began with a monkey trained to move his hands in response to a dot on a screen, a visual signal. Then Sabes added a second signal, delivered directly to the brain.

The signal came from a tiny chip, surgically implanted just inside the monkey’s skull. It delivered tiny fractions of a volt to the motor cortex — an amount of electricity so small, Sabes says, you probably wouldn’t feel it on your hand.

Philip Sabes is a professor and neurophysiologist at UC San Francisco, where he researches brain science and engineering in hopes of developing a better connection between man and machine. (Cindy Chew/UCSF)

“And so that signaled to the monkey,” Sabes says, “that maybe there was something lightly touching his arm or his elbow or his shoulder.”

The researchers paired this stimulation with the visual signals the monkey was already used to reading. Sabes says that initially, it didn’t make much difference.

“Over days and weeks,” he says, “we noticed that the monkey started to perform better and better when he had both cues.”

The monkey’s brain was incorporating the new signal — putting it to use, so he had to rely less on visual cues alone.

“After a few weeks, he was so good we actually let him do the task with just stimulation alone,” Sabes says. “So then he was able to sit in a dark room with no visual feedback; nonetheless he could reach from one target to the next to the next, because he could feel where those targets were, via the stimulation.”

What It Means

The results show that the primate brain, given an arbitrary signal, can adapt to interpret it as a kind of faux proprioception. Knowing the brain can do this could someday lead to giving amputees a sense of where their artificial limbs are in space. Such feedback signals could be delivered, Sabes says, via either the nerves or small implants in the brain.

So how long until this kind of technology is far enough along that amputees can control their prosthetics as naturally as real limbs?

“I’m going to be bullish,” Sabes says. “Fifteen to 20 years until it’s as natural as normal movement.”
