By Kyle Niemeyer
Imagine a robotic prosthetic arm that you can not only control with your brain but also feel when it touches something. This might sound like science fiction, but a team of researchers has designed and tested a system in monkeys that does just that. They call their setup a "brain-machine-brain interface," or BMBI, and it has the potential to give amputees closer-to-normal functionality.
Brain-machine interfaces (BMIs) have come a long way in recent years, enabling control of complex robotic limbs with multiple degrees of freedom, but people rely on tactile feedback for fine control of their limbs. Try to imagine picking up something as simple as a glass without being able to feel when your fingers are around it: awkward and difficult. Unfortunately, this is one area where there has been less progress. One group used vibrational feedback to signal contact, but otherwise most BMI systems have relied on sight alone, until now.
The BMI portion of this new approach is similar to one we reported on a few years ago. The researchers implanted microelectrodes in the primary motor cortex (also known as M1) of a monkey's brain. M1 is the region of the brain responsible for voluntary movement, so by recording electrical signals at particular locations, the interface can translate that activity directly into control of a robotic limb.
There weren't any robotic monkey arms here, though; the monkeys controlled a virtual arm on a computer monitor. Initially, this was done with a joystick (which they learned to use through fruit juice rewards). Once it was clear that the monkeys could use the joystick to find virtual objects, the researchers switched control of the virtual arm to the BMI: the monkeys still moved the joystick, but their brain signals actually moved the arm. The joystick was necessary because monkeys, unlike most humans, don't respond well to being told to move their arms in a certain way.
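The article doesn't spell out how the recorded M1 activity becomes arm movement, but BMIs of this kind typically pass measured firing rates through a decoder trained to predict intended motion. The sketch below is purely illustrative, assuming a simple linear decoder with made-up channel counts and weights; it is not the researchers' actual method.

import numpy as np

# Illustrative sketch only: a linear decoder that maps recorded neural
# firing rates to a 2D velocity for a virtual arm. The weights W and
# bias b are hypothetical placeholders; in a real BMI they would be fit
# from training data (for example, the joystick sessions).
rng = np.random.default_rng(0)
n_channels = 96                       # hypothetical electrode count
W = rng.normal(size=(2, n_channels))  # decoder weights (vx, vy per channel)
b = np.zeros(2)                       # decoder bias

def decode_velocity(firing_rates):
    # firing_rates: per-channel rates in spikes/s for one time bin
    return W @ firing_rates + b

# Example: convert one 100 ms bin of spike counts to rates and decode
spike_counts = rng.poisson(lam=5, size=n_channels)
vx, vy = decode_velocity(spike_counts / 0.1)
print(f"decoded arm velocity: ({vx:.2f}, {vy:.2f})")

In practice the decoder would be calibrated while the monkey used the joystick, then left to drive the virtual arm from brain activity alone.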
More: http://arstechnica.com/science/news/2011/10/researchers-devise-brain-machine-interface-with-a-sense-of-touch.ars