TIME  |  Monday, June 4, 2001  |  Technology


Brain Power
In the brave new world of technology, we may soon be able to control distant objects just by thought

Wil McIntyre for TIME.
Nicolelis trained monkeys to move this robotic arm via brain waves.

Uri Geller claims he can bend spoons simply by using his brain. In the South Korean city of Taejon, people are learning a very similar trick—though they need a computer between themselves and the cutlery. Here's how the experimental software works. You don an electrode-studded cap that monitors brain waves and sends data to a computer that displays a virtual spoon. Different types of mental activity produce distinct signals in the brain, and the computer can discern, in a crude way, what's going on inside your head. To make the spoon bend, you have to relax. When the computer detects signals from a calm brain, the spoon begins to wilt.
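The loop the article describes can be sketched in a few lines of Python. Everything here is an illustrative stand-in: a real system would compute a calm/active measure from EEG band power, not from a pre-made list of numbers, and the threshold and step sizes are invented.

```python
# Sketch of the Taejon-style "spoon bending" loop: the computer watches a
# scalar measure of mental activity and bends a virtual spoon only while
# the signal looks like a calm, relaxed brain. All constants are made up.

def bend_angle(samples, calm_threshold=0.3, step=5.0, max_angle=90.0):
    """Return the spoon's bend angle after processing a stream of samples.

    Each sample is a crude 0..1 'mental activity' level (0 = deeply calm).
    The spoon wilts a few degrees for every calm reading and stops
    bending further whenever activity rises above the threshold.
    """
    angle = 0.0
    for activity in samples:
        if activity < calm_threshold:        # calm brain detected
            angle = min(angle + step, max_angle)
        # agitated readings leave the spoon where it is
    return angle

# A volunteer who settles down mid-session: the first readings are busy,
# the later ones calm, so the spoon only starts wilting partway through.
readings = [0.8, 0.7, 0.2, 0.1, 0.15, 0.25]
print(bend_angle(readings))  # four calm readings -> 20.0 degrees
```

The point of the sketch is how crude the discrimination is: the software only needs to tell "calm" from "not calm", which is why relaxing, rather than concentrating on the spoon, is what makes it bend.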

What's going on in Taejon, at the Korea Research Institute, is a very basic example of what could be the most interactive technology of the future: brain-computer interfaces. Early computers were controlled by cardboard punch cards; the first PCs demanded typed DOS commands; the mouse gave us a graphic interface. Will we one day be able to enter the world of computing with no external mechanical intermediary whatsoever—in other words, just by thinking? Researchers around the globe are working on the problem. The Joint Research Centre of the European Commission, for example, has developed the Adaptive Brain Interface, a helmet and software program (like the one in Korea) intended to allow disabled people to operate appliances using thought commands. At the British government's Defence Evaluation and Research Agency, in Farnborough, the same techniques are helping fighter pilots fly jets with their minds. But the place where brains and computers are truly coming together is in the lab of Miguel Nicolelis, associate professor of neurobiology at the Duke University Medical Center in North Carolina. He has trained two owl monkeys to control a robotic arm via brain signals—giving glimpses of how the virtual and physical worlds may merge.

Working with colleagues at Duke, M.I.T.'s Laboratory for Human and Machine Haptics (also known as the Touch Lab) and the State University of New York Health Science Center, Nicolelis implanted electrodes into the sections of the monkeys' brains in which the planning and execution of arm movements take place. When the brain instructs the body to make a motion, it fires off electric signals well before any action actually takes place; in other words, the body lags slightly behind the brain's intention to act. In effect, the brain warms up for an impending movement by directing specific clusters of neurons to fire, just as you might warm up your car's engine by pumping the gas pedal.

Nicolelis and his colleagues monitored the monkeys' brain signals as they warmed up for various tasks, like reaching for food, and isolated the signals that preceded the movements. Then they routed the monkeys' brain signals through a computer. As a monkey started to grasp for food, the computer picked up the neural traffic and forwarded it to a robotic arm called the Phantom. When the monkey extended its arm, the Phantom, using the neural signals from the monkey, precisely mimicked the action. Nicolelis even transmitted the brain signals over the Internet to the Touch Lab in Cambridge, Massachusetts, so the monkey's neural commands operated another Phantom 965 km away.
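The pipeline just described, monitor neural firing, decode an intended arm position, forward it to a local or remote Phantom, can be caricatured in a few lines. The linear readout and the serialization step below are hypothetical simplifications: the actual study fit its decoder to recordings from many neurons at once, and nothing here reproduces the lab's real weights or wire format.

```python
# Minimal sketch of the decode-and-forward idea: firing rates from a few
# recorded neurons are mapped to a 3-D hand position by a linear decoder,
# and the result is serialized as it might be for transmission to a
# remote robot arm. All rates and weights are invented for illustration.

def decode_position(firing_rates, weights, baseline):
    """Linear readout: position[i] = baseline[i] + sum_j weights[i][j] * rate[j]."""
    return [
        baseline[i] + sum(w * r for w, r in zip(row, firing_rates))
        for i, row in enumerate(weights)
    ]

def to_packet(position):
    """Serialize a decoded position as bytes, e.g. for a socket to the arm."""
    return ",".join(f"{coord:.2f}" for coord in position).encode()

rates = [12.0, 8.0, 4.0]                  # spikes/sec from three cells
weights = [[0.25, 0.0, 0.5],              # one row per output axis (x, y, z)
           [0.0, 0.5, 0.5],
           [0.125, 0.25, 0.0]]
baseline = [0.0, 0.0, 1.0]

pos = decode_position(rates, weights, baseline)
print(to_packet(pos))  # b'5.00,6.00,4.50'
```

Once the decoded position is just a small packet of numbers, it makes no difference to the arm whether that packet travels a metre of cable in Durham or 965 km of Internet to Cambridge, which is exactly what the remote Phantom experiment demonstrated.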

Nicolelis is convinced that this system will work for humans—an interface that might allow paralyzed people, who generally have healthy limbs that they are unable to use due to spinal cord damage (which prevents brain signals from reaching their limbs), to control their own biological limbs. It could also give people extended senses, allowing them to have virtual limbs in cyberspace or robotic limbs in the physical world. "The brain knows that it has an arm and a hand because it is connected to these things and gets feedback from them," Nicolelis says. "The same could be true for robotic or virtual appendages. If you control a remote hand that senses objects and sends tactile sensations back to your brain, it behaves as if it's your own hand. It becomes part of you. Your body becomes extended beyond the surface of your skin."

What Nicolelis is describing is a reverse phantom limb. Instead of continuing to feel the presence of a limb that is no longer there, people equipped with a brain-computer interface could operate new appendages, and the brain would eventually come to regard these as its own. But what could a person do with a remote robotic or virtual limb? The possibilities range from the mundane to the otherworldly. In the virtual realm, these appendages would dispense with the bulky technology of conventional haptics and allow Web shoppers to squeeze a peach online to see if it's ripe. Video conferences and chats might start with actual handshakes. And of course, there's sex. Consenting adults could use the technology to engage in far more intimate embraces and manipulations. In the realm of robotics, devices could be sent to dangerous or inhospitable climes, like deep-sea thermal vents or the craters of active volcanoes.

For the human brain to truly incorporate prosthetics into its body map requires feedback: the brain will only become aware of its new limbs if they make their presence known. To see how the monkeys might respond to this kind of anatomical extension, Nicolelis is creating a feedback loop between the monkeys and the robotic arm. In the next experiments the monkeys will have sensors attached to their bodies, so that the robotic arm delivers tactile sensations directly to their skin. When the monkey's brain waves impel the robotic arm to grasp a piece of fruit, for example, the animal will be able to feel the fruit's texture. The monkeys will also be able to watch the robotic arm in action on a computer screen. This kind of tactile and visual feedback, Nicolelis hopes, will teach the monkeys to associate the arm's movements with their thoughts. Once they make that link, they might not take the trouble to stretch out their arms anymore. Why bother when a mere thought will move the robot arm?
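The closed loop planned here, brain signal out, tactile signal back, is the key structural change from the first experiments. A toy event loop makes its shape clear; the arm, the contact detection and the skin stimulator below are invented placeholders standing in for hardware the text only describes in outline.

```python
# Toy closed-loop controller: decoded intent drives the arm, and any
# contact the arm makes is routed back to the animal as a tactile cue.
# FakeArm and FakeStimulator are stand-ins, not the lab's apparatus.

class FeedbackLoop:
    def __init__(self, arm, stimulator):
        self.arm = arm                 # accepts a target, reports any contact
        self.stimulator = stimulator   # delivers tactile sensation to the skin

    def step(self, decoded_target):
        contact = self.arm.move_to(decoded_target)
        if contact is not None:
            # Close the loop: the brain can only adopt the arm as "its own"
            # if touching something produces a felt consequence.
            self.stimulator.deliver(contact)
        return contact

class FakeArm:
    def move_to(self, target):
        # Pretend anything reached past 1.0 touches a smooth-skinned fruit.
        return "smooth" if target > 1.0 else None

class FakeStimulator:
    def __init__(self):
        self.felt = []
    def deliver(self, texture):
        self.felt.append(texture)

loop = FeedbackLoop(FakeArm(), FakeStimulator())
loop.step(0.5)   # arm moves, touches nothing, no sensation delivered
loop.step(1.5)   # arm contacts the fruit; texture is fed back to the skin
print(loop.stimulator.felt)  # ['smooth']
```

The open-loop version of the experiment is this same code with the `deliver` call removed: the arm still moves, but the monkey has no felt reason to treat its movements as its own.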

So far, there is no way to tap into the brain without dramatically invasive surgery, so human experimentation is unlikely. And there's an intriguing risk in the realm of brain-computer interfaces. What would happen if the process were reversed? The signals that are routed from the monkey's brain through the computer to control the robotic arm could be sent back to the monkey—to control its behavior. Implants in humans would face strong opposition unless the possibility of this kind of mind control could be eliminated, which so far seems impossible to achieve.

Nicolelis is confident that a technological breakthrough will come, perhaps in the form of some kind of permanent intracranial implant, and that the ethical issues surrounding the technology will be resolved. It will probably be a long time before our brains merge with our computers. When that day does come, however, our bodies will still be our own, but they could well have more than just two arms and two legs.

© James Geary 2001. This is an edited excerpt from Geary's book The Body Electric, which will be published by Weidenfeld & Nicolson in the fall

Related Sites
Miguel Nicolelis
Extropy Institute



Copyright © 2001 Time Inc. All rights reserved.
Reproduction in whole or in part without permission is prohibited.