Glancing at a stereo and turning it on with a thought
may have once been science fiction, but inside a virtual world at the University
of Rochester, people are listening to music by simply wishing it so. Outfitted
with a virtual reality helmet and a computer program adept at recognizing
key brain signals, volunteers use their thoughts to take actions like those
of any apartment dweller: turning on the television or the stereo, for instance.
The line of research, which links a brain and computer in a near real-world
environment, may someday allow patients with extreme paralysis to regain
some control of their surroundings, say the project's developers, and could
eventually eliminate keyboards and computer mice as the go-betweens connecting
our thoughts and the actions we wish to see in our environment.

While several teams around the world are working on brain-computer
interfaces (BCIs), computer science graduate student Jessica Bayliss is
the first to show that detection of the brain's weak electrical signals
is possible in a busy environment filled with activity. She has shown that
volunteers who don a virtual reality helmet in her lab can control elements
in a virtual world, including turning lights on and off and bringing a
mock-up of a car to a stop by thought alone. Though all this is currently
taking place only in virtual reality, the team is confident that the technology
will make the jump to the "real world" and should soon enable
people to look around a real apartment and take control in a way they couldn't
before.

"This is a remarkable feat of engineering,"
says Dana Ballard, professor of computer science and Bayliss' adviser.
"She's managed to separate out the tiny brain signals from all the
electric noise of the virtual reality gear. We usually try to read brain
signals in a pristine, quiet environment, but a real environment isn't
so quiet. Jessica has found a way to effectively cut through the interference."
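
The release does not say how Bayliss filters out that interference, but the
general idea can be sketched. The fragment below is a minimal illustration,
assuming a hypothetical 250 Hz sampling rate, a single-channel signal named
raw_eeg, and an arbitrary 100-microvolt rejection threshold; none of these
details come from the Rochester work.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250  # assumed sampling rate in Hz (not given in the release)

def clean_eeg(raw_eeg, low_hz=0.5, high_hz=30.0, reject_uv=100.0):
    """Band-pass filter one EEG channel and flag clean samples.

    raw_eeg is a 1-D array of samples in microvolts (hypothetical
    input). Returns the filtered signal plus a boolean mask marking
    samples below a crude amplitude threshold, one simple way to
    reject interference such as that radiated by VR headset hardware.
    """
    nyq = FS / 2.0
    b, a = butter(4, [low_hz / nyq, high_hz / nyq], btype="band")
    filtered = filtfilt(b, a, raw_eeg)     # zero-phase band-pass filter
    usable = np.abs(filtered) < reject_uv  # amplitude-based artifact rejection
    return filtered, usable
```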

The National Institutes of Health is supporting Bayliss'
research because it may someday give back some control to those who have
lost the ability to move. A person so paralyzed that he or she is unable
even to speak may be able to communicate once again if this technology
can be perfected, explains Bayliss. By merely looking at the telephone,
television or thermostat and wishing it to be used, a person with disabilities
could call a friend or turn up the heat on a chilly day. Bayliss hopes
that someday such people may even be able to operate a wheelchair by themselves
simply by thinking their commands.

"Virtual reality is a safe testing ground,"
says Bayliss. "We can see what works and what doesn't without the
danger of driving a wheelchair into a wall. We can learn how brain interfaces
will work in the real world, instead of how they work when someone is just
looking at test patterns and letters. The brain normally interacts with
a 3-D world, so I want to see if it gives off different signals when dealing
with a 3-D world than with a chart."

The brain signal Bayliss listens for is called the "P300
evoked potential." It's not a specific signal that could be translated
as "Aunt Nora" or "stop at the red light," but rather
a sign of recognition, more like "That's it!"

"It's as if each neuron is a single person who's
talking," explains Bayliss. "If there's just one person, then
it's easy to hear what he's saying, but the brain has billions of neurons,
so imagine a room full of a billion people all talking at once. You can't
pick out one person's voice, but if everyone suddenly cheers or oohs or
aahs, you can hear it. That's what we listen for, when several neurons
suddenly say 'that's it!' "
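
That analogy maps onto the standard way a P300 is pulled out of EEG noise:
averaging many short signal segments time-locked to the stimulus, so that
random background chatter cancels while the shared response remains. Here
is a minimal sketch of that averaging step, again assuming an illustrative
sampling rate and input arrays rather than anything specific to Bayliss'
system.

```python
import numpy as np

FS = 250  # assumed sampling rate in Hz

def average_p300(eeg, stim_samples, pre=0.1, post=0.6):
    """Average EEG epochs time-locked to stimulus onsets.

    eeg is a 1-D array of filtered samples; stim_samples holds the
    sample indices where a flash occurred (both hypothetical inputs).
    Uncorrelated background activity averages toward zero across
    epochs, while the stimulus-locked P300, a positive peak roughly
    300 ms after the stimulus, survives: the crowd's sudden cheer.
    """
    n_pre, n_post = int(pre * FS), int(post * FS)
    epochs = [eeg[s - n_pre : s + n_post]
              for s in stim_samples
              if s - n_pre >= 0 and s + n_post <= len(eeg)]
    return np.mean(epochs, axis=0)  # one averaged waveform across trials
```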

Bayliss looks for this signal to occur in sync with a
light flashing on the television or stereo. If the rhythm matches the blinks
of the stereo light, for instance, the computer knows the person is concentrating
on the stereo and turns it on. A person doesn't even have to look directly
at the stereo; as long as the object is in the field of view, it can be
controlled by the person's brain signals. Since it's not necessary to move
even the eyes, this system could work for paralysis patients who are completely
"locked in," a state where even eye blinks or movement are impossible.

The virtual apartment in which volunteers have been turning
appliances on and off is modeled after Bayliss' own. Such a simple, virtual
world is the first step toward developing a way to accurately control the
real world. Once Bayliss has perfected the computer's ability to determine
what a person is looking at in the virtual room, the next hurdle will be
to devise a system that can tell what object a person is looking at in
the real world. BCI groups are also close to surmounting another obstacle:
attaching the sensors to the head. Right now dozens of electrodes must
be attached to the scalp one at a time with a gooey gel, but Bayliss says
dry sensors are just around the corner, and simple slip-on head caps should
not be far behind.

"One place such an interface may be very useful
is in wearable computers," Ballard says. "With the roving eye
as a mouse and the P300 wave as a mouse-click, small computers that you
wear as glasses may be more promising than ever."

BCIs are divided into two categories: biofeedback and
stimulus-response. Bayliss uses the latter approach, which simply measures
the brain's response to an event. Biofeedback is a method in which a person
learns to control some aspect of his or her body, such as relaxing, so that
the resulting change in the brain can be detected. Though many BCI groups
use this approach, Bayliss decided against it because people must be trained,
sometimes for a year or more, and not everyone can learn to control their
thought patterns accurately.

Bayliss and Ballard work in the University's National
Resource Laboratory for the Study of Brain and Behavior, which brings together
computer scientists, cognitive scientists, visual scientists, and neurologists
to study neural functions in complex settings. The laboratory's research
combines tools that mimic real-world sensations, such as virtual reality
driving simulators and gloves that simulate the feel of virtual objects,
with sensory trackers that measure eye, head, and finger movements. Recently
the lab added virtual people, robot-like actors with which volunteers can
interact in a limited way.

So in the future will we all be wearing little caps that
will let us open doors, channel surf and drive the car on a whim? "Not
likely," Bayliss says. "Anything you can do with your brain can
be done a lot faster, cheaper and easier with a finger and a remote control."

Editor's Note: The original news release can be found
at http://www.rochester.edu/pr/releases/cs/bayliss.html

Note: This story has been adapted from a news release
issued by University Of Rochester for journalists and other members of
the public. If you wish to quote from any part of this story, please credit
University Of Rochester as the original source. You may also wish to
include the following link in any citation: http://www.sciencedaily.com/releases/2000/05/000503180714.htm