Quadcopter control in three-dimensional space using a noninvasive motor imagery-based brain–computer interface
In a jaw-dropping feat of engineering, electronics turn a person's thoughts into commands for a robot. Using a brain-computer interface technology pioneered by University of Minnesota biomedical engineering professor Bin He, several young people have learned to use their thoughts to steer a flying robot around a gym, making it turn, rise, dip, and even sail through a ring.
The technology may someday allow people robbed of speech and mobility by neurodegenerative diseases to regain function by controlling artificial limbs, wheelchairs, or other devices. And it's completely noninvasive: brain waves are picked up by the electrodes of an EEG cap worn on the scalp, not by a chip implanted in the brain.
The development of BCIs is aimed at giving users the ability to communicate with the external world through the modulation of thought. This is achieved through a closed loop of sensing, processing, and actuation. Bioelectric signals are sensed and digitized before being passed to a computer system. The computer then interprets fluctuations in the signals, drawing on an understanding of the underlying neurophysiology, to discern user intent from the changing signal. The final step is the actuation of this intent, in which it is translated into specific commands for a computer or robotic system to execute. The user then receives feedback, adjusts his or her thoughts accordingly, and generates new, adapted signals for the BCI system to interpret.
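The closed loop described above can be sketched in code. The following is a minimal, illustrative Python example, not the actual pipeline used in the study: the channel layout, the mu-band (8–12 Hz) power feature, and the simple left-versus-right comparison rule are all assumptions made for the sake of the sketch. It relies on the classic motor-imagery effect that imagining movement of one hand suppresses mu-rhythm power over the opposite motor cortex.

```python
import numpy as np

FS = 250  # assumed EEG sampling rate in Hz

def band_power(signal, fs, low, high):
    """Estimate power in a frequency band via a simple FFT periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].sum()

def decode_intent(left_ch, right_ch, fs=FS):
    """Map lateralized mu-band (8-12 Hz) desynchronization to a command.

    Imagining right-hand movement suppresses mu power over the LEFT
    motor cortex (and vice versa), so we compare the two hemispheres.
    The command names are hypothetical.
    """
    mu_left = band_power(left_ch, fs, 8, 12)
    mu_right = band_power(right_ch, fs, 8, 12)
    if mu_left < mu_right:  # left hemisphere suppressed -> right-hand imagery
        return "TURN_RIGHT"
    return "TURN_LEFT"

# Simulated closed-loop step: one second of synthetic 10 Hz mu rhythm,
# weaker over the left hemisphere (as if the user imagined right-hand movement).
t = np.arange(FS) / FS
rng = np.random.default_rng(0)
left = 0.2 * np.sin(2 * np.pi * 10 * t) + 0.05 * rng.standard_normal(FS)
right = 1.0 * np.sin(2 * np.pi * 10 * t) + 0.05 * rng.standard_normal(FS)
command = decode_intent(left, right)  # -> "TURN_RIGHT"
```

In a real system this decode step would run continuously on streaming data, the command would drive the robot, and the user, watching the robot respond, would adapt his or her imagery to close the feedback loop.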