Brain-Machine Interface: Creating Mind-Controlled Robots

March 13th, 2012


In the United States alone, roughly a quarter of a million people live with spinal cord injury, and more than 10,000 new injuries occur each year, many resulting in paraplegia or quadriplegia. Spinal cord injuries can be completely debilitating and can happen when least expected. I remember from high school that a hockey player from a nearby town was pushed head first into the boards one night during a game and sustained a severe neck injury, permanently impairing his motor skills and changing the course of his life.

Stories like that are hard to hear, but there may be new hope for people affected by these life-altering injuries. It may not yet be realistic to restore full limb use to a quadriplegic patient, but what if there were ways to build entirely new limbs? I am not talking about the average prosthetic limb, but one that moves on command, through a patient’s thoughts. Moving objects by thought alone used to be the stuff of science fiction; now the idea is not so far-fetched. Through recent work in computational and cognitive neuroscience, great strides are being made in using the brain’s electrical activity, recorded non-invasively through a brain-machine interface (BMI), to control semi-autonomous robots that can do anything from picking up a cup of coffee to checking your email.

The potential of this research is enormous, ranging from medical to military applications, and the mechanism by which someone can simply think about a motor task and have a robot perform it is fascinating. Current research primarily takes advantage of the electrical activity that neurons produce during normal brain function. Electroencephalography (EEG) measures the voltage changes that arise from ionic currents flowing within neurons in a given region of the brain, using electrodes attached to the scalp, so the recordings are entirely non-invasive. With good temporal resolution and decent spatial resolution, EEG can record from an area such as the motor cortex, which lies in the precentral gyrus of the frontal lobe just anterior to the central sulcus, and those recordings can then be run through algorithms that decode the neural activity. In this way the electrical activity is turned into commands that a robot or prosthetic limb can carry out without the patient having to move a muscle.

Besides EEG, researchers are also exploring other ways of measuring neural output, such as electrocorticographic (ECoG) signals and local field potentials (LFPs), which differ in their spatial and temporal resolution. Depending on the exact goal of the brain-machine interface and the specific needs of the patient, it may be more important to record from a precise area of the cortical homunculus, or to decode many commands over a short period of time. These needs call for recording methods that can extract neural activity from precise groups of neurons over a given time window.
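To make the decoding step concrete, here is a minimal sketch in Python of how band power from two motor-cortex EEG channels might be turned into robot commands. This is an illustration under stated assumptions, not any particular lab’s pipeline: the synthetic data, the threshold, the left/right channel roles, and the command names are all hypothetical.

```python
# Hypothetical sketch of an EEG-based BMI decoding step (illustrative only).
import numpy as np
from scipy.signal import welch

FS = 256                 # sampling rate in Hz (typical for scalp EEG)
MU_BAND = (8.0, 12.0)    # mu rhythm band over the motor cortex

def mu_band_power(eeg_window, fs=FS):
    """Average power in the mu band for one channel of EEG."""
    freqs, psd = welch(eeg_window, fs=fs, nperseg=fs)
    mask = (freqs >= MU_BAND[0]) & (freqs <= MU_BAND[1])
    return psd[mask].mean()

def decode_command(left_channel, right_channel, threshold=1.0):
    """Map mu-power desynchronization on two channels to a command.

    Imagined right-hand movement suppresses mu power over the left
    motor cortex (roughly electrode C3), and vice versa. The threshold
    and command names are made up for this example.
    """
    left_power = mu_band_power(left_channel)
    right_power = mu_band_power(right_channel)
    if left_power < threshold and left_power < right_power:
        return "MOVE_RIGHT_ARM"
    if right_power < threshold and right_power < left_power:
        return "MOVE_LEFT_ARM"
    return "REST"

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.arange(2 * FS) / FS                 # two seconds of samples
    # Simulate imagined right-hand movement: mu rhythm suppressed on the
    # left-hemisphere channel, still strong on the right.
    left = 0.1 * rng.standard_normal(t.size)   # desynchronized
    right = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
    print(decode_command(left, right))         # prints MOVE_RIGHT_ARM
```

The one piece of real physiology the sketch borrows is that imagined movement of one hand suppresses the mu rhythm (8–12 Hz) over the opposite motor cortex, and that relative drop in band power is a decodable signal.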

This research is happening right here at Boston University, in the Neuromorphics Laboratory and the Center for Computational Neuroscience. Neuromorphics refers to technology whose form is modeled on the architecture of the brain. In this lab, founded in 2010 by Massimiliano Versace, Ph.D., the overarching goal is to model whole-brain systems and implement them in hardware using innovative neural chips and mobile robotic platforms. In this way the mind (whole-brain systems), brain (neural chip), and body (robot) run in unison to create an intelligent agent that can perform motor and cognitive tasks, including learning and memory. One of the most remarkable aspects of this work is that the algorithms developed to decode neural activity employ a two-way co-adaptation paradigm that allows the patient and the prosthetic limb or robot to learn from one another: as movements are practiced, both the subject and the robot improve their performance. These brain-machine interfaces are continually being tested with different kinds of robots, and the BU Neuromorphics Lab has already gained publicity for its work on this cutting-edge technology, having been featured in BU Today just last week.
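To give a feel for the two-way co-adaptation idea, here is a toy Python simulation in which a linear decoder and a simulated user adapt to each other over repeated trials. Everything here is an assumed stand-in (a linear encoder for the brain, least-mean-squares updates, a 2-D cursor task); it is not the Neuromorphics Lab’s published algorithm.

```python
# Toy simulation of two-way co-adaptation between a user and a decoder
# (illustrative assumptions throughout; not a published algorithm).
import numpy as np

rng = np.random.default_rng(42)
n_features = 8  # pretend number of recorded neural features

# A 2-D "cursor" task with four intended movement directions.
targets = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])

user_encoder = rng.standard_normal((n_features, 2))    # brain: intent -> features
decoder = 0.1 * rng.standard_normal((2, n_features))   # machine: features -> velocity

LR_DECODER = 0.05   # the machine adapts quickly
LR_USER = 0.01      # the brain adapts slowly

for trial in range(500):
    intent = targets[trial % len(targets)]
    # The user expresses intent as noisy neural features...
    features = user_encoder @ intent + 0.1 * rng.standard_normal(n_features)
    # ...and the decoder turns those features into a cursor velocity.
    velocity = decoder @ features

    error = intent - velocity  # task error, visible to both learners
    # Machine step: least-mean-squares update of the decoder weights.
    decoder += LR_DECODER * np.outer(error, features)
    # Brain step: the user nudges their encoding toward what the decoder
    # rewards (a crude stand-in for motor learning).
    user_encoder += LR_USER * np.outer(decoder.T @ error, intent)

    if trial % 100 == 0:
        print(f"trial {trial:3d}  |error| = {np.linalg.norm(error):.3f}")
```

Running this prints a shrinking error norm: the decoder learns quickly, the simulated user retunes slowly, and the pair settles into a shared code, which is the intuition behind letting the patient and the robot learn from one another.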

Brain-machine interface research is still in its early stages. Eventually it will not be just motor activity that is recorded and decoded. Someday there may be fully autonomous robots that aid the elderly, and robotic exoskeletons for people affected by spinal cord injury who wish to walk again. The possibilities for this technology grow as our understanding of brain function expands. Being able to offer a quadriplegic patient the chance to regain even basic motor function is extremely exciting, and that is exactly where BMI research is headed.

Here is a clip of a monkey snacking on a few marshmallows using a robotic arm driven by electrodes that record activity in its motor cortex:

Spinal Cord Injury Facts and Statistics – SCI Info Pages

Building robots that learn – Aisha Sohail, CNN.com

Brain-Robot Interaction – BU Neuromorphics Laboratory

Selecting the signals for a brain-machine interface – Current Opinion in Neurobiology

Toward electrocorticographic control of a dextrous upper limb prosthesis: building brain-machine interfaces – IEEE Pulse

