By Leo Shapiro
Brain Signals from One Primate Move Paralyzed Limbs in Another Primate
Researchers from Cornell’s School of Electrical and Computer Engineering and Harvard Medical School’s Department of Neurosurgery have developed a new neural prosthetic that uses activity recorded from premotor neurons to control limb movements in functionally paralyzed primates. This is a step toward brain-machine interfaces that would let paralyzed humans control their own limbs using just their brain activity. Previous research has been limited to controlling external devices such as robotic limbs.
Google, IBM, Microsoft, Baidu, NEC and others use deep learning and neural networks in their most recent speech recognition and image analysis systems. Neural networks have countless other uses, so naturally there are plenty of startups trying to apply them in new ways. The problem now is how to implement neural network models in a way that mimics the circuitry of the brain. Brains are highly parallel and extremely efficient; the ultimate achievement for a neural network model would be to perform large-scale parallel operations while remaining energy efficient. Current technology is not well suited to large-scale parallel processing, and it is nowhere near as efficient as our brain, which uses only about 20 watts of power on average (a typical supercomputer uses on the order of several megawatts of electricity).
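To make the parallelism point concrete, here is a minimal sketch (not any company's actual system; the layer sizes and weights are invented for illustration) of a two-layer neural network forward pass. Writing the computation as matrix products is what lets hardware evaluate every neuron in a layer simultaneously, rather than one at a time:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # Simple nonlinearity applied elementwise to a whole layer at once
    return np.maximum(0.0, x)

# Toy network: 4 inputs -> 8 hidden units -> 3 outputs, random example weights
W1 = rng.standard_normal((4, 8))
W2 = rng.standard_normal((8, 3))

def forward(x):
    hidden = relu(x @ W1)  # one matrix product computes all 8 hidden units in parallel
    return hidden @ W2     # likewise for all 3 outputs

x = rng.standard_normal(4)
print(forward(x).shape)    # a 3-element output vector
```

The gap the article describes is that on conventional chips these "parallel" products are still scheduled across a handful of cores drawing hundreds of watts, whereas the brain runs its equivalent of this computation across billions of neurons at around 20 watts.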
One way in which future computers could mimic the efficiency of the brain is being developed by IBM. IBM believes that energy efficiency, not raw processing power, will guide the next generation of computers. Silicon chips have been doubling in power through Moore’s Law for almost half a century, but are now reaching a physical limit. To break through this limit, researchers at IBM’s Zurich lab headed by Dr. Patrick Ruch and Dr. Bruno Michel want to mimic biology’s allometric scaling for new “bionic” computing. Allometric scaling describes how an animal’s metabolic power grows with its body size; IBM's approach starts with a 3D computing architecture, with processors stacked and memory in between. To keep everything running without overheating, this biologically inspired computer would be powered, and cooled, by so-called electronic blood. The hope is that this fluid can multi-task: just as blood delivers sugar while carrying away heat, it would accomplish liquid fueling and cooling at the same time.
Researchers from the U.S. Department of Energy’s Brookhaven National Laboratory and Thomas Jefferson National Accelerator Facility, Oak Ridge National Laboratory, Johns Hopkins Medical School, the University of Maryland, and Weizmann Institute’s Neurobiology Department have all developed new and improved brain scanning techniques. These new methods allow scientists to monitor brain activity in fully-awake, moving animals.
At Brookhaven, researchers combined light-activated proteins that stimulate specific brain cells, a technique known as optogenetics, with positron emission tomography (PET) to observe the effects of stimulation throughout the entire brain. Their paper in the Journal of Neuroscience describes this method, which will allow researchers to map exactly which neurological pathways are activated or deactivated downstream by stimulation in specific brain areas. Hopefully, following these pathways will enable researchers to correlate the brain activity with observed behaviors or certain symptoms of disease.
Scientists from Duke University and Brazil claim that wiring one rodent’s brain to another’s could allow communication spanning continents via the internet. Professor Miguel Nicolelis of Duke University in Durham, North Carolina, led a team of researchers who demonstrated that it is possible to transmit instructions from one animal to another by brain-to-brain communication, a process akin to telepathy.
Brain-to-brain communication could be the start of organic-based computing based on networks of interconnected brains. Pairs of laboratory rats were able to communicate with each other using microscopic electrodes implanted into their brains. This occurred as part of an experiment where the two rats had to work together in order to receive a reward (see video at source).
The Human Connectome Project (HCP) has started trials on volunteers with a state-of-the-art scanner.
Today’s technology allows neuroscientists to map the brain’s connections on an unprecedented level of detail. The ultimate goal of the HCP is to create a map, or connectome, of every neuron and synapse to better understand how the brain works. A better understanding of the brain means a better understanding of brain disorders like schizophrenia or autism, which in turn means better treatment.
Recently, 23-year-old Kim Suozzi, who was diagnosed with terminal brain cancer, sought financial help for cryonic suspension. Diagnosed with an aggressive form of glioblastoma multiforme, Kim died on January 17th and spent the final two weeks of her life at a hospice in Scottsdale, Arizona, close to the cryopreservation center she chose.
Suozzi’s search for financial help for her suspension proved controversial, but the matter is now settled since the Alcor board agreed to fund her cryopreservation as a charity case, stating: “The board accepted the CEO’s recommendation to accept Kim Suozzi as a charity case, based on arrangements that will reduce Alcor’s costs. The full allocation of $25,000 to the patient care trust fund will be made. Alcor members have contributed to the fundraising effort to enable Kim to be cryopreserved.” More controversial, however, is the possibility that many terminally ill patients might seek preservation as charity cases, potentially impacting the viability of the entire operation. Furthermore, cryopreservation is not a cure in itself; terminally ill patients may not be the best test subjects for a successful preservation and revival.
Neuroscience researchers in China have created a method of transforming brainwaves into music by translating combined EEG and fMRI scans into sounds recognizable to human beings. The EEG adjusts the pitch and duration of a note, while the fMRI controls the intensity of the music. According to Jing Lu and his colleagues from the University of Electronic Science and Technology of China, this brain music “embodies the workings of the brain as art, providing a platform for scientists and artists to work together to better understand the links between music and the human brain.”
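The division of labor described above can be sketched in a few lines. This is a hedged illustration, not the authors' actual algorithm: the scaling constants, function name, and feature choices are invented, but the structure follows the article, with EEG features setting each note's pitch and duration and an fMRI-derived signal setting its loudness:

```python
def brain_to_notes(eeg_amplitudes, eeg_periods, fmri_bold):
    """Turn parallel EEG/fMRI samples into (midi_pitch, seconds, velocity) notes."""
    notes = []
    for amp, period, bold in zip(eeg_amplitudes, eeg_periods, fmri_bold):
        # EEG amplitude -> pitch in a two-octave range around middle C (MIDI 60)
        pitch = 60 + int(amp * 12) % 24
        # EEG wave period -> note duration, floored so every note is audible
        duration = max(0.1, period)
        # fMRI BOLD level -> loudness, mapped into the MIDI velocity range 40-127
        velocity = int(40 + 87 * min(bold, 1))
        notes.append((pitch, duration, velocity))
    return notes

print(brain_to_notes([0.5, 1.2], [0.25, 0.5], [0.3, 0.9]))
```

A real pipeline would extract these features from filtered EEG channels and preprocessed BOLD time series, then render the note list through a synthesizer, but the core idea — one modality shaping melody and rhythm, the other shaping dynamics — is captured here.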
Applying EEG and fMRI data to make better music represents the limitless opportunities of the brain, potentially leading to improvements useful for research, clinical diagnosis or biofeedback therapy. In fact, researchers at the Department of Homeland Security’s Science and Technology Directorate have already looked at a form of neuro-training called ‘Brain Music’, which uses music created from an individual’s brain waves to help the individual move from an anxious state to a relaxed state.
We live in an era where the rapid advances in technology are constantly changing how we perceive and interact with the world around us. The question on everyone’s mind is always “what’s next?” The answer: brain-machine interfaces. For the average consumer, brain-computer interfaces are becoming increasingly available on the mass market and their current uses offer a wide range of fascinating opportunities.
A company that’s been in the news a lot lately is NeuroVigil. Its product, known as the iBrain, has been used to help world-renowned astrophysicist Stephen Hawking communicate with a computer simply by thinking. Hawking, who suffers from Lou Gehrig’s disease, developed his own solution that allows him to speak by twitching his cheek to select words on a computer. In its current state, the iBrain is still slower than Hawking’s solution, but NeuroVigil’s founder, Philip Low, hopes that it will eventually be possible to read thoughts aloud. NeuroVigil also made the news by signing a contract with Roche, a major Swiss pharmaceutical company, to use the iBrain in clinical studies evaluating drugs for neurological diseases.
A research team led by Laura Matzen at Sandia National Laboratories in Albuquerque, NM has demonstrated that it is possible to predict how well people will remember information by monitoring their brain activity while they study. Matzen’s team monitored test volunteers with electroencephalography (EEG) sensors to make accurate predictions. Why bother making a prediction if the result will show how well someone remembered the information anyway? Matzen offered this example: “If you had someone learning new material and you were recording the EEG, you might be able to tell them, ‘You’re going to forget this, you should study this again,’ or tell them, ‘OK, you got it and go on to the next thing.’” The method essentially provides a real-time performance metric, the applications of which many students would appreciate.
Researchers have developed a technique that reconstructs the words patients are thinking of, which could help locked-in or comatose patients communicate.
A newly developed computer model reconstructs the sounds of words that patients think of. Over the past few years, scientists have been coming closer to being able to listen in on our thoughts. This study achieved that goal by implanting electrodes directly into patients’ brains. In an earlier 2011 study, test subjects with electrodes in their brains were able to move a cursor around a screen just by thinking of different vowel sounds. Another study, conducted in September of that year by Jack Gallant at the University of California, Berkeley, was able to guess images being thought of using functional magnetic resonance imaging (fMRI).