A groundbreaking new study by Susumu Tonegawa’s team at MIT has reopened debate over the ethics of neuroscience. In Tonegawa’s experiment, neuroscientists implanted memories into the brains of mice using optogenetics, a technique in which specific cells can be turned on or off by exposure to a particular wavelength of light. The memory manipulated in this study was a conditioned fear response in mice to a mild electrical foot shock.
Researchers in Tonegawa’s lab began by engineering mouse hippocampal cells to express channelrhodopsin, a light-sensitive protein that makes the neurons carrying it fire when illuminated. Channelrhodopsin was produced only when c-fos, a gene necessary for memory formation, was turned on, so the cells encoding a new memory were precisely the ones that got labeled. On day one, the engineered mice explored Room A without any foot shock and behaved normally; as they explored, the cells encoding the Room A memory were labeled with channelrhodopsin. On day two, the same mice were placed into Room B, a distinctly different room, where they received a foot shock and exhibited a fear response. While the mice were being shocked, the labeled Room A cells were reactivated with light, causing the fear to be encoded not only for Room B but for Room A as well. To test whether this false association had formed, the mice were returned to Room A on day three: despite never having been shocked there, they froze in fear.
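To make the logic of the three-day design concrete, here is a toy simulation in Python. It is purely illustrative: the class, the set-based "cells," and the `light_on` flag are invented for this sketch, and real engram labeling is vastly more complicated than tagging a room name.

```python
# Toy model of the experiment's logic (illustrative only; the real study
# used living mice, not simulated cells). All names here are hypothetical.

class Mouse:
    def __init__(self):
        self.labeled_cells = set()   # memories tagged with channelrhodopsin
        self.fear_links = set()      # memories associated with shock

    def explore(self, room):
        # c-fos-driven labeling: cells active in this context get tagged
        self.labeled_cells.add(room)

    def shock(self, room, light_on=False):
        # the shock is paired with whatever memory is currently active
        self.fear_links.add(room)
        if light_on:
            # optogenetic reactivation makes the labeled memory active too,
            # so the fear becomes linked to it as well
            self.fear_links |= self.labeled_cells

    def test(self, room):
        return "freezes" if room in self.fear_links else "behaves normally"

m = Mouse()
m.explore("Room A")                # day 1: label the Room A memory cells
m.shock("Room B", light_on=True)   # day 2: shock while reactivating Room A cells
print(m.test("Room A"))            # day 3: freezes -> a false fear memory
```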
Before babies can crawl or walk, they explore the world around them by looking at it. This is a natural and necessary part of infant development, and it sets the stage for future brain growth. By using eye-tracking technology, scientists were able to measure the way infants look at and respond to different social cues. This new research suggests that babies who are reluctant to look into people’s eyes may be showing early signs of autism.
Researchers at the Marcus Autism Center, Children’s Healthcare of Atlanta, and Emory University School of Medicine followed babies from birth until age 3 and discovered that infants later diagnosed with autism showed declining attention to other people’s eyes from the age of 2 months onward.
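To see what "declining attention" means in eye-tracking terms, here is a minimal sketch. The fixation percentages below are invented for illustration, not the study’s data; the point is that the diagnostic signal is a negative slope in eye-looking over age.

```python
import numpy as np

# Hypothetical fixation data (percent of looking time spent on the eyes),
# invented for illustration -- NOT the study's actual numbers.
ages_months = np.array([2, 6, 12, 18, 24])
typical     = np.array([40, 42, 43, 44, 45])   # stable or rising attention to eyes
later_asd   = np.array([40, 34, 28, 22, 18])   # declining from 2 months on

# A simple linear fit captures the "declining attention" signature:
# a negative slope in eye fixation over age.
for label, y in [("typical", typical), ("later ASD", later_asd)]:
    slope = np.polyfit(ages_months, y, 1)[0]
    print(f"{label}: {slope:+.2f} % per month")
```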
It’s just about that time of year again – in just over a week we’ll be sitting down to a huge feast of turkey, stuffing, and mashed potatoes; we’ll be watching the Macy’s Parade, soon to be followed by two football games; and we’ll be giving thanks for our reunion with our grandparents, uncles, aunts, cousins, brothers, sisters, parents, and more. Thanksgiving definitely holds a special place in my heart – however, until recently, it always brought just a little bit of stress. That is because, at least in my family, somewhere between polishing off the last roll and preparing for pecan pie, one relative or another always asks me, “So what are you studying in school again?” And when I answer “Neuroscience!” I typically get one of two responses: the confused look, followed by “Neuroscience? What is neuroscience?” (typically from the older crowd in the room), or the rolling of the eyes, followed by “What are you going to do with a degree in neuroscience?” (typically from the former engineers and business majors). I love neuroscience, and I know I’ve found my passion studying it here at BU, but those questions always carried a certain pressure that I felt I buckled under. Recently, though, I discovered the perfect way to answer both questions, and I’m here to let you in on the secret so you can impress your relatives at the Thanksgiving dinner table as well. This year, when Grandma or Uncle Tony asks me “Why neuroscience?” my answer will be simple: because neuroscience is changing, and will continue to change, the world and how we approach it.
I can already imagine the taken-aback look crossing my relatives’ faces, and the comment that I’m perhaps being a little dramatic – neuroscience is changing the world? Not only will my answer definitely get their attention, but I’m confident it is correct, and proving my point to my disbelieving family will only make Thanksgiving that much more fun. Neuroscience is the science of understanding the nervous system (the system that underlies essentially all of our functioning) on a basic scientific level, and then applying that knowledge to do everything from eradicating the diseases that plague the system (Alzheimer’s, Parkinson’s) to helping students of all ages learn to their full potential in the classroom. If you take a step back and view the whole picture, it’s not surprising that neuroscience will change the world in our lifetime; unlike some other fields, neuroscience is constantly acquiring completely new information about systems that not long ago were a complete mystery, and this knowledge is already being applied in the real world to make beneficial changes. I will quickly outline two fascinating new avenues of neuroscience that are changing the world right before our eyes, so that you have solid proof to further widen the eyes of your relatives this holiday season.
As we approach the beloved holiday season, we also approach the dreaded weight gain that comes along with it. It probably won’t come as a surprise that our brain, specifically the hippocampus, plays a role in whether we give in to immediate temptation or hold out for a delayed reward.
The hippocampus deals with memory, including recalling past events and imagining future ones. A study called “A Critical Role for the Hippocampus in the Valuation of Imagined Outcomes” examined healthy people as well as people with Alzheimer’s disease, which impairs memory and is associated with atrophy of the hippocampus. The study looked at “time-dependent” choices involving money in addition to “episodic” choices involving food, sports, and cultural events.
Imagine for a second feeling “overcomplete.” You have all of your limbs, and they are perfectly healthy. Yet you feel as though your leg doesn’t belong to you. It shouldn’t be there, and you know it needs to go. The only way you can feel “whole” again is through its removal. You might be wondering why anyone would want to get rid of a perfectly healthy limb. This phenomenon is the result of apotemnophilia, or Body Integrity Identity Disorder (BIID), a condition characterized “by the intense and long-standing desire for the amputation of a specific limb.” Most patients know exactly where they want the line of amputation to be, and this desired line stays fairly constant over time. Since it is rare for a surgeon to agree to amputate a healthy limb, many people suffering from this condition unfortunately resort to attempting the amputation themselves. In the past, this rare disorder was thought to be purely psychological; perhaps the yearning for amputation was simply a way of seeking attention. In recent years, however, studies have shown otherwise.
One study, performed by David Brang, Paul D. McGeoch, and Vilayanur S. Ramachandran, suggests that apotemnophilia is indeed a neurological disorder. The otherwise healthy subjects went through a series of blind skin conductance response tests in which they were pinpricked above and below the line where they desired the amputation to be. The responses to pinpricks below the line were much greater than those above it, suggesting abnormal processing of somatosensory input from the body part in question.
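A rough sketch of the comparison the study relied on, using hypothetical skin conductance values (the real paper reports its own measurements):

```python
import numpy as np

# Hypothetical skin conductance responses (in microsiemens), invented for
# illustration -- not the published data. Each value is one pinprick trial.
above_line = np.array([0.11, 0.09, 0.13, 0.10, 0.12])  # territory felt as "self"
below_line = np.array([0.31, 0.28, 0.35, 0.30, 0.33])  # territory felt as foreign

# The study's key comparison: responses below the desired amputation line
# were markedly larger than responses above it.
print(f"mean above: {above_line.mean():.2f} uS")
print(f"mean below: {below_line.mean():.2f} uS")
print(f"ratio: {below_line.mean() / above_line.mean():.1f}x")
```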
The researchers suspect the disorder is a consequence of damage to the right parietal lobe, which processes sensory input from these areas and helps form our sense of body image. Dysfunction within the right parietal lobe, specifically the right superior parietal lobule, may cause irregular sympathetic output; the discrepancy between these signals and the brain’s body image can lead to the feeling that a certain limb doesn’t belong and must be removed in order to feel “complete.” In addition, most patients have dealt with these feelings since childhood, suggesting that the dysfunction is congenital.
Google, IBM, Microsoft, Baidu, NEC, and others use deep learning and neural networks in their most recent speech recognition and image analysis systems. Neural networks have countless other uses, so naturally there are plenty of startups trying to apply them in new ways. The problem now is how to implement neural network models in a way that mimics the circuitry of the brain. Brains are highly parallel and extremely efficient; the ultimate neural network model would perform large-scale parallel operations while remaining energy efficient. Current technology is not well suited for large-scale parallel processing, and it is nowhere near as efficient as our brain, which uses only about 20 watts of power (a typical supercomputer uses on the order of several megawatts).
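For readers who haven’t met a neural network before, here is a minimal sketch of the core operation, assuming nothing beyond NumPy. The layer sizes are arbitrary; the point is that one matrix multiply computes every artificial neuron at once, which is exactly the kind of parallelism that ordinary serial hardware struggles to deliver efficiently.

```python
import numpy as np

# A minimal artificial "neuron layer": each output is a weighted sum of all
# inputs passed through a nonlinearity. The matrix multiply is the parallel
# part -- every neuron computes at once, which is why brains (massively
# parallel) and GPUs map onto this so well. Sizes here are arbitrary.
rng = np.random.default_rng(0)
x = rng.random(784)                  # e.g., a flattened 28x28 image
W = rng.standard_normal((128, 784))  # 128 neurons, each seeing all 784 inputs
b = np.zeros(128)

hidden = np.maximum(0, W @ x + b)    # ReLU activation: fire or stay silent
print(hidden.shape)                  # (128,) -- one value per neuron
```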
One way future computers could mimic the efficiency of the brain is being developed by IBM, which believes that energy efficiency, not raw processing power, will guide the next generation of computers. Silicon chips have been doubling in power under Moore’s Law for almost half a century, but they are now reaching a physical limit. To break through it, researchers at IBM’s Zurich lab, headed by Dr. Patrick Ruch and Dr. Bruno Michel, want to mimic biology’s allometric scaling for a new “bionic” computing. Allometric scaling describes how an animal’s metabolic power grows with its body size, with larger animals using proportionally less energy per unit of mass. IBM’s approach starts with 3D computing architecture: processors stacked with memory in between. To keep everything running without overheating, this biologically inspired computer would be powered, and cooled, by so-called electronic blood. The hope is that this fluid will multitask the way blood does, which supplies sugar while carrying away heat, accomplishing liquid fueling and cooling at the same time.
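Allometric scaling can be made concrete with Kleiber’s law, under which metabolic rate grows roughly as body mass to the 3/4 power. The constant and the animal masses below are rough illustrative values, not IBM’s figures:

```python
# Kleiber's law: metabolic rate scales roughly with mass^(3/4), so bigger
# animals spend proportionally *less* energy per kilogram. The constant k
# and the masses are rough illustrative values.
def metabolic_rate_watts(mass_kg, k=3.4, exponent=0.75):
    return k * mass_kg ** exponent

for name, mass in [("mouse", 0.02), ("human", 70), ("elephant", 5000)]:
    rate = metabolic_rate_watts(mass)
    print(f"{name:9s} {rate:8.1f} W total, {rate / mass:7.2f} W/kg")
```

Run it and the per-kilogram cost falls as the animal gets bigger, which is the efficiency trend IBM wants its densely packed, fluid-cooled chips to follow.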
Huntington’s disease, a neurodegenerative disorder, affects roughly 5-10 out of every 100,000 people. The disease deteriorates many structures in the brain, beginning with the caudate nucleus, which is involved in motor control. By the end of their lives, patients can expect to lose about 30% of their brain mass.
In the last few decades much has been learned about the disease: we now know that it is genetic, caused by a mutation on the 4th chromosome that makes the huntingtin protein grow too long and misfold. A parent with Huntington’s disease has a 50% chance of passing it on to each child. Symptoms usually appear between 35 and 45 years of age, and life expectancy after the first symptoms appear is 10-20 years. There is currently no treatment for Huntington’s itself, so patients can only get modest relief from their symptoms as the disease progresses.
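The “too long” mutation is an expanded run of CAG repeats in the HTT gene; each extra CAG adds one glutamine to the huntingtin protein. Here is a minimal sketch of repeat counting, using a made-up toy sequence and approximate clinical thresholds:

```python
import re

# Huntington's mutation is an expanded run of CAG repeats in the HTT gene
# (chromosome 4); each CAG adds one glutamine to the huntingtin protein.
# Roughly: under ~36 repeats is normal, ~40 or more causes the disease.
# The sequence below is a made-up toy snippet, not real HTT sequence.
def longest_cag_run(dna: str) -> int:
    runs = re.findall(r"(?:CAG)+", dna.upper())
    return max((len(r) // 3 for r in runs), default=0)

toy_allele = "ATG" + "CAG" * 44 + "CAACAGCCGCCA"
n = longest_cag_run(toy_allele)
print(n, "repeats ->", "expanded (disease range)" if n >= 40 else "normal range")
```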
So far there is no real consensus among scientists about how exactly the disease works, but it is generally agreed that it all starts with the mutated huntingtin protein. What if that protein were somehow changed or blocked? This idea prompted experimentation by Holly Kordasiewicz and her colleagues.
Have you ever sat down to write a paper at 10pm the night before it is due? It is like setting out to run a marathon. You have all your necessary snacks lined up on the desk, the perfect playlist is on, breaks are not an option, and it will probably take you around 5 hours (more or less, depending on how much research and dedication you brought to this moment). Most people are more likely to encounter this type of marathon than an actual marathon; however, according to recent research in psychophysiology, marathons require just as much mental effort as the aforementioned paper.
Samuele Marcora, a professor and researcher at the University of Kent, has recently theorized that the perceived mental effort during physical activity is just as important as how strenuous the activity actually is. Previously, the common view relied on the afferent feedback model – “that perception of effort results from the complex integration of sensory signals from the active muscles and other organs” – and put little weight on what the person actually perceives to be difficult. In exercise physiology it has generally been assumed that once a subject exerts a certain amount of energy, exhaustion is reached because there are no more stores of energy to draw upon. Marcora’s studies suggest that this is not the case. Marathon runners are still able to function after a marathon, before refueling, even though they could not push any harder during the race; there may be plenty of glycogen (the body’s stored fuel) left, yet the runner is unable to keep pushing. According to these new studies, the effort a person can actually sustain during strenuous exercise, as long as enough glycogen is present, depends on the person’s ‘brain strength’ to push past the mental boundaries created by negative thoughts, boredom, and a build-up of neurotransmitters – something Marcora believes runners can train to make easier.
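A toy sketch of that idea, with every number invented for illustration: the simulated runner stops when perceived effort hits a tolerance ceiling, long before the fuel runs out.

```python
# Toy sketch of Marcora's psychobiological idea: the runner stops when
# *perceived* effort exceeds the maximum they are willing to tolerate,
# not when fuel runs out. All numbers are invented for illustration.
def run(minutes, glycogen=100.0, burn_per_min=0.4,
        base_effort=8.0, effort_growth=0.35, max_tolerated=40.0):
    effort = base_effort
    for t in range(1, minutes + 1):
        glycogen -= burn_per_min          # fuel drains slowly
        effort += effort_growth           # perceived effort climbs steadily
        if effort >= max_tolerated:
            return f"stopped at {t} min with {glycogen:.0f}% glycogen left"
    return f"finished with {glycogen:.0f}% glycogen left"

print(run(120))   # stops around 92 min with ~63% of the fuel untouched
# Training, in this view, raises max_tolerated or slows effort_growth,
# letting the runner use more of the fuel that was there all along.
```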
Stressed out? You may be at a higher risk for Alzheimer’s disease. You’re probably wondering how that is possible: highly intelligent people who use their brains all the time, like scientists, CEOs, and presidents, deal with stress on a day-to-day basis. The truth is that a lack of higher education or brain activity is not the only major risk factor for dementia.
If keeping your brain active is a good way to prevent cognitive decline, then why did people such as Ronald Reagan and Norman Rockwell develop Alzheimer’s disease? The answer is stress. Recent studies have shown that people who deal with high levels of stress in their career or family life are more likely to develop dementia. Stress cannot be said to directly cause dementia, but it can act as a trigger for the degenerative process in the brain.
An Argentine research team examined 118 patients with Alzheimer’s disease and 81 healthy individuals comparable in age, gender, and educational level. Both groups were asked about the amount of stress they had faced in the past three years. The researchers reported that 72% of the Alzheimer’s patients had coped with severe emotional stress or grief, such as the death of a loved one or financial problems – nearly three times the proportion in the control group.
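A quick check of the article’s arithmetic (the group sizes come from the study; the control-group percentage is only implied by “nearly three times,” so it is inferred here):

```python
# Quick check of the article's numbers (group sizes from the study;
# the control-group percentage is inferred from "nearly three times").
alzheimers_n, control_n = 118, 81
alz_stress_pct = 0.72
control_stress_pct = alz_stress_pct / 3        # ~24%, implied by the text

print(f"Alzheimer's patients reporting severe stress: ~{alz_stress_pct * alzheimers_n:.0f} of {alzheimers_n}")
print(f"Implied control participants:                 ~{control_stress_pct * control_n:.0f} of {control_n}")
print(f"Ratio of proportions: {alz_stress_pct / control_stress_pct:.1f}x")
```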
A man or a woman could be of above-average intelligence, well-educated in medicine and psychology, and understand that every statistical measurement of personality and temperament is distributed across a bell curve in a large enough population. He or she could comprehend the determining power of genetics, the impact of cultural influence on belief systems, and how neuroplasticity molds our mental processing to respond to environmental stimuli.
A clever, sophisticated professional could understand all of this, yet still believe that somehow, all members of the opposite sex are manipulative and irrational. Not some, all 3.5 billion of them. Why?
Perhaps throughout his or her life, this person has chosen to socialize with particularly irrational and emotional people of the opposite sex, and through those experiences has formed a gender bias. Research shows we tend to sort people first into categories of gender, race, and age; this sorting is so pervasive that scientists have deemed it our “primitive” categorization.
Once we’ve made up our mind about a group, or even about someone in particular (consciously or not), it’s often hard to change our opinion. When beliefs are formed, confirmation bias kicks in: we look for information that supports our views and selectively ignore everything that doesn’t. Maybe someone decided you were shy and uptight when you first met, because you were more reticent than usual after only 3 hours of sleep the night before. Now that acquaintance may not notice all the times you’re friendly and outgoing, but instead seems to pounce on all the times you’re a little quiet.