Do you ever think about your childhood, replaying an event that happened 15 years ago so vividly that it seems like it happened yesterday? Do you ever hear something that sounds like your favorite song and start singing it? These are memories formed in your brain that are replayed in response to a specific stimulus. For a long time, scientists believed that memories were formed, processed, and then stored in specific locations in the brain. Dr. Wilder Penfield was one of the first to stumble upon evidence for this idea. In the 1940s he electrically stimulated different areas of his patients’ brains while they were under local anesthesia and found that stimulating a given region would elicit specific memories from the patient’s life (see video below). For example, when he stimulated one patient’s temporal lobe (auditory cortex), she began to hum her favorite song out loud. This suggested that the memory of the song was stored where it was originally processed (i.e., the auditory cortex, which processed the song the first time she heard it). Penfield concluded that the cortex (the outer layers of the brain) stored the “complete record of the stream of consciousness; all those things in which a man was aware at any time…” Until recently, this view was widely accepted.
What is music? It’s something I listen to when I want to relax or when I want to focus. If I’m missing home, I listen to Bollywood. When it’s Christmas, I listen to carols, both classic and modern. So clearly, I think of music as a source of entertainment. In fact, in both ancient and modern times, music has been a key component of celebrations, like weddings and cultural events. Interestingly, scientific research on music and the brain has shown that music offers benefits beyond entertainment alone.
I think I’m funny. Some people say I’m funny. But when the moment presents itself where it’s my time to shine, all lights on me, this ‘one’ is going to be a knee slapper…nope, not so much. The first time I realized I wasn’t funny was in the eleventh grade, in my calculus class. My teacher’s name was Mr. Butke, and he easily ranks in my ‘all-time’ top three of the math teachers I’ve encountered in my lifetime. He had a mustache that covered his mouth, and you never knew whether he was smiling, smirking, or grimacing at you. It kept you guessing; I liked that. He also told stories of how he slayed cobras in Kenyan villages while pursuing a multi-purpose cure for malaria, encephalitis of sorts, and maybe AIDS. Bottom line, he was memorable, and his stage presence resonated with my classmates and me.
Philosophers since the time of Plato have considered the extent to which we can truly perceive the physical world, or the so-called ‘mind-independent’ universe. Modern science has given us further insight into the question, through experiments designed to understand the way our brain receives and manipulates sensory information. While it has been known for some time that human perception is subject to various priming effects and spatiotemporal biases, psychologists at the University of California, Berkeley have discovered that visual perception is also influenced by something called the ‘continuity field.’
To put it simply, the continuity field is what allows us to perceive our surrounding environment as continuous. In a recent article in Nature Neuroscience, David Whitney and his colleagues showed that our perception of an object’s orientation is actually strongly biased toward the orientation that object had up to 10 seconds earlier. This means that our brain ‘smooths out’ small changes in the physical world so that we perceive a continuous image. Without the influence of this continuity field, we would be hypersensitive to the smallest changes in our visual field, and presumably have trouble determining which changes in our surroundings are most relevant to our immediate needs.
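To get a feel for the idea, here is a toy sketch (my own illustration, not the authors’ actual model; the function name and the `pull` value are made up) of how a perceived orientation could be attracted toward a recently seen one:

```python
# Toy "continuity field": report of the current orientation is pulled
# toward the orientation seen ~10 seconds earlier. Illustration only.

def perceive(current_deg, past_deg, pull=0.25):
    """Blend the current stimulus orientation (in degrees) with a
    recently seen one. `pull` is a hypothetical attraction strength."""
    return current_deg + pull * (past_deg - current_deg)

# An object now tilted 20 degrees, seen shortly after a 35-degree view
# of it, is reported partway toward the earlier orientation:
print(perceive(20.0, 35.0))  # 23.75
```

With `pull=0`, perception would track every tiny fluctuation; a modest pull smooths the percept over time, which is the trade-off the article describes.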
Recently, The Atlantic posted an article relating the growing field of neuroscience to international negotiations, specifically those surrounding the Iranian nuclear talks. Co-written by a neuroscientist and an expert in international relations, the article prompted a rather stern and testy response from Christian Jarrett, a science writer for Wired – a response that nonetheless raises some excellent points. Before continuing, I urge you to read The Atlantic’s article here.
Although it may be well-intentioned, The Atlantic’s article appears to be little more than an attempt to grab headlines and call more attention to the piece, riding the hype trains of two popular subjects. While applying concepts from neuroscience to international negotiations might seem like something that could deepen our understanding, realistically it only serves to dilute the field. At best, it is a misguided attempt at drawing connections between fields. At worst, it is another example of today’s journalism: lazy and prone to clickbait.
What’s the latest on all that news about concussions in the NFL? It’s been a while since this story first made its way into headlines and penetrated news and popular culture around the country. The neuroscience world continues to bustle over the story, and research on the disease now known as chronic traumatic encephalopathy, or CTE, remains a priority right here at the Boston University School of Medicine’s Center for the Study of Traumatic Encephalopathy (CSTE), directed by Dr. Robert Cantu, Dr. Anne McKee, and Mr. Chris Nowinski. Especially since the release last October of the Frontline documentary “League of Denial” (see below), which explores the possibility that the NFL has mishandled the concussion issue for a number of years, more and more families in the United States and around the world have begun to understand the potential repercussions of traumatic head injury.
A groundbreaking new research study by Susumu Tonegawa’s team at MIT has once again opened up debate over the ethics of neuroscience. In Tonegawa’s experiment, neuroscientists were able to implant memories into the brains of mice using optogenetics, a technique in which specific cells can be turned on or off by exposure to light of a certain wavelength. The specific memory manipulated in this study was a conditioned fear response in mice to a mild electrical foot shock.
Researchers in Tonegawa’s lab began by engineering mouse hippocampal cells to express channelrhodopsin, a protein that allows the neurons carrying it to be activated by light. Channelrhodopsin was also modified to be produced whenever c-fos, a gene that is switched on during memory formation, was active – so only cells actively encoding a memory would be labeled. On day one, the engineered mice explored Room A without any foot shock and behaved normally; as they explored, the cells encoding the Room A memory were labeled with channelrhodopsin. On day two, the same mice were placed in Room B, a distinctly different room, where they received a foot shock and exhibited a fear response. While the shocks were delivered, the researchers used light to reactivate the labeled Room A cells, causing the fear response to become associated not only with Room B but with Room A as well. To test this hypothesis, the mice were returned to Room A on day three.
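The logic of the three-day protocol can be summarized with a deliberately simplified toy model (Boolean bookkeeping standing in for real biology; the class and method names are mine, not the paper’s):

```python
# Toy model of the engram-labeling protocol. Illustration only:
# real optogenetics involves neurons and light, not sets and strings.

class Mouse:
    def __init__(self):
        self.labeled = set()     # room ensembles tagged with channelrhodopsin
        self.fear_assoc = set()  # room ensembles linked to the fear response

    def explore(self, room, labeling_on=False):
        """Day 1: exploring a room activates that room's memory ensemble;
        if c-fos-driven labeling is on, that ensemble gets tagged."""
        if labeling_on:
            self.labeled.add(room)

    def shock(self, room, light_on=False):
        """Day 2: a foot shock links fear to every ensemble active now --
        the current room, plus any light-reactivated labeled ensemble."""
        self.fear_assoc.add(room)
        if light_on:
            self.fear_assoc |= self.labeled

    def freezes_in(self, room):
        """Day 3 test: fear response if this room's ensemble is fear-linked."""
        return room in self.fear_assoc

mouse = Mouse()
mouse.explore("Room A", labeling_on=True)  # day 1: label Room A ensemble
mouse.shock("Room B", light_on=True)       # day 2: shock + reactivate Room A cells
print(mouse.freezes_in("Room A"))          # True: the implanted "false" fear memory
print(mouse.freezes_in("Room B"))          # True: the genuine fear memory
```

A control mouse shocked without the light would freeze only in Room B, which is what makes the Room A freezing in the real experiment so striking.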
It’s just about that time of year again – in just over a week we’ll be sitting down to a huge feast of turkey, stuffing, and mashed potatoes; we’ll be watching the Macy’s Parade, soon to be followed by two football games; and we’ll be giving thanks for our reunion with our grandparents, uncles, aunts, cousins, brothers, sisters, parents, and more. Thanksgiving definitely holds a special place in my heart – however, until recently, it always provided just a little bit of stress. That is because, at least in my family, somewhere between polishing off the last roll and preparing for pecan pie, one relative or another always asks me, “So what are you studying in school again?” And when I answer “Neuroscience!” I typically get one of two responses: the confused look, followed by “Neuroscience? What is neuroscience?” (typically from the older crowd in the room), or the rolling of the eyes, followed by “What are you going to do with a degree in neuroscience?” (typically from the former engineers and business majors). I love neuroscience, and I know I’ve found my passion studying it here at BU, but those questions always seemed to bring with them a certain pressure that I cracked beneath. However, I recently discovered the perfect way to address both of these questions, and I’m here to let you in on the secret so you can impress your relatives at the Thanksgiving dinner table as well. This year, when Grandma or Uncle Tony asks me “why neuroscience?” my answer will be simple – because neuroscience is changing, and will continue to change, the world and how we approach it.
I can already imagine the taken-aback look crossing my relatives’ faces, and the comment that I’m perhaps being a little dramatic – neuroscience is changing the world? Not only will my answer definitely get their attention, but I’m confident that it is correct, and proving my point to my disbelieving family will only make Thanksgiving that much more fun. Neuroscience is the science of understanding the nervous system (the system that essentially underlies all of our functioning) on a basic scientific level, and then applying that knowledge to do any number of things, from eradicating the diseases that plague the system (Alzheimer’s, Parkinson’s) to applying what we learn in the classroom so that students of all ages can reach their full potential. If you take a step back and view the whole picture, it’s not surprising that neuroscience will change the world in our lifetime; unlike some other fields, neuroscience is constantly acquiring completely new information about systems that not long ago were a complete mystery – and that knowledge is already being applied in the real world to make beneficial changes. I will quickly outline two fascinating new avenues of neuroscience that are changing the world right before our very eyes, so that you have solid proof to further widen the eyes of your relatives this holiday season.
Most of us have heard of them, but only a few appreciate their power. It was more than 20 years ago that scientists discovered the fascinating mirror neurons, at the University of Parma, Italy, where the first glimpse of them occurred. The study’s focus was actually to examine motor neurons involved in hand and mouth actions in macaque monkeys. The basic procedure of the experiment involved monkeys reaching for food while researchers recorded firing in particular neurons. What these researchers found was that some neurons fired even when a monkey was not moving but was merely watching someone else perform an action. So, one may easily deduce the definition: mirror neurons are neurons that fire both when an animal performs an action and when it observes someone else perform that action. Nearly 20 years after the initial macaque experiment, mirror neuron studies are still generating fascinating results. The reason for this fascination is that mirror neurons underlie extremely important functions such as socialization, empathy, and teaching.
More recent discoveries have shown that mirror neurons are critical in the interpretation of both facial expressions and body language. Moreover, mirror neurons enable us to understand, empathize, and socialize with others. As studies have shown, autistic individuals have trouble understanding other people’s intentions and feelings while observing their actions. This is believed to be due, at least in part, to a malfunctioning mirror neuron system, which impairs these individuals’ ability to comprehend someone else’s actions through observation. In contrast, people with a well-functioning mirror neuron system have no problem understanding other people’s intentions – which is exactly what makes mirror neurons so important.
In my vision modeling class this week, we were learning about the structure of the (primate) visual cortex, and one of my classmates posed an interesting question: how do birds sustain such amazing visual acuity when they don’t seem to have the cortical volume to process that detailed information? In other words, how does a bird brain deal with a bird’s-eye view? I was curious – and I still am, because so far I have not found much research on the topic. Indeed, I imagine it’s difficult to come up with a definitive way to determine what a bird is experiencing for the sake of a laboratory experiment. If I had to hazard a guess, perhaps much of a bird’s reaction to what it sees relies on more primitive structures – maybe birds rely more on instinct than interpretation? While this remains mysterious, scientists do know some neat things about how birds’ eyes function in ways that allow them to see what we can’t. Check it out!