I think I’m funny. Some people say I’m funny. But when the moment presents itself where it’s my time to shine, all lights on me, this ‘one’ is going to be a knee slapper…nope, not so much. The first time I realized I wasn’t funny was in the eleventh grade, in my calculus class. My teacher’s name was Mr. Butke, and he easily ranks in my all-time top three of the math teachers I’ve encountered in my lifetime. He had a mustache that covered his mouth, so you never knew whether he was smiling, smirking, or grimacing at you. It kept you guessing; I liked that. He also told stories of how he slayed cobras in Kenyan villages while pursuing a multi-purpose cure for malaria, encephalitis of sorts, and maybe AIDS. Bottom line: he was memorable, and his stage presence resonated with my classmates and me.
Philosophers since the time of Plato have considered the extent to which we can truly perceive the physical world, or the so-called ‘mind-independent’ universe. Modern science has given us further insight into the question through experiments designed to understand the way in which our brain receives and manipulates sensory information. While it has been known for some time that human perception is subject to various priming effects and spatiotemporal biases, psychologists at the University of California, Berkeley have discovered that visual perception is also influenced by something called the ‘continuity field.’
To put it simply, the continuity field is what allows us to perceive our surrounding environment as continuous. In his recent article in Nature Neuroscience, David Whitney and his colleagues have shown that our perception of the orientation of an object in our visual field is actually strongly biased towards that object’s orientation 10 seconds prior. This means that our brain ‘smoothes out’ small changes in the physical world so that we perceive a continuous image. Without the influence of this continuity field, we would be hypersensitive to the smallest changes in our visual field, and presumably have trouble determining which changes in our surroundings are most relevant to our immediate needs.
Recently, The Atlantic posted an article relating the growing field of neuroscience to international negotiations, specifically those surrounding the Iranian nuclear talks. Co-written by a neuroscientist and an expert in international relations, the article prompted a rather stern and testy response from Christian Jarrett, a science writer for Wired; testy or not, Jarrett brings up some excellent points. Before continuing, I urge you to read The Atlantic‘s article here.
Although it may be well-intentioned, The Atlantic’s article appears to be little more than an attempt to grab headlines and call more attention to the piece by riding the hype trains of two popular subjects. While applying concepts from neuroscience to news and international negotiations might seem like something that could contribute to our understanding, realistically it only serves to dilute the field. At best, it is a misguided attempt at drawing connections between fields. At worst, it is another example of today’s journalism: lazy and prone to clickbait.
What’s the latest on all that news about concussions in the NFL? It’s been a while since this story initially made its way into headlines and penetrated news and popular culture around the country. The neuroscience world continues to bustle over the story, and research on the disease now known as chronic traumatic encephalopathy, or CTE, remains a priority right here at the Boston University School of Medicine in the Center for the Study of Traumatic Encephalopathy (CSTE), directed by Dr. Robert Cantu, Dr. Anne McKee, and Mr. Chris Nowinski. Especially since the release last October of the Frontline documentary “League of Denial” (see below), which explores the possibility that the NFL has mishandled the concussion issue for a number of years, more and more families in the United States and all over the world have begun to understand the potential repercussions of traumatic head injury.
A groundbreaking new study by Susumu Tonegawa’s team at MIT has once again opened up the ethics of neuroscience for debate. In Tonegawa’s experiment, neuroscientists were able to implant memories into the brains of mice using optogenetics, a technology in which specific cells can be turned on or off by exposure to a certain wavelength of light. The specific memory manipulated in this study was a conditioned fear response in mice to a mild electrical foot shock.
Researchers in Tonegawa’s lab began by engineering mouse hippocampal cells to express channelrhodopsin, a protein that activates specific neurons when stimulated by light. Channelrhodopsin was also modified to be produced whenever c-fos, a gene necessary for memory formation, was turned on. On day one, the engineered mice explored Room A without any exposure to foot shock; the mice behaved normally. As the mice explored this room, their memory cells were labeled with channelrhodopsin. On day two, the same mice were placed into Room B, a distinctly different room, where they received a foot shock and exhibited a fear response. While the shocks were delivered, the channelrhodopsin-labeled Room A memory cells were activated with light, causing the fear response to be encoded not only to Room B but to Room A as well. To test whether the fear memory had indeed been tied to Room A, the mice were brought back to Room A on day three.
It’s just about that time of year again – in just over a week’s time we’ll be sitting down to a huge feast of turkey, stuffing, and mashed potatoes; we’ll be watching the Macy’s Parade, soon to be followed by two football games; and we’ll be giving thanks for our reunion with our grandparents, uncles, aunts, cousins, brothers, sisters, parents, and more. Thanksgiving definitely holds a special place in my heart – however, up until recently, it always provided just a little bit of stress. That is because, at least in my family, somewhere between polishing off the last roll and preparing for pecan pie, one relative or another always asks me, “so what are you studying in school again?” And when I answer “Neuroscience!” I typically get one of two responses: the confused look, followed by “Neuroscience? What is neuroscience?” (typically from the older crowd in the room), or the rolling of the eyes, followed by “What are you going to do with a degree in neuroscience?” (typically from the former engineers and business majors). I love neuroscience, and I know I’ve found my passion studying it here at BU, but those questions always seem to bring with them a certain pressure that I always felt I cracked beneath. However, I recently discovered the perfect way to address both of these questions, and I’m here to let you in on the secret so you can impress your relatives at the Thanksgiving dinner table as well. This year, when Grandma or Uncle Tony asks me “why neuroscience?” my answer will be simple – because neuroscience is changing, and will continue to change, the world and how we approach it.
I can already imagine the taken-aback look crossing my relatives’ faces, and the comment that I’m perhaps being a little dramatic – neuroscience is changing the world? Not only will my answer definitely get their attention, but I’m confident that my answer is correct, and proving my point to my disbelieving family will only make Thanksgiving that much more fun. Neuroscience is the science of understanding the nervous system (that is, the system that essentially allows for all of our functioning) on a basic scientific level, and then applying that knowledge to do a great many things, from eradicating the diseases that plague the system (Alzheimer’s, Parkinson’s) to applying the knowledge in the classroom so that students of all ages can learn to their full potential. If you take a step back and view the whole picture, it’s not surprising that neuroscience will change the world in our lifetime; as opposed to some other fields, neuroscience is constantly acquiring completely new information about systems that not too long ago were a complete mystery – this knowledge is overflowing and already being applied to the real world to make beneficial changes. I will quickly outline two fascinating new avenues of neuroscience that are changing the world right before our very eyes, so that you have solid proof to further widen the eyes of your relatives this holiday season.
Most of us have heard about them, but only a few appreciate their power. It was more than 20 years ago, at the University of Parma, Italy, that scientists caught the first glimpse of the fascinating mirror neurons. The study’s focus was actually to examine motor neurons involved in hand and mouth actions in macaque monkeys. The basic procedure of the experiment involved monkeys reaching for food while researchers recorded firing in particular neurons. What these researchers found was that some neurons actually fired even when the monkey was not moving but was just watching someone else perform an action. So, one may easily deduce that mirror neurons are neurons that fire both when an animal performs an action and when an animal observes someone else perform that action. Nearly 20 years after the initial macaque monkey experiment, mirror neuron studies are still generating fascinating results. The reason for this fascination is that mirror neurons underlie extremely important functions such as socialization, empathy, and teaching.
More recent discoveries have shown that mirror neurons are critical in the interpretation of both facial expressions and body language. Moreover, mirror neurons enable us to understand, empathize, and socialize with others. As studies have shown, autistic individuals have trouble understanding other people’s intentions and feelings while observing their actions. This is believed to be due, at least in part, to a malfunctioning mirror neuron system, which impairs these individuals’ ability to comprehend someone else’s actions based on observation. In contrast, people with a well-functioning mirror neuron system have no problem understanding other people’s intentions – which is what makes mirror neurons so important.
In my vision modeling class this week, we were learning about the structure of the (primate) visual cortex, and one of my classmates posed an interesting question: how is it that birds sustain such amazing visual acuity when they don’t seem to have the cortical volume to process that detailed information? In other words, how does a bird brain deal with a bird’s eye view? I was curious – and I still am, because so far I have not found a lot of research on the topic. Indeed, I imagine it’s difficult to come up with a definitive way to determine what a bird is experiencing for the sake of a laboratory experiment. Although, if I had to hazard a guess, perhaps much of a bird’s reaction to what it sees relies on more primitive structures – maybe birds rely more on instinct than interpretation? While this seems to remain mysterious, scientists do know some neat stuff about how birds’ eyes function in ways that allow them to see what we can’t. Check it out!
Sometimes, writing is tough. The passion isn’t there, and every word is a struggle. We’ve all had those moments when forced to do something artistic or creative, whether it be writing or drawing or playing an instrument (or anything, really). We’re just not into it; we don’t feel the pulse of the art pounding in our blood. Yet at other times, it’s like our blood rushes in a massive torrential pour, as if it had been held back by a dam for a thousand years. Whether it’s a subject that makes you jump for joy, a song you can head-bang to, or some other Picasso, some things just burst forth in a sudden and fervent explosion of productivity and creativity.
I think we’ve all had those moments when the pieces all click together, and a piece of work flows from us as easily as a hot knife through butter. During those moments, we feel alive, throbbing with a vibrant energy as our whole being is focused on a single task. It’s an exhilarating feeling; yet when you finally come down out of this strange natural high, it feels as though there was something slightly wrong about it, as if those who are capable of reaching that level often must have something wrong with them.
This is in reference to a 2011 lecture entitled “Plato’s Philosophy of Art”, given by Dr. James Grant of Birkbeck, University of London. An audio recording of the lecture can be found at the bottom.
Today, Plato is probably best known for his work Republic, an outline of a highly idealistic and just city-state. Many remember bits and pieces from their Intro to Philosophy classes, but a criticism that is generally brushed over in discussions of the Republic is Plato’s flat-out renunciation of art. A prerequisite to understanding Plato’s position is realizing the role that art, and specifically poetry, played in Greek culture.
Poetry in the time of Plato played a role similar to that of the Bible in early American culture. Sections were recited at schools and in homes, and children were expected to memorize various passages for later recitation. Much like the Bible, these poems formed early moral backbones in young Greeks and were very much responsible for the development of certain cultural norms. It wasn’t so much a problem for Plato that art had such a grip on the cultural norms and moral fibers of a society, but rather that the artists themselves had no understanding of what they were representing, and thus inspired corrupt and destructive morals. In the eyes of Plato, the artist or poet was typically not the ideal moral character in any society, and thus should not have been in charge of dictating moral grounds or developing cultural norms. A second complaint Plato had about the role of the artist was that even if he were generally a moral and civilized human being, he was falsely representing reality through his art – something to which Plato was very much opposed, and which undermined a central theory in Platonism.