If you have ever noticed that men tend to forget things more often than women, you are not alone. A research team led by Professor Jostein Holmen in Norway used data from HUNT3, a large longitudinal population health study, to conclude that men are more forgetful than women regardless of their age. With answers from over 48,000 people, it is one of the largest health studies ever performed.
At the beginning of the study, participants were asked whether they had problems remembering things in general, whether they had trouble remembering dates and names, whether they could remember what they did one year ago, and whether they could recall details of specific conversations.
We know from everyday life that, at some point, we need to sleep; in fact, extended sleep deprivation can lead to death. Yet despite the amount of sleep research that has been conducted, none of it has clearly established the essential function of sleep. Recently, however, a promising study by Dr. Nedergaard showed that sleep serves to clear neurotoxic waste from the brains of mice. In effect, without sleep, these toxins would build up and cause problems for the body.
Specifically, the study looked at what is known as the glymphatic system. Because the central nervous system lacks the lymphatic vessels found in the rest of the body, the glymphatic clearance pathway is the primary way the brain can “clear” the cerebrospinal fluid (CSF) and interstitial fluid (ISF) of the brain parenchyma. This clearance removes wastes and soluble proteins and even helps control fluid volume. Interestingly, the Nedergaard study showed that this clearance system works faster when mice are asleep: the exchange rate between CSF and ISF increased during sleep. In addition, the researchers showed that surrounding brain cells shrink in size during sleep, allowing more efficient clearance.
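To get an intuition for why a faster exchange rate matters, clearance can be pictured as simple first-order kinetics. The sketch below is purely illustrative: the rate constants are hypothetical placeholders, not values from the Nedergaard study; only the qualitative point, that a faster rate during sleep leaves less waste behind, reflects the finding.

```python
import math

def remaining_solute(c0, k, t):
    """Concentration after time t under first-order clearance dC/dt = -k*C."""
    return c0 * math.exp(-k * t)

# Hypothetical rate constants (per hour); sleep modeled as ~2x faster
# exchange, loosely echoing the reported increase during sleep.
K_AWAKE = 0.05
K_ASLEEP = 0.10

c0 = 1.0  # normalized starting concentration of a waste solute
print(f"after 8 h awake:  {remaining_solute(c0, K_AWAKE, 8):.2f} remaining")
print(f"after 8 h asleep: {remaining_solute(c0, K_ASLEEP, 8):.2f} remaining")
```

Under these made-up constants, roughly two thirds of the solute remains after eight waking hours versus under half after eight sleeping hours; the exponential form means even a modest rate increase compounds into substantially cleaner tissue.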
A groundbreaking new study by Susumu Tonegawa’s team at MIT has reopened the debate over the ethics of neuroscience. In Tonegawa’s experiment, neuroscientists implanted memories into the brains of mice using optogenetics, a technique in which specific cells can be turned on or off by exposure to a certain wavelength of light. The specific memory manipulated in this study was a conditioned fear response in mice to a mild electrical foot shock.
Researchers in Tonegawa’s lab began by engineering mouse hippocampal cells to express channelrhodopsin, a protein that activates specific neurons when stimulated by light. Channelrhodopsin was also engineered to be produced whenever c-fos, a gene necessary for memory formation, was turned on. On day one, the engineered mice explored Room A without any foot shock and behaved normally; as the mice explored this room, their memory cells were labeled with channelrhodopsin. On day two, the same mice were placed in Room B, a distinctly different room, where they received a foot shock and exhibited a fear response. While the mice received the shocks, the labeled cells were activated with light, causing the fear response to be encoded not only to Room B but to Room A as well. To test whether this false association had formed, the mice were returned to Room A on day three.
Before babies can crawl or walk, they explore the world around them by looking at it. This is a natural and necessary part of infant development, and it sets the stage for future brain growth. By using eye-tracking technology, scientists were able to measure the way infants look at and respond to different social cues. This new research suggests that babies who are reluctant to look into people’s eyes may be showing early signs of autism.
The researchers at the Marcus Autism Center, Children’s Healthcare of Atlanta, and Emory University School of Medicine followed babies from birth until age 3 and discovered that infants later diagnosed with autism showed declining attention to other people’s eyes from the age of 2 months onward.
It’s just about that time of year again – in just over a week’s time we’ll be sitting down to a huge feast consisting of turkey, stuffing, and mashed potatoes; we’ll be watching the Macy’s Parade, soon to be followed by two football games; and we’ll be giving thanks for our reunion with our grandparents, uncles, aunts, cousins, brothers, sisters, parents, and more. Thanksgiving definitely holds a special place in my heart – however, up until recently, it always brought just a little bit of stress. That is because, at least in my family, somewhere between polishing off the last roll and preparing for pecan pie, one relative or another always asks me, “So what are you studying in school again?” And when I answer “Neuroscience!” I typically get one of two responses: the confused look, followed by “Neuroscience? What is neuroscience?” (typically from the older crowd in the room), or the rolling of the eyes, followed by “What are you going to do with a degree in neuroscience?” (typically from the former engineers and business majors). I love neuroscience, and I know I’ve found my passion studying it here at BU, but those questions always seem to bring with them a certain pressure I always felt I cracked under. However, I recently discovered the perfect way to address both of these questions, and I’m here to let you in on the secret so you can impress your relatives at the Thanksgiving dinner table as well. This year, when Grandma or Uncle Tony asks me “why neuroscience?” my answer will be simple – because neuroscience is changing, and will continue to change, the world and how we approach it.
I can already imagine the taken-aback looks crossing my relatives’ faces, and the comment that I’m perhaps being a little dramatic – neuroscience is changing the world? Not only will my answer definitely get their attention, but I’m confident that my answer is correct, and proving my point to my disbelieving family will only make Thanksgiving that much more fun. Neuroscience is the science of understanding the nervous system (the system that essentially allows for all of our functioning) on a basic scientific level, and then applying that knowledge to do a whole range of things, from eradicating the diseases that plague the system (Alzheimer’s, Parkinson’s) to applying the knowledge in the classroom so that students of all ages can learn to their full potential. If you take a step back and view the whole picture, it’s not surprising that neuroscience will change the world in our lifetime; unlike some other fields, neuroscience is constantly acquiring completely new information about systems that not too long ago were a complete mystery – this knowledge is overflowing and already being applied to the real world to make beneficial changes. I will quickly outline two fascinating new avenues of neuroscience that are changing the world right before our very eyes, so that you have solid proof to further widen the eyes of your relatives this holiday season.
As we approach the loved holiday season, we also approach the dreaded weight gain that comes along with it. It probably won’t come as a surprise to you that our brain, specifically the hippocampus, plays a role in resisting immediate or delayed temptation.
The hippocampus deals with memory, including recalling past events and imagining future ones. A study called “A Critical Role for the Hippocampus in the Valuation of Imagined Outcomes” examines healthy people as well as people with Alzheimer’s disease, which impairs memory and is associated with atrophy of the hippocampus. The study looked at “time-dependent” choices having to do with money in addition to “episodic” choices having to do with food, sports, and cultural events.
Imagine for a second feeling “overcomplete.” You have all of your limbs, and they are perfectly healthy. Yet you feel as though your leg doesn’t belong to you. It shouldn’t be there, and you know it needs to go. The only way you can feel “whole” again is through its removal. You might be wondering why anyone would want to get rid of a perfectly healthy limb. This phenomenon is the result of apotemnophilia, or Body Integrity Identity Disorder (BIID), a condition characterized “by the intense and long-standing desire for the amputation of a specific limb.” Most sufferers know exactly where they want the line of amputation to be, and this location stays fairly constant over time. Since it is rare for a surgeon to agree to amputate a healthy limb, many people with this condition will unfortunately resort to attempting the amputation themselves. In the past, this rare disorder was thought to be purely psychological, the yearning for amputation perhaps simply a way of seeking attention. However, in recent years, studies have shown otherwise.
One study, performed by David Brang, Paul D. McGeoch, and Vilayanur S. Ramachandran, suggests that apotemnophilia is indeed a neurological disorder. The otherwise healthy subjects of the study went through a series of blind skin conductance response tests in which they were pinpricked above and below the desired line of amputation. The responses to pinpricks below the line were much greater than those above it, suggesting abnormalities in the somatosensory processing of the body part in question.
The researchers suspect the disorder is a consequence of damage to the right parietal lobe, which is responsible for processing sensory input from certain areas and forming our sense of body image. Dysfunction within the right parietal lobe, specifically the right superior parietal lobe, could cause irregular sympathetic output. The resulting discrepancy between sensory signals and body image can lead to the feeling that a certain limb doesn’t belong and should be removed in order to feel “complete.” In addition, most patients have dealt with these feelings since childhood, suggesting that the dysfunction is congenital.
Google, IBM, Microsoft, Baidu, NEC, and others use deep learning and neural networks in their most recent speech recognition and image analysis systems. Neural networks have countless other uses, so naturally there are plenty of startups trying to apply them in new ways. The problem now is how to implement neural network models in a way that mimics the circuitry of the brain. Brains are highly parallel and extremely efficient; the ultimate achievement for a neural network model would be to perform large-scale parallel operations while remaining energy efficient. Current technology is not well suited to large-scale parallel processing, and it is nowhere near as efficient as our brain, which uses only about 20 watts of power on average (a typical supercomputer uses on the order of several megawatts of electricity).
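For readers who haven’t met a neural network in code, here is a minimal sketch of one fully connected layer in plain Python. The weights and inputs are arbitrary illustrative values, not a trained model from any of the companies above; the point is simply that each “neuron” computes an independent weighted sum, which is exactly the kind of operation a brain-like machine would do massively in parallel.

```python
import math

def sigmoid(x):
    """Standard squashing nonlinearity mapping any real number into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights, biases):
    """One fully connected layer: each neuron sums its weighted inputs,
    adds a bias, and applies a nonlinearity. Each neuron's sum is
    independent of the others, so in principle all could run in parallel."""
    return [sigmoid(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

# Two inputs -> three hidden neurons -> one output (illustrative weights)
hidden = layer([0.5, -1.2],
               weights=[[0.8, -0.5], [0.3, 0.9], [-0.7, 0.2]],
               biases=[0.1, -0.2, 0.05])
output = layer(hidden, weights=[[1.0, -1.0, 0.5]], biases=[0.0])
print(output)
```

On a conventional chip this loop runs one neuron at a time; the “bionic” hardware discussed next aims to compute all of those sums simultaneously, the way biological neurons fire.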
One way in which future computers could mimic the efficiency of the brain is being developed by IBM, which believes that energy efficiency, not raw processing power, will guide the next generation of computers. Silicon chips have been doubling in power under Moore’s Law for almost half a century, but they are now reaching a physical limit. To break through this limit, researchers at IBM’s Zurich lab, headed by Dr. Patrick Ruch and Dr. Bruno Michel, want to mimic biology’s allometric scaling in new “bionic” computing. Allometric scaling describes how an animal’s metabolic power grows with its body size; IBM’s approach starts with a 3D computing architecture, with processors stacked and memory layered in between. To keep everything running without overheating, this biologically inspired computer would be powered, and cooled, by so-called electronic blood. The hope is that this fluid can multitask: just as blood delivers sugar while carrying away heat, it would accomplish liquid fueling and cooling at the same time.
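Allometric scaling can be made concrete with Kleiber’s law, the classic observation that basal metabolic rate grows roughly as body mass to the 3/4 power. The constant below (about 3.4 W for a 1 kg animal) is a commonly quoted rough figure used here only for illustration, and the masses are approximate.

```python
def metabolic_rate_watts(mass_kg, k=3.4, exponent=0.75):
    """Kleiber's law: basal metabolic rate ~ k * mass^(3/4).
    k is an illustrative constant, not a precise physiological value."""
    return k * mass_kg ** exponent

# Larger animals use more total power but far less power per kilogram,
# which is the efficiency trend IBM hopes scaled-up hardware can imitate.
for mass in (0.02, 70, 4000):  # mouse, human, elephant (approx. kg)
    rate = metabolic_rate_watts(mass)
    print(f"{mass:>7} kg -> {rate:8.1f} W total, {rate / mass:6.2f} W/kg")
```

Run this and a 70 kg human comes out near the familiar ~20 W brain plus ~60 W body budget, while the per-kilogram cost falls steeply from mouse to elephant: getting bigger makes biology more efficient per unit, the opposite of what happens when you rack up more conventional processors.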
Huntington’s disease, a neurodegenerative disorder, affects roughly 5-10 out of every 100,000 people. The disease deteriorates many structures in the brain, beginning with the caudate nucleus, which is involved in motor control. By the end of their lives, patients are expected to have lost about 30% of their brain mass.
In the last few decades much has been learned about the disease: we now know that it is genetic, caused by a mutation on the fourth chromosome that causes the huntingtin protein to grow too long and fold in on itself incorrectly. A parent with Huntington’s disease has a 50% chance of passing it on to each child. Symptoms usually appear between 35 and 45 years of age, and life expectancy after the first appearance of symptoms is 10-20 years. There is currently no treatment for Huntington’s itself, so patients can only receive limited relief from their symptoms as the disease progresses.
So far, there is no real consensus among scientists about how exactly the disease works, but it’s generally agreed that it all starts with the mutated huntingtin protein. What if that protein was somehow changed or blocked? This idea prompted some experimentation by Holly Kordasiewicz and her colleagues.
Have you ever sat down at 10 pm to write a paper due the next morning? It is like setting out to run a marathon. You have all your necessary snacks lined up on the desk, the perfect playlist is on, breaks are not an option, and it will probably take you around five hours (more or less, depending on how much research and dedication you brought to this moment). Most people are more likely to encounter this type of marathon than an actual marathon; however, according to recent research in the field of psychophysiology, marathons require just as much mental effort as the aforementioned paper.
Samuele Marcora, a professor and researcher at the University of Kent, has recently theorized that the perceived mental effort during physical activity is just as important as how strenuous the activity actually is. Previously, the common view relied on the afferent feedback model, the idea “that perception of effort results from the complex integration of sensory signals from the active muscles and other organs,” which leaves little room for what the person actually perceives to be difficult. In the general field of exercise physiology, it has been assumed that once a subject exerts a certain amount of energy, exhaustion is reached because there are no more stores of energy to draw upon. Marcora’s studies are beginning to show that this assumption is not the case. Marathon runners are still able to function after a marathon, before refueling, even though they could not push any harder during the race; in other words, there may be enough glycogen (the body’s stored fuel) available, yet the runner is unable to keep pushing harder. According to these new studies, the effort a person can sustain during strenuous exercise, provided enough glycogen is present, depends on the person’s “brain strength” to push past the mental boundaries caused by negative thoughts, boredom, and a buildup of neurotransmitters, something Marcora believes runners can train to make easier.