As we approach the beloved holiday season, we also approach the dreaded weight gain that comes along with it. It probably won’t come as a surprise that our brain, specifically the hippocampus, plays a role in resisting temptation, whether immediate or delayed.
The hippocampus deals with memory, including recalling past events and imagining future ones. A study called “A Critical Role for the Hippocampus in the Valuation of Imagined Outcomes” examined healthy people as well as people with Alzheimer’s disease, which impairs memory and is associated with atrophy of the hippocampus. The study looked at “time-dependent” choices having to do with money in addition to “episodic” choices having to do with food, sports, and cultural events.
Imagine for a second feeling “overcomplete.” You have all of your limbs, and they are perfectly healthy. Yet you feel as though your leg doesn’t belong to you. It shouldn’t be there, and you know it needs to go. The only way you can feel “whole” again is through its removal. You might be wondering why anyone would want to get rid of a perfectly healthy limb. This phenomenon is the result of apotemnophilia, or Body Integrity Identity Disorder (BIID), a condition characterized “by the intense and long-standing desire for the amputation of a specific limb.” Most sufferers know exactly where they want the line of amputation to be, and its location stays fairly constant over time. Since it is rare for a surgeon to agree to amputate a healthy limb, many people with this condition unfortunately resort to attempting the amputation themselves. In the past, this rare disorder was thought to be purely psychological, the yearning for amputation perhaps simply a way of seeking attention. In recent years, however, studies have shown otherwise.
One study, performed by David Brang, Paul D. McGeoch, and Vilayanur S. Ramachandran, suggests that apotemnophilia is indeed a neurological disorder. The otherwise healthy subjects of the study went through a series of blind skin conductance response tests in which they were pinpricked above and below their desired line of amputation. The responses to pinpricks below the line were much greater than those above it. These differences suggest abnormalities in the somatosensory input from the body part in question.
The researchers suspect the disorder is a consequence of damage to the right parietal lobe, which processes sensory input from certain areas of the body and forms our sense of body image. Dysfunction within the right parietal lobe, specifically the right superior parietal lobe, could cause irregular sympathetic output. The discrepancy between these signals and the body image can lead to the feeling that a certain limb doesn’t belong and should be removed in order to feel “complete.” In addition, most patients have dealt with these feelings since childhood, suggesting that the dysfunction is congenital.
Google, IBM, Microsoft, Baidu, NEC, and others use deep learning and neural networks in the development of their most recent speech recognition and image analysis systems. Neural networks have countless other uses, so naturally there are plenty of startups trying to apply them in new ways. The problem now is how to implement neural network models in a way that mimics the circuitry of the brain. Brains are highly parallel and extremely efficient; the ultimate achievement for a neural network model would be to perform large-scale parallel operations while remaining energy efficient. Current technology is not well suited to large-scale parallel processing, and it is nowhere near as efficient as our brain, which uses only about 20 watts of power on average (a typical supercomputer uses on the order of several megawatts of electricity).
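As a highly simplified illustration of why neural networks are such a naturally parallel workload, here is a minimal sketch in Python with NumPy; the layer sizes and function name are invented for this example, not taken from any of the systems mentioned above. Each output neuron depends only on the inputs and its own row of weights, so all of them can be computed at once as a single matrix operation:

```python
import numpy as np

def layer(x, weights, bias):
    """One fully connected layer: a matrix multiply plus a ReLU
    nonlinearity. Every output neuron is an independent dot product,
    which is what makes the workload so easy to parallelize."""
    return np.maximum(0.0, weights @ x + bias)

rng = np.random.default_rng(0)
x = rng.standard_normal(4)        # 4 input features
W = rng.standard_normal((3, 4))   # weights for 3 output neurons
b = rng.standard_normal(3)

h = layer(x, W, b)
print(h.shape)  # (3,): one activation per output neuron
```

Hardware that resembles the brain (or a GPU) can evaluate all of those dot products simultaneously, while a conventional CPU largely serializes them; that gap is part of the efficiency problem described above.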
One way in which future computers could mimic the efficiency of the brain is being developed at IBM, which believes that energy efficiency, not raw processing power, will guide the next generation of computers. Silicon chips have been doubling in power, following Moore’s Law, for almost half a century, but they are now reaching a physical limit. To break through this limit, researchers at IBM’s Zurich lab, headed by Dr. Patrick Ruch and Dr. Bruno Michel, want to mimic biology’s allometric scaling, in which an animal’s metabolic power scales with its body size, for a new “bionic” computing. Their approach starts with a 3D computing architecture: processors stacked with memory in between. To keep everything running without overheating, this biologically inspired computer would be both powered and cooled by so-called electronic blood. The hope is that this fluid can multitask: just as blood supplies sugar while carrying away heat, it would accomplish liquid fueling and cooling at the same time.
Huntington’s disease, a neurodegenerative disorder, affects roughly 5-10 out of every 100,000 people. The disease deteriorates many structures in the brain, beginning with the caudate nucleus, which is involved in motor control. By the end of their lives, patients are expected to lose about 30% of their brain mass.
In the last few decades much has been learned about the disease: we now know that it is genetic, caused by a mutation on the fourth chromosome that leads the huntingtin protein to grow too long and fold in on itself incorrectly. A parent with Huntington’s disease has a 50% chance of passing it on to each child. Symptoms usually appear between 35 and 45 years of age, and life expectancy after the first appearance of symptoms is 10-20 years. There is currently no treatment for Huntington’s itself, so patients can only get modest relief from their symptoms while the disease progresses.
So far there is no real consensus among scientists about exactly how the disease works, but it is generally agreed that it all starts with the mutated huntingtin protein. What if that protein were somehow changed or blocked? This idea prompted experiments by Holly Kordasiewicz and her colleagues.
Have you ever sat down to write a paper at 10pm the night before it is due? It is like setting out to run a marathon. You have all your necessary snacks lined up on the desk, the perfect playlist is on, breaks are not an option, and it will probably take you around 5 hours (more or less, depending on how much research and dedication you brought to this moment). Most people are more likely to encounter this type of marathon than an actual marathon; however, according to recent research in the field of psychophysiology, marathons require just as much mental effort as the aforementioned paper.
Samuele Marcora, a professor and researcher at the University of Kent, has recently theorized that perceived mental effort during physical activity is just as important as how strenuous the activity actually is. Previously, the common view relied on the afferent feedback model – “that perception of effort results from the complex integration of sensory signals from the active muscles and other organs” – and gave little weight to what the person actually perceives as difficult. In exercise physiology it has generally been assumed that once a subject exerts a certain amount of energy, exhaustion is reached because there are no more stores of energy to draw upon. Marcora’s studies suggest that this is not the case. Marathon runners are still able to function after a marathon, before refueling, even though they may not have been able to push any harder during the race; in other words, there may be enough glycogen (stored energy) left even when the runner cannot keep pushing. According to these new studies, the energy actually available to a person during strenuous exercise, provided there is enough glycogen present, depends on the person’s ‘brain strength’ to push past the mental boundaries created by negative thoughts, boredom, and a buildup of neurotransmitters – something Marcora believes runners can train to make easier.
Stressed out? You may be at a higher risk for Alzheimer’s disease. You’re probably wondering how that is possible: highly intelligent people who use their brains all the time, like scientists, CEOs, and presidents, deal with stress on a day-to-day basis. The truth is that lack of higher education or brain activity is not the only major cause of dementia.
If keeping your brain active is a good way to prevent cognitive decline, then why did people such as Ronald Reagan and Norman Rockwell develop Alzheimer’s disease? The answer may be stress. Recent studies have shown that people who deal with high levels of stress in their careers or family lives are more likely to develop dementia. Stress cannot be said to directly cause dementia, but it appears to act as a trigger for the degenerative process in the brain.
An Argentine research team examined 118 patients with Alzheimer’s disease and 81 healthy individuals of comparable age, gender, and educational level. Both groups were questioned about the stress they had faced in the past three years. The researchers reported that 72% of the Alzheimer’s patients had coped with severe emotional stress or grief, such as the death of a loved one or financial problems. This was nearly three times the proportion reported by the control group.
A man or a woman could be of above-average intelligence, well educated in medicine and psychology, and understand that every statistical measure of personality and temperament is distributed across a bell curve in a large enough population. He or she could comprehend the determining power of genetics, the impact of cultural influence on belief systems, and how neuroplasticity molds our mental processing in response to environmental stimuli.
A clever, sophisticated professional could understand all of this, yet still believe that somehow, all members of the opposite sex are manipulative and irrational. Not some, all 3.5 billion of them. Why?
Perhaps throughout his or her life, this person has chosen to socialize with very irrational and emotional people of the opposite sex, and through these experiences has formed a gender bias. Research shows that we tend to place people first and foremost into categories of gender, race, and age. This categorization is so pervasive that scientists have deemed it our “primitive” categorization.
Once we’ve made up our minds about a group, or even about someone in particular (consciously or not), it’s often hard to change our opinion. Once a belief is formed, confirmation bias kicks in: we look for information that supports our views and selectively ignore everything that doesn’t. Maybe someone decided you were shy and uptight when you first met, because you were more reticent than usual after only 3 hours of sleep the night before. Now that acquaintance may not notice all the times you’re friendly and outgoing, but seems to pounce on all the times you’re a little quiet.
Are carbohydrates holding us back from our true potential? Exploring the possibilities of a ketogenic diet.
It is hard to avoid carbohydrates in the world we live in today, where, since the industrial revolution 100-200 years ago, factories have been able to produce large quantities of sugar and white flour to feed the masses. In fact, foods high in carbohydrates (such as pasta, bread, rice, and potatoes) have only been available to us since the rise of agriculture, approximately 5,000-10,000 years ago. Before that, humans lived a hunter-gatherer lifestyle, with diets consisting primarily of animal products and low-starch vegetables: basically, whatever we could find in nature without growing it ourselves.
What if you were able to erase all of your painful memories by simply taking a pill? While this might sound like something out of a sci-fi film, a recent study conducted by a group of researchers at MIT suggests that it may be possible in the future.
The researchers say that they’ve identified a gene known as Tet1 that appears to be important in the process of “memory extinction,” the natural process by which older memories are overridden by newer experiences. Through this process, conditioned responses to stimuli can change: what once elicited a fearful response doesn’t have to anymore once the danger has ceased.
In the study, researchers compared normal mice to mice without the Tet1 gene. The researchers conditioned all of the mice to fear a particular cage where they received a mild shock. Once the memory was formed, the researchers then put the mice in the cage but did not shock them. After a while, mice with the Tet1 gene lost their fear of the cage as new memories replaced the old ones. However, mice lacking the Tet1 gene remained fearful.
It’s almost time for the dreaded fall midterms. Somehow, midterms manage to be even more stressful than finals. Maybe it’s because of the time of year they fall in, easily the most beautiful time to be living in Boston. You just want to spend time outside walking on the Esplanade, looking at the red and orange leaves on the trees that line the river and watching the rowing teams pass you by. Well, it may actually be beneficial to take some time out of your studying to take a stroll along the river, or just to sit on a bench for a little while. In fact, take some time to picnic this fall with some brain food, because studies show that it will enhance your studying.
Some of these brain foods are ones you’d expect: the foods your mom has been forcing down your throat, whether you like them or not, for as long as you can remember. But take a step back and think about why. Berries, for example, provide neurological benefits. They are full of antioxidants, which help protect you from the bacteria that can make you sick when you’re stressed. Berries mediate signaling pathways involved in cell survival, and they increase the neuroplasticity, neurotransmission, and calcium-buffering properties of the brain, all of which are related to aging and, in turn, to memory and behavioral changes.