Google, IBM, Microsoft, Baidu, NEC, and others use deep learning and neural networks in their most recent speech recognition and image analysis systems. Neural networks have countless other uses, so naturally there are many startups trying to apply them in new ways. The problem now is how to implement neural network models in a way that mimics the circuitry of the brain. Brains are highly parallel and extremely efficient; the ultimate achievement of a neural network model would be to perform large-scale parallel operations while remaining energy efficient. Current technology is not well suited for large-scale parallel processing, and it is nowhere near as efficient as our brain, which uses only about 20 watts of power on average (a typical supercomputer uses on the order of several megawatts of electricity).
One way in which future computers could mimic the efficiency of the brain is being developed by IBM. IBM believes that energy efficiency, not raw processing power, will guide the next generation of computers. Silicon chips have doubled in power per Moore’s Law for almost half a century, but are now reaching a physical limit. To break through this limit, researchers at IBM’s Zurich lab, headed by Dr. Patrick Ruch and Dr. Bruno Michel, want to mimic biology’s allometric scaling for new “bionic” computing. Allometric scaling describes how an animal’s metabolic rate scales with its body size, with larger animals generally being more energy efficient per unit of mass. IBM’s approach starts with a 3D computing architecture: processors stacked with memory in between. To keep everything running without overheating, this biologically inspired computer would be powered, and cooled, by so-called electronic blood. The hope is that this fluid will multitask; just as blood supplies sugar while carrying away heat, it would handle liquid fueling and cooling at the same time.
Huntington’s disease, a neurodegenerative disorder, affects roughly 5-10 out of every 100,000 people. The disease deteriorates many structures in the brain, beginning with the caudate nucleus, which is involved in motor control. By the end of their lives, patients are expected to lose about 30% of their brain mass.
In the last few decades much has been learned about the disease: we now know that it is genetic, caused by a mutation on chromosome 4 that causes the huntingtin protein to grow too long and fold incorrectly. A parent with Huntington’s disease has a 50% chance of passing it on to each child. Symptoms usually appear between 35 and 45 years of age, and life expectancy after the first appearance of symptoms is 10-20 years. There is currently no treatment for Huntington’s itself, so patients can only receive some relief from their symptoms as the disease progresses.
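The 50% figure follows from simple Mendelian arithmetic: Huntington’s is autosomal dominant, so a single mutated copy of the gene causes the disease. A minimal sketch, assuming (as is typical) an affected parent who carries one mutated and one normal copy:

```python
from fractions import Fraction

# Huntington's is autosomal dominant: one mutated allele (H) on
# chromosome 4 is enough to cause the disease. A typical affected
# parent is heterozygous (H, h); the unaffected parent is (h, h).
affected_parent = ["H", "h"]    # passes one allele at random
unaffected_parent = ["h", "h"]

# Enumerate the four equally likely allele combinations for a child.
outcomes = [(a, b) for a in affected_parent for b in unaffected_parent]
p_affected = Fraction(sum("H" in o for o in outcomes), len(outcomes))

print(p_affected)  # 1/2 -> the 50% chance cited above
```

Two of the four equally likely combinations include the mutated allele, which is where the 50% chance per child comes from.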
So far, there is no real consensus among scientists about how exactly the disease works, but it is generally agreed that it all starts with the mutated huntingtin protein. What if that protein were somehow changed or blocked? This idea prompted experimentation by Holly Kordasiewicz and her colleagues.
Have you ever sat down to write a paper at 10pm the night before it is due? It is like setting out to run a marathon. You have all your necessary snacks lined up on the desk, the perfect playlist is on, breaks are not an option, and it will probably take you around 5 hours (more or less, depending on how much research and dedication you brought to this moment). Most people are more likely to encounter this type of marathon than an actual marathon; however, according to recent research in the field of psychophysiology, marathons require just as much mental effort as the aforementioned paper.
Samuele Marcora, a professor and researcher at the University of Kent, has recently theorized that the perceived mental effort during physical activity is just as important as how strenuous the activity actually is. Previously, the common view relied on the afferent feedback model – “that perception of effort results from the complex integration of sensory signals from the active muscles and other organs” – and gave little weight to what the person actually perceives to be difficult. In exercise physiology, it has generally been assumed that once a subject exerts a certain amount of energy, exhaustion is reached because there are no more stores of energy to draw upon. Marcora’s studies are beginning to show that this is not the case. Marathon runners are still able to function after a marathon, before refueling, even though they may not have been able to push any harder during the race. This is just one example of how there may be glycogen (the body’s stored carbohydrate energy) left over, yet the runner is unable to keep pushing harder. According to these new studies, the effort a person can sustain during strenuous exercise, provided enough glycogen is present, depends on the person’s ‘brain strength’ to push past the mental boundaries caused by negative thoughts, boredom, and a build-up of neurotransmitters – something Marcora believes runners can train to overcome.
Stressed out? You may be at a higher risk for Alzheimer’s disease. You’re probably wondering how that is possible: highly intelligent people who use their brains all the time, like scientists, CEOs, and presidents, deal with stress on a day-to-day basis. The truth is that lack of higher education or mental activity is not the only major risk factor for dementia.
If keeping your brain active is a good way to prevent cognitive decline, then why did people such as Ronald Reagan and Norman Rockwell develop Alzheimer’s disease? The answer may be stress. Recent studies have shown that people who deal with high levels of stress in their career or their family life are more likely to develop dementia. Stress cannot be said to directly cause dementia, but it may act as a trigger for the degenerative process in the brain.
An Argentine research team examined 118 patients with Alzheimer’s disease and 81 healthy individuals whose age, gender, and educational level were comparable to those of the Alzheimer’s patients. Both groups were questioned about the amount of stress they had faced in the past three years. The researchers reported that 72% of the Alzheimer’s patients admitted to coping with severe emotional stress or grief, such as the death of a loved one or financial problems. This was nearly three times the rate reported by the control group.
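The study’s headline numbers can be sanity-checked with quick arithmetic. The article does not give the control group’s exact percentage, so the control figures below are only inferred from the phrase “nearly three times as many”:

```python
# Back-of-envelope check of the figures reported in the study.
alz_patients, controls = 118, 81
alz_stress_rate = 0.72                 # 72% reported severe stress or grief

alz_stress_count = round(alz_stress_rate * alz_patients)

# "Nearly three times as many" implies a control rate of roughly 72/3 = 24%.
# This is an inference, not a number stated in the article.
implied_control_rate = alz_stress_rate / 3
implied_control_count = round(implied_control_rate * controls)

print(alz_stress_count)       # ~85 of 118 Alzheimer's patients
print(implied_control_count)  # ~19 of 81 controls
```

So roughly 85 of the 118 patients reported severe stress, versus only about 19 of the 81 healthy controls under this reading of the comparison.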
A man or a woman could be of above-average intelligence, well-educated in medicine and psychology, and understand that every statistical measurement of personality and temperament can be distributed across a bell-curve in a large enough population. He or she could comprehend the determining power of genetics, the impact of cultural influence on belief systems, and how neuroplasticity molds our mental processing to respond to environmental stimuli.
A clever, sophisticated professional could understand all of this, yet still believe that somehow, all members of the opposite sex are manipulative and irrational. Not some, all 3.5 billion of them. Why?
Perhaps throughout his or her life, this person has made irrational decisions to socialize with very irrational and emotional people of the opposite sex, and through these experiences has thus formed a gender bias. Research shows we tend to mostly place people into categories of gender, race, and age. This task is so pervasive that scientists have deemed it our “primitive” categorization.
Once we’ve made up our minds about a group, or even about someone in particular (consciously or not), it is often hard to change our opinion. Once beliefs are formed, confirmation bias kicks in: we look for information that supports our views and selectively ignore everything that doesn’t. Maybe an acquaintance decided you were shy and uptight when you first met, because you were more reticent than usual after only 3 hours of sleep the night before. Now that acquaintance may not notice all the times you’re friendly and outgoing, but instead seems to pounce on all the times you’re a little quiet.
Are carbohydrates holding us back from our true potential? Exploring the possibilities of a ketogenic diet.
It is hard to avoid carbohydrates in the world we live in today. Since the industrial age, 100-200 years ago, factories have been able to produce large quantities of sugar and white flour to feed the masses. Foods high in carbohydrates (such as pasta, bread, rice, and potatoes), though, have only been available to us since the rise of agriculture, approximately 5,000-10,000 years ago. Prior to that, humans led a hunter-gatherer lifestyle in which our diets consisted primarily of animal products and low-starch vegetables – essentially whatever we could find in nature without growing it ourselves. According to Stephen D. Phinney, simply due to circumstance, it is likely that hunter-gatherer humans followed a high-fat, moderate-to-high-protein, and very-low-carbohydrate diet. This has become known as a ketogenic diet, named after ketosis, a natural metabolic state the body enters when carbohydrates are nearly eliminated from one’s diet.
What if you were able to erase all of your painful memories by simply taking a pill? While this might sound like something out of a sci-fi film, a recent study conducted by a group of researchers at MIT suggests that it may be possible in the future.
The researchers say that they’ve identified a gene known as Tet1 that appears to be important in the process of “memory extinction.” Memory extinction is the natural process of older memories being overridden by newer experiences. In this process, conditioned responses to stimuli can change: what once elicited a fearful response doesn’t always need to if the danger has ceased.
In the study, researchers compared normal mice to mice without the Tet1 gene. The researchers conditioned all of the mice to fear a particular cage where they received a mild shock. Once the memory was formed, the researchers then put the mice in the cage but did not shock them. After a while, mice with the Tet1 gene lost their fear of the cage as new memories replaced the old ones. However, mice lacking the Tet1 gene remained fearful.
It’s almost time for the dreaded fall midterms. Somehow, midterms manage to be even more stressful than finals. Maybe it’s because of the time of year in which they fall, easily the most beautiful time to be living in Boston. You just want to spend time outside walking on the esplanade, looking at the beautiful red and orange leaves on the trees that line the river and watching the rowing teams pass you by. Well, it may actually be beneficial to take some time out of your studying to take a stroll along the river, or to just sit on a bench for a little while. In fact, take some time to picnic this fall with some brain food, because studies show that it will enhance your studying experience.
Some of these brain foods are ones you’d expect: the foods your mom has been forcing down your throat, whether you like them or not, for as long as you can remember. But take a step back and think about why. Berries, for example, provide neurological benefits. They are loaded with antioxidants, which help protect your cells from the oxidative damage that builds up when you’re stressed. Berries mediate signaling pathways involved in cell survival, and they increase the brain’s neuroplasticity, neurotransmission, and calcium-buffering properties, all of which are related to aging and, in turn, to memory and behavioral changes.
If you’ve ever seen someone with a baby, chances are you’ve heard them say something along the lines of “You’re so cute; I could just eat you up!”
Well, a recent article published in Frontiers in Psychology by a research team at the Technische Universität Dresden, Germany, shows there may be a link between an infant’s smell and a woman’s response, depending on whether the woman has given birth. Scientists have studied the connection between olfactory signals and the mother-infant bond in several non-human mammal species. Until now, however, research on mother-infant bonding in humans had only explored the visual and auditory senses.
What they did:
A total of 30 women were tested. Fifteen had given birth for the first time three to six weeks prior to the experiment (primiparous); the other 15 had never given birth (nulliparous). To obtain the sample odors, 18 newborns each wore a T-shirt for two nights; the shirts were then placed in plastic bags and frozen to keep the odor unaltered. During the experiment, each woman was exposed both to “odorless” air and to the odors of two different infants; primiparous women were never exposed to their own baby’s odor. The women rated each odor on intensity, familiarity, and pleasantness, though none of the participants were aware of what the odor stimulus was. As the women processed the different odors, an fMRI machine scanned their brains.
In the last century, treatment of social and learning disabilities has drastically changed. Through the Individuals with Disabilities Education Act, every student who qualifies for special education is entitled to a free and appropriate public education, delivered through an individualized education plan. An ‘IEP’ is designed through the collaboration of parents, teachers, and special education specialists. The largest category of learning disability is the specific learning disability, of which dyslexia is a typical example.
The care put into special education has drastically changed the lives of many individuals; however, special education excludes those whose learning difficulties stem from economic circumstances. This reflects a longstanding social and educational belief that learning disabilities are innate, the result of genetic predisposition rather than upbringing. Under the prevailing paradigm, upbringing could not have a significant effect on the development of the brain.
To little surprise, neuroscience is showing otherwise.
We have long known that acute incidents can have a significant effect on brain development and function (such as the effects of repeated physical trauma), but recent research suggests that external factors during development, including many associated with poverty, can have significant, long-term effects as well. These factors include higher levels of environmental toxins, lower nutritional levels, and increased parental neglect. However, the research of Gary Evans and Michelle Schamberg of Cornell University indicates that the added stress of low socioeconomic status alone can account for these effects.