We know from everyday life that, at some point, we need to sleep. In fact, extended sleep deprivation can be fatal. Yet despite the volume of sleep research conducted to date, no study has clearly established the essential function of sleep. Recently, however, a promising study by Dr. Nedergaard showed that sleep serves to clear neurotoxic waste from the brains of mice. In effect, without sleep these toxins would build up and cause problems for the body.
Specifically, the study looked at what is known as the glymphatic system. Because the central nervous system lacks the lymphatic system found in the periphery, the glymphatic clearance pathway is the primary way the brain can “clear” the cerebrospinal fluid (CSF) and interstitial fluid (ISF) of the brain parenchyma. This clearance removes wastes and soluble proteins and even regulates fluid volume. Interestingly, the Nedergaard study showed that this clearance system works faster when mice are asleep; in other words, the exchange rate between CSF and ISF increased during sleep. In addition, the researchers showed that surrounding brain cells shrink in size during sleep, allowing more efficient clearance.
A groundbreaking new study by Susumu Tonegawa’s team at MIT has reopened debate over the ethics of neuroscience. In Tonegawa’s experiment, neuroscientists implanted memories into the brains of mice using optogenetics, a technology in which specific cells can be turned on or off by exposure to a certain wavelength of light. The specific memory manipulated in this study was a conditioned fear response in mice to a mild electrical foot shock.
Researchers in Tonegawa’s lab began by engineering mouse hippocampal cells to express channelrhodopsin, a protein that activates specific neurons when stimulated by light. Channelrhodopsin was also modified to be produced whenever c-fos, a gene necessary for memory formation, was turned on. On day one, the engineered mice explored Room A without any foot shock; the mice behaved normally. As the mice explored this room, their active memory cells were labeled with channelrhodopsin. On day two, the same mice were placed into Room B, a distinctly different room, where they received a foot shock and exhibited a fear response. While the mice were being shocked, the researchers used light to activate the channelrhodopsin-labeled Room A cells, causing the fear memory to be encoded not only to Room B, but to Room A as well. To test this hypothesis, the mice were returned to Room A on day three.
Before babies can crawl or walk, they explore the world around them by looking at it. This is a natural and necessary part of infant development, and it sets the stage for future brain growth. By using eye-tracking technology, scientists were able to measure the way infants look at and respond to different social cues. This new research suggests that babies who are reluctant to look into people’s eyes may be showing early signs of autism.
Researchers at the Marcus Autism Center, Children’s Healthcare of Atlanta, and Emory University School of Medicine followed babies from birth until age 3 and discovered that infants later diagnosed with autism showed declining attention to the eyes of other people from the age of 2 months onward.
Google, IBM, Microsoft, Baidu, NEC, and others use deep learning and neural networks in their most recent speech recognition and image analysis systems. Neural networks have countless other uses, so naturally there are tons of startups trying to use them in new ways. The problem now is how exactly to implement neural network models in a way that mimics the circuitry of the brain. Brains are highly parallel and extremely efficient; the ultimate neural network model would perform large-scale parallel operations while remaining energy efficient. Current technology is not well suited for large-scale parallel processing, and it is nowhere near as efficient as our brain, which uses only about 20 watts of power on average (a typical supercomputer uses on the order of several megawatts of electricity).
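To put that efficiency gap in perspective, here is a quick back-of-the-envelope calculation. The 20-watt brain figure comes from the text above; the exact supercomputer wattage is an illustrative assumption (5 MW, one plausible value for "several megawatts"):

```python
# Back-of-the-envelope power comparison: brain vs. supercomputer.
# The 20 W brain figure is from the article; 5 MW is an assumed
# representative value for a "several megawatt" supercomputer.
BRAIN_WATTS = 20
SUPERCOMPUTER_WATTS = 5_000_000  # 5 MW (illustrative assumption)

ratio = SUPERCOMPUTER_WATTS / BRAIN_WATTS
print(f"A 5 MW supercomputer draws as much power as {ratio:,.0f} human brains")
# -> A 5 MW supercomputer draws as much power as 250,000 human brains
```

In other words, even under a conservative estimate, the brain is hundreds of thousands of times more power-efficient than today's large machines, which is the gap the brain-inspired designs below are trying to close.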
One way in which future computers could mimic the efficiency of the brain is being developed by IBM. IBM believes that energy efficiency, not raw processing power, is what will guide the next generation of computers. Silicon chips have been doubling in power per Moore’s Law for almost half a century, but they are now reaching a physical limit. To break through this limit, researchers at IBM’s Zurich lab, headed by Dr. Patrick Ruch and Dr. Bruno Michel, want to mimic biology’s allometric scaling for new “bionic” computing. Allometric scaling describes how an animal’s metabolic power grows with its body size; IBM’s approach is to start with a 3D computing architecture, with processors stacked and memory layered in between. To keep everything running without overheating, this biologically inspired computer would be powered, and cooled, by so-called electronic blood. Ideally this fluid will multitask: just as blood supplies sugar while carrying away heat, it would accomplish liquid fueling and cooling at the same time.
Huntington’s disease, a neurodegenerative disorder, affects roughly 5-10 out of every 100,000 people. The disease deteriorates many structures in the brain, beginning with the caudate nucleus, which is involved in motor control. By the end of their lives, patients can lose about 30% of their brain mass.
In the last few decades much has been learned about the disease: we now know that it is genetic, caused by a mutation on chromosome 4 that causes the huntingtin protein to grow too long and misfold. A parent with Huntington’s disease has a 50% chance of passing it on to each child. Symptoms usually appear between 35 and 45 years of age, and life expectancy after the first appearance of symptoms is 10-20 years. There is currently no treatment for Huntington’s itself, so patients can only receive modest relief from their symptoms while the disease progresses.
So far, there is no real consensus among scientists about how exactly the disease works, but it’s generally agreed that it all starts with the mutated huntingtin protein. What if that protein was somehow changed or blocked? This idea prompted some experimentation by Holly Kordasiewicz and her colleagues.
For those interested in scientific research, the Nobel Prize is an esteemed award recognizing one’s work and dedication. However, many student researchers hold misconceptions about the kind of work that earns such high praise. More often than not, laureates did not suddenly discover something new; rather, their findings are the fruit of years of research and passion for science. As a great model of dedication, recent Nobel laureate Thomas Südhof provided incredible insight into our understanding of communication in the human brain. Yet his work was in basic science, on a topic we in fact learn about in our introductory neuroscience courses. In this brief article, I will outline the most recent findings from Dr. Südhof’s lab in hopes of showing aspiring researchers that continued diligence and a passion for learning are what matter most.
Neurons communicate with each other through a signaling process mediated by action potentials. Changes in concentration and electrical gradients cause an action potential to fire down the neuron until it reaches the synaptic bouton. It is here that one neuron forms a synapse with another; the synapse is the site at which communication occurs. But, for the most part, two neurons are not physically connected, so how does communication happen? In a process called neurotransmission, when the action potential reaches the end of the neuron, an influx of Ca2+ ions causes vesicles, little packets of neurotransmitter chemicals, to release their contents out of the neuron as a signal to the next. Herein lies the question that Dr. Südhof sought to illuminate: mechanistically, how do these vesicles actually release the chemicals?
Stressed out? You may be at a higher risk for Alzheimer’s disease. You’re probably wondering how that is possible: highly intelligent people who use their brains all of the time, like scientists, CEOs, and presidents, deal with stress on a day-to-day basis. The truth is that lack of higher education or mental activity is not the only major contributor to dementia.
If keeping your brain active is a good way to prevent cognitive decline, then why did people such as Ronald Reagan and Norman Rockwell develop Alzheimer’s disease? The answer is stress. Recent studies have shown that people who deal with high levels of stress in their careers or family lives are more likely to develop dementia. Stress cannot be said to directly cause dementia, but it can act as a trigger for the degenerative process in the brain.
An Argentine research team examined 118 patients with Alzheimer’s disease and 81 healthy individuals whose age, gender, and educational level were comparable to those of the Alzheimer’s patients. Both groups were questioned about the amount of stress they had faced in the past three years. The researchers reported that 72% of the Alzheimer’s patients admitted to coping with severe emotional stress or grief, such as the death of a loved one or financial problems; this was nearly three times the proportion reported by the control group.
A man or a woman could be of above-average intelligence, well-educated in medicine and psychology, and understand that every statistical measurement of personality and temperament can be distributed across a bell-curve in a large enough population. He or she could comprehend the determining power of genetics, the impact of cultural influence on belief systems, and how neuroplasticity molds our mental processing to respond to environmental stimuli.
A clever, sophisticated professional could understand all of this, yet still believe that somehow all members of the opposite sex are manipulative and irrational. Not some of them; all 3.5 billion of them. Why?
Perhaps throughout his or her life, this person has happened to socialize with particularly irrational and emotional people of the opposite sex, and through these experiences has formed a gender bias. Research shows that we tend to place people primarily into categories of gender, race, and age. This sorting is so pervasive that scientists have deemed it our “primitive” categorization.
Once we’ve made up our mind about a group, or even about someone in particular (consciously or not), it’s often hard to change our opinion. Once beliefs are formed, confirmation bias kicks in: we look for information that supports our views and selectively ignore everything that doesn’t. Maybe someone decided you were shy and uptight when you first met, because you were more reticent than usual after only three hours of sleep the night before. Now that acquaintance may not notice all the times you’re friendly and outgoing, but instead seems to pounce on all the times you’re a little quiet.
What if you were able to erase all of your painful memories by simply taking a pill? While this might sound like something out of a sci-fi film, a recent study conducted by a group of researchers at MIT suggests that it may be possible in the future.
The researchers say that they’ve identified a gene known as Tet1 that appears to be important in the process of “memory extinction.” Memory extinction is the natural process of older memories being overridden by newer experiences. In this process, conditioned responses to stimuli can change: what once elicited a fearful response doesn’t always need to if the danger has ceased.
In the study, researchers compared normal mice to mice without the Tet1 gene. The researchers conditioned all of the mice to fear a particular cage where they received a mild shock. Once the memory was formed, the researchers then put the mice in the cage but did not shock them. After a while, mice with the Tet1 gene lost their fear of the cage as new memories replaced the old ones. However, mice lacking the Tet1 gene remained fearful.
If you’ve ever seen someone with a baby, chances are you’ve heard them say something along the lines of “You’re so cute; I could just eat you up!”
Well, a recent article published in Frontiers in Psychology by a research team at the Technische Universität Dresden, Germany, shows there may be a link between an infant’s smell and a woman’s response, depending on the woman’s maternal status. Scientists have studied the connection between olfactory signals and mother-infant bonding in several non-human mammal species. Until now, however, research on mother-infant bonding in humans has explored only the visual and auditory senses.
What they did:
A total of 30 women were tested. Fifteen of the women had given birth for the first time three to six weeks prior to the experiment (primiparous). The other 15 women had never given birth (nulliparous). To obtain the sample odors, 18 infants each wore a T-shirt for two nights postpartum. The shirts were then placed in plastic bags and frozen to keep the odor unaltered. During the experiment, each woman was exposed to both “odorless” air and the odors of two different infants; primiparous women were never exposed to their own baby’s odor. The women were asked to rate each odor on intensity, familiarity, and pleasantness, though none of the participants were aware of what the odor stimulus was. As the women processed the different odors, an fMRI machine scanned their brains.