Imagine you’re on a first date and you and your partner are hitting it off big time. Maybe it’s your date’s witty comments, sharp sense of humor, intelligence, or impeccably beautiful smile that makes you feel so attracted. As time goes on, you look deeply into each other’s eyes and giggle. You wonder, “am I falling in love?” The answer: probably not (you’re only on your first date here, come on). You may not be falling in love, but you are feeling a strong, close bond forming; and you’re feeling this way with some help from the hormones norepinephrine, dopamine, and oxytocin.
That’s right, kids: everything your parents told you about your crazy hormones during “The Talk” is true. Your hormones really are going crazy, and they really are helping you feel the way you do. When you’re in love, areas of the brain known for their dopamine and norepinephrine production light up.
Philosophers since the time of Plato have considered the extent to which we can truly perceive the physical world, or the so-called ‘mind-independent’ universe. Modern science has given us further insight into this question through experiments designed to understand how our brain receives and manipulates sensory information. While it has been known for some time that human perception is subject to various priming effects and spatiotemporal biases, psychologists at the University of California, Berkeley have discovered that visual perception is also influenced by something called the ‘continuity field.’
To put it simply, the continuity field is what allows us to perceive our surrounding environment as continuous. In a recent article in Nature Neuroscience, David Whitney and his colleagues showed that our perception of an object’s orientation in the visual field is strongly biased toward the orientation that object had 10 seconds earlier. This means that our brain ‘smooths out’ small changes in the physical world so that we perceive a continuous image. Without the influence of this continuity field, we would be hypersensitive to the smallest changes in our visual field, and would presumably have trouble determining which changes in our surroundings are most relevant to our immediate needs.
For years, doctors have been able to detect the early signs of Alzheimer’s disease through brain scans, lumbar punctures, and genetic testing. While these methods can be painful or expensive, a newly developed blood test may be able to easily and accurately predict the onset of Alzheimer’s disease.
Dr. Howard J. Federoff of Georgetown University Medical Center led a study in which blood samples were taken from hundreds of healthy men and women over the age of 70. Over the next five years, some of these individuals developed mild cognitive impairment or Alzheimer’s disease. Federoff then compared their blood samples with those of the participants who remained healthy. He found a group of ten lipids, or fats, that were present at lower levels in the blood of the participants who went on to develop Alzheimer’s disease.
As we approach the beloved holiday season, we also approach the dreaded weight gain that comes along with it. It probably won’t come as a surprise that our brain, specifically the hippocampus, plays a role in weighing immediate temptations against delayed rewards.
The hippocampus deals with memory, including recalling past events and imagining future ones. A study titled “A Critical Role for the Hippocampus in the Valuation of Imagined Outcomes” examined healthy people as well as people with Alzheimer’s disease, which impairs memory and is associated with atrophy of the hippocampus. The study looked at “time-dependent” choices involving money in addition to “episodic” choices involving food, sports, and cultural events.
Intelligence is classically thought of as an immutable characteristic of each individual, predetermined by genetics and fixed for a person’s entire life. But what if this is not true? It is appealing to think that one can voluntarily, and naturally, boost his or her level of cognitive performance. Research has already shown that the brain is more plastic than originally thought: parts of the hippocampus, a subcortical brain structure implicated in memory and other cognitive control functions, as well as the olfactory bulb (the smell center), have been shown to generate new neurons well after initial neurogenesis in the mammalian brain. These findings play into the idea that there is a way to become smarter, even if you aren’t born with such cognitive gifts.
There are now apps and other computer programs that claim to improve brain function with regular use. One of the bigger names in this field, Lumosity, advertises that its brain-training program is backed by research and is “based on neuroscience.” The purpose here is not to discredit apps like Lumosity and Brain Fit, but to take a closer look at the legitimacy of these claims of cognitive improvement, examine some actual research being done, and encourage a critical approach to a topic that popular culture REALLY wants to be true.
The science community received big news out of California last week as Karl Deisseroth and his team of researchers in the Department of Bioengineering at Stanford University published their paper on the newly developed CLARITY brain imaging technique in Nature. The most astounding aspect of the technique is that it creates a “see-through” brain that can be anatomically analyzed in a number of ways. This method truly is a game-changer: it revolutionizes how neuroscientists are able to view brain tissue and allows for a clearer view of the big picture. In this case, the big picture is an intact, whole brain.
The technique operates on the idea that lipids in the bilayer of a cell’s plasma membrane block visible light; this is why the brain is normally not transparent. Removing these lipids while keeping the other parts of the cell and its environment intact renders the brain “see-through” and allows for much easier imaging of large pieces of brain tissue, if not the whole brain at once. To do this, the brain is infused with acrylamide, which binds proteins, nucleic acids, and other molecules; the tissue is then heated to form a mesh that holds it together. The brain is next treated with SDS detergent to remove the light-blocking lipids, resulting in a stable brain-hydrogel hybrid. From here the transparent tissue can be fluorescently labeled for specific cell types and analyzed. Through the whole process there is less than 10% protein loss in the brain tissue, compared to around 41% for other current methods. This is an amazing improvement!
“Man had always assumed that he was more intelligent than dolphins because he had achieved so much — the wheel, New York, wars and so on — whilst all the dolphins had ever done was muck about in the water having a good time. But conversely, the dolphins had always believed that they were far more intelligent than man — for precisely the same reasons….In fact there was only one species on the planet more intelligent than dolphins, and they spent a lot of their time in behavioural research laboratories running round inside wheels and conducting frighteningly elegant and subtle experiments on man. The fact that once again man completely misinterpreted this relationship was entirely according to these creatures’ plans.” – Douglas Adams, The Hitchhiker’s Guide to the Galaxy
As tempting as it may be to believe the science fiction version of the intelligence rankings, real-life science has spoken and suggests (much to my displeasure) that humans may actually be the highest on the intelligence scale.
Behold – our close evolutionary relative, the gorilla, and ourselves, the human:
There are many characteristics that separate us from our primate relatives. Most notably, the factors that mark this evolution are the use of fire, the use of tools, and a larger brain. A recent study suggests that it is actually the onset of cooking with fire that explains our ability to begin growing a larger brain. According to a timeline of human history, the earliest Homo sapiens appeared shortly after our ancestors began using fire to cook food:
We live in an era where the rapid advances in technology are constantly changing how we perceive and interact with the world around us. The question on everyone’s mind is always “what’s next?” The answer: brain-machine interfaces. For the average consumer, brain-computer interfaces are becoming increasingly available on the mass market and their current uses offer a wide range of fascinating opportunities.
A company that’s been in the news a lot lately is NeuroVigil. Its product, known as the iBrain, has been used to help world-renowned physicist Stephen Hawking communicate with a computer simply by thinking. Hawking, who suffers from Lou Gehrig’s disease, had previously developed his own solution that allows him to speak by twitching his cheek to select words from a computer. In its current state, the iBrain is still slower than Hawking’s solution, but NeuroVigil’s founder, Philip Low, hopes that it will eventually be possible to translate thoughts directly into speech. NeuroVigil also made the news by signing a contract with Roche, a major Swiss pharmaceutical company, to use the iBrain in clinical studies evaluating drugs for neurological diseases.
There are numerous brain imaging techniques that allow us to gain insight into what damage the brain may have incurred after a patient suffers a traumatic injury. The ever-popular fMRI measures blood flow to infer neural activity. Diffusion tensor imaging (DTI) tracks the diffusion of water molecules to image white matter in the brain, while positron emission tomography (PET) uses radioactive tracers to look for specific chemicals in the brain. All of these are important for disease diagnosis; however, there is skepticism about how dependent we should be on this technology, as the results should never be taken as absolute truth.
Now, a new type of brain imaging developed by researchers at the University of Pittsburgh, called High Definition Fiber Tracking (HDFT), allows researchers to look for connections that have been broken as a result of traumatic brain injury, much as an X-ray allows doctors to look for broken bones. Although the technology does not resolve individual cells, it accurately identifies specific connections that have been lost as a result of injury. These lost connections serve as a reliable proxy for cellular-level damage, such as the percentage of axons that have been lost.
The accompanying publication in the Journal of Neurosurgery focuses on a case study of a man who sustained severe brain damage after crashing an all-terrain vehicle (public service announcement: this is why we wear helmets!!!). Initial MRI scans showed hemorrhaging in the right basal ganglia, which was confirmed by a later DTI scan. The patient had extreme difficulty moving the left side of his body, which was assumed to be a result of the damage to the basal ganglia. It was not until the patient had an HDFT scan that doctors could pinpoint the true problem: fiber tracts innervating the motor cortex had been lost.