Dream Bigger, Darling.
As my good friend Cobb once told me, “Dreams feel real while we’re in them. It’s only when we wake up that we realize something was actually strange.”
OK, fine, Leonardo DiCaprio’s character from Inception isn’t real, but he does make a valid point. Oneirologists, those who study dreams, have traditionally viewed dreams as uncontrollable streams of sounds and images with the power to induce a tremendous spectrum of emotion. The idea of lucid dreaming, however, has caused that conventional understanding to collapse. A “lucid dream,” a term coined by the Dutch psychiatrist Frederik van Eeden, is one in which the sleeper is aware that he or she is dreaming. This form of dissociation is wonderfully paradoxical in that it exhibits components of both waking and dreaming consciousness.
Allan Hobson, an American psychiatrist and dream researcher, specializes in the quantification of mental events and their corresponding brain activities. Although he vehemently dismisses the idea of hidden meanings in dreams, he has embarked, along with other neurobiologists and cognitive scientists, on a search to decipher the neurological basis of consciousness. Hobson hypothesizes that subjects may learn to become lucid, to self-awaken, and to exert control over the plot by inserting voluntary decisions into the otherwise involuntary flow of the dream.
The validation of this idea would imply that the mind is capable of experiencing a waking and a dreaming state at the same time. Consequently, Hobson states, “…it may be possible to measure the physiological correlates of three conscious states, waking, non-lucid dreaming, and lucid dreaming in the laboratory.” If there is a psychological distinction between the three, there should also be a physiological difference.
The advent of lucid dreaming experimentation has not only benefitted Hollywood, but it has also provided possible treatment options for those hindered by frequent nightmares or post-traumatic stress disorder (PTSD). Methodologically speaking, the study of lucid dreaming presents a formidable challenge, but it is becoming an important component of the cognitive neurosciences.
Josefin Gavie and Antti Revonsuo have built on Hobson’s theories by proposing a technique termed lucid dreaming treatment (LDT). The key to this treatment is that the subject learns both to identify cues that facilitate lucidity during a dream and to manipulate the dream environment once lucidity is attained. Lucidity may prove to be a useful device in that it offers the sleeper a way to control components of the dream – altering and diminishing any threatening situation. Although the investigation of LDT is extremely new and incontestably controversial, it has shown promising preliminary results in lowering the frequency of nightmares in the selected subjects.
The premise of the film Inception may be wildly hypothetical, but it has expertly amplified the current research on lucid dreams. However, researchers in the field should take a word of advice from the character of Eames: “You mustn’t be afraid to dream a little bigger, darling.”
The Neurobiology of Consciousness: Lucid Dreaming Wakes Up – J. Allan Hobson
The Future of Lucid Dreaming Treatment (PDF) – Josefin Gavie and Antti Revonsuo
Piano Teachers Must Be Neuroscientists
The familiar mantra “practice makes perfect” may be taken too literally. The definition of effective practice as the constant repetition of a particular exercise - a golf swing, a tennis serve, a dance step - is faulty, as it turns out.
Time has reported on a study published in Nature Neuroscience by neuroscientists at the University of Southern California and UCLA. The study compares the results of repetitive, “constant practice” with those of “variable practice.” In one experiment, scientists instructed subjects to copy a movement with their forearm as displayed by a line on a computer screen. The constant-practice group repeated a single 60-degree movement 120 times. The variable-practice group performed the same 60-degree movement only 60 times but also performed three other movements 20 times each. The two groups did equally well in practice. However, when they were retested 24 hours later, the variable-practice group outperformed the rote-repetition group on the 60-degree task.
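To make the design concrete, here is a minimal Python sketch of the two practice schedules as described above. The alternate angles (30, 45, and 75 degrees) are illustrative placeholders of my own, not values taken from the paper; only the trial counts come from the description.

```python
import random

# Hypothetical reconstruction of the two practice schedules described above.
# Alternate angles are illustrative placeholders; the study's exact values may differ.
CONSTANT_SCHEDULE = [60] * 120                          # one movement, repeated 120 times

VARIABLE_TARGETS = {60: 60, 30: 20, 45: 20, 75: 20}     # angle -> number of repetitions
variable_schedule = [angle for angle, reps in VARIABLE_TARGETS.items() for _ in range(reps)]
random.shuffle(variable_schedule)                       # interleave the four movements

# Both groups perform the same total number of trials.
assert len(CONSTANT_SCHEDULE) == len(variable_schedule) == 120

print(f"Constant practice: {len(CONSTANT_SCHEDULE)} trials, all at 60 degrees")
print(f"Variable practice: {len(variable_schedule)} trials across {len(VARIABLE_TARGETS)} angles")
```

The only difference between the groups is how the same 120 trials are distributed across targets, which is what makes the retention gap a day later so striking.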
So, variable practice works - but why? Some of the subjects from each group were treated with transcranial magnetic stimulation (TMS): a portion of each group received TMS over the prefrontal cortex, and another portion received it over the primary motor cortex. The prefrontal cortex supports executive functions like reasoning and planning, while the primary motor cortex handles simple, physical task learning. Fittingly, when the prefrontal cortices of variable-practice subjects were “messed with” by TMS, their performance declined. Performance also dropped when constant-practice subjects received TMS over their primary motor cortices. It seems that “tedium is bad for the brain”: it needs variety to learn actively, recruiting higher structures like the prefrontal cortex to better retain what has been practiced.
It would be interesting to find out whether this concept applies to other types of learning, like studying for exams or playing an instrument. Even in dog training, it is suggested that you work amid distractions and gradually lengthen the delay between clicking the “clicker” (which signals that the dog has performed the task correctly) and rewarding it with a treat. A higher level of focus seems to occur when there are more variables in the practice routine. My piano teacher must have been on to something when she gave me so much homework!
Study: Why Athletes Should Mix Sports-Training Routines - Time
Article: Practice Structure and Motor Memory - Nature Neuroscience
Left = Language in Chimpanzee Brains, Too
I would hate to marginalize any Creationists who may frequent this blog, but it is becoming difficult to ignore the evidence for Evolution piling up higher and higher. This body of evidence is contributed to by almost every field of study, from archeology to biology and, in a recent surge, the rapidly growing field of neuroscience. Unfortunately for us, accepting the idea of Evolution forces us to think of ourselves and our fellow humans as a little less awesome or unique: we have always reveled in our species’ immense capacity for complex language processing, among other things, of course. But that capacity didn’t just pop up out of nowhere a couple million years in.
To expand upon the neuroscience research exploring the evolution of the brain, this article focuses on Wernicke’s area (known for its role in processing auditory language information) in chimpanzees. The study used design-based stereologic methods to estimate regional volumes, total neuron number, and neuron density. Compared with what we know about the human brain, the results are intriguing.
What did they find? A leftward asymmetry of this language area in the chimpanzee brain. What does this mean for us? It suggests that the left lateralization of the brain’s language area (left = language, left = language, the first thing to memorize in Psych 101) originated before our cutting-edge human species, prior to the appearance of modern human language.
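To put a number on “leftward asymmetry”: studies of hemispheric lateralization often report an asymmetry quotient, AQ = (R − L) / (0.5 × (R + L)), where negative values mean the left-hemisphere region is larger. Here is a minimal sketch with made-up volumes; these are not the study’s measurements, and the exact index the authors used may differ.

```python
def asymmetry_quotient(left: float, right: float) -> float:
    """Common asymmetry quotient: negative values indicate a leftward (left > right) bias."""
    return (right - left) / (0.5 * (right + left))

# Hypothetical regional volumes in arbitrary units (not data from the study).
left_volume, right_volume = 1.15, 0.95

aq = asymmetry_quotient(left_volume, right_volume)
print(f"AQ = {aq:.2f}")  # about -0.19 here
print("leftward asymmetry" if aq < 0 else "rightward or no asymmetry")
```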
This investigation may seem generally boring (these chimpanzees aren’t Darwin’s finches or anything), but it is certainly significant in showing neuroscience’s huge potential to contribute to the case for Evolution. The authors showed that a specialization key to our unique language capabilities actually evolved prior to the emergence of modern humans, serving as a pre-adaptation to modern human language and speech. Studies like these are closing the gap between us and our ancestral species and, unfortunately, making us all feel a little less special about our leap to civilized society. Way to go, Evolution.
Original Article: http://rspb.royalsocietypublishing.org/content/277/1691/2165.full?sid=912995c2-144b-45bb-b92c-d7c0196d1fef#ref-5
Similar articles investigating monkey brains and language adaptations: http://current.com/1ri8u4c
P.S. If anyone studying at BU is really, really into Evolution, I highly recommend the Ecuador Study Abroad Program… A trip to the Galapagos Islands (on a private yacht, no less) and to the Charles Darwin Research Station is the chance of a lifetime.
Who is John Galt? Obviously, Not a Neuroscientist.
Capitalism gained a solid foundation in the 19th and 20th centuries thanks to the development of several philosophies of human nature, all of which held the rational, self-sufficient individual to be the most important element in any society. One author in particular, the famously arrogant Ayn Rand, who once said that “emotions are not tools of cognition,” advanced the idea that humans have direct contact with reality through sensory perception and can therefore, inductively and deductively, produce logical concepts of the world that exactly reflect the nature of reality. This idea, combined with the assumption that humans have free will, led Rand and others to the conclusion that the individual’s rational pursuit of survival, property, and happiness is an inalienable right, and thus that laissez-faire capitalism is the only moral political system.
This was all well and fine for a society that knew next to nothing about human psychology and brain physiology. But much of the data gathered since suggests that humans are irrational, lack the ability to perceive their environment directly through the senses, and use emotions as their guides in nearly all decision making.
As an example of human irrationality, consider the Ultimatum Game, in which one subject is given an amount of money, say $10, and told to give any amount of this to a second subject. Both subjects are informed that if the receiving subject refuses to accept the offered amount, neither subject gets to keep any money at all.
The receiving subjects tend to reject stingy offers, preferring an outcome which actually grants them less money than they would have received if they had simply swallowed their pride and accepted the puny sum.
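The “irrationality” here is easy to see in a toy model. Below is a minimal Python sketch comparing a purely payoff-maximizing responder with one who rejects offers perceived as unfair; the 30% fairness threshold is an illustrative assumption, not a figure from the literature.

```python
POT = 10.0  # total amount given to the first subject, as in the $10 example above

def rational_responder(offer: float) -> bool:
    """Accepts any positive offer: some money is always better than none."""
    return offer > 0

def proud_responder(offer: float, fairness_threshold: float = 0.3) -> bool:
    """Rejects offers below a fairness threshold, even at a personal cost."""
    return offer >= fairness_threshold * POT

for offer in (1.0, 2.0, 5.0):
    rational_payoff = offer if rational_responder(offer) else 0.0
    proud_payoff = offer if proud_responder(offer) else 0.0
    print(f"offer ${offer:.0f}: rational responder keeps ${rational_payoff:.0f}, "
          f"'proud' responder keeps ${proud_payoff:.0f}")

# A $1 or $2 offer gets rejected by the 'proud' responder, so (per the game's rules)
# both players walk away with nothing.
```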
Further evidence for human irrationality can be seen in a study that asked subjects to select a bar of soap to purchase from among several different types. Subjects invariably justified their choice of one particular bar by citing its superior scent, sanitizing power, and low price. Video recordings of the subjects revealed that they persistently caressed the smooth, oval shape of the “superior” soap bar. The same soap’s popularity plummeted when it was presented in a rectangular shape.
Indeed, it appears that as humans we are often unknowingly overwhelmed by our whims and irrational urges. What, then, does this mean for capitalism?
If anything, it’s safe to say that as pursuers of happiness, we humans are at risk for making irrational decisions and conclusions about what will make us happy. We hoard electronic gadgets, eat sugary foods, and cram our brains full of reality TV and advertisements, all with the idea that these things bring us comfort and fulfillment. Our cities expand, our consumption of natural resources escalates, and our waste accumulates, everywhere. Still, we act surprised when we discover the rising rates of ADD, obesity and depression, and we somehow overlook the fact that our existence will be endangered upon the depletion of our finite resources.
Under a laissez-faire economic system, our inability to be consistently rational jeopardizes our survival. Hopefully, the influence of neuroscience and psychology will extend into the realm of politics and economics in time to save us from ourselves.
The development of interpersonal strategy: Autism, theory-of-mind, cooperation and fairness - D. Sally, E. Hill
Messages and Myth - Dan P. Millar
For the New Intellectual - Ayn Rand
Turn on, Tune in, Drop out — Psychiatrists Reconsider the use of Psychedelics
“Turn on, Tune in, Drop out.”
With this snappy catchphrase, and an eponymous book, Dr. Timothy Leary boldly promoted the use of hallucinogens to the American public five decades ago. Citing potential medical benefits, Leary believed that psychedelic drugs like LSD and psilocybin could help patients overcome psychiatric illness and reach a higher stage of consciousness.
While his divisive stance and strident attitude made him a symbol of the counterculture movement of the 1960s, his ideas have been largely regarded by the scientific community as radical and medically insignificant. But recently, some researchers have begun to rethink those conclusions.
In April, the New York Times reported on the case of Clark Martin, a former psychologist who participated in a psilocybin study at Johns Hopkins Medical School. Suffering from depression, Martin opted to participate in the program after finding no solace in traditional treatments. In the study, Martin was given psilocybin and spent the next five hours listening to classical music in private reflection.
A year later, he reported that the psilocybin treatment had helped him largely overcome his depression, and considers the experience one of the most meaningful of his life. He wasn’t alone. Most of the patients in the study said the treatment yielded positive, long-lasting benefits.
Encouraged by these results, scientists are also considering studying the effects of psychedelics in the treatment of obsessive compulsive disorder, anxiety, post traumatic stress disorder, and addiction.
The Johns Hopkins study largely corroborates the data Leary had collected decades earlier in a 1960 study on psilocybin conducted at Harvard University, in which he saw significant positive results in his patients.
The use of psychoactive drugs in medicine is still relatively new. Not until the middle of the 20th century, with the introduction of the antipsychotic drug chlorpromazine, did physicians turn to pharmaceuticals to treat psychiatric illness. These drugs largely displaced older, more invasive treatments such as insulin shock therapy, psychosurgery, and electroconvulsive therapy.
Perhaps because of their initial success, pharmaceutical companies have recently embraced psychotropic medications, finding massive markets for the treatment of conditions like depression and anxiety.
But while drugs like Prozac and Valium have made fortunes, research for experimental treatments like psychedelics has garnered little attention. Without the profit making potential of less radical options, pharmaceutical companies are reluctant to allocate time and money to developing these equivocal treatments.
Still, a one-time session that produces lasting results could potentially provide a cheaper and less disruptive alternative to a daily pill. And the treatment, an interesting middle ground between therapy and medication, perhaps provides a more complete approach.
Now, 50 years removed from the counterculture movement that defined Leary’s research, his stigma is beginning to fade, and maybe his science will remain.
Forget Me Not
One hundred years ago, when Alzheimer's Disease (AD) was even more of a mystery than it is now, amyloid protein aggregates were described as black spots that showed up on brain slices after autopsy. These aggregates, commonly known as plaques, are the telltale sign that a patient has AD. Until recently, these plaques could only be detected after death, but Dr. Daniel Skovronsky, founder of Avid Radiopharmaceuticals, may have a solution.
On July 11th, Dr. Skovronsky will present his latest findings at the international meeting of the Alzheimer's Association in Honolulu. He has spent the last five years developing a fluorine-18 radioactive dye for use in positron emission tomography (PET) scans. The scans are designed to be accurate enough to rival brain autopsy, the only method currently available to determine definitively whether a patient has AD.
The Food and Drug Administration (FDA) asked Dr. Skovronsky whether the results of fluorine-18 PET scans could match the definitive results of brain autopsies. To find out, he recruited thirty-five hospice patients with varying levels of memory loss; each patient received a PET scan and had his or her brain autopsied post-mortem. Every patient's PET scan matched his or her autopsy results.
If approved by the FDA, Dr. Skovronsky's work will improve the accuracy of Alzheimer's diagnosis. Currently, 20% of patients diagnosed with AD are found not to have the disease when an autopsy is performed. With fluorine-18, Dr. Skovronsky has fine-tuned a method to detect amyloid plaques in the brain of a living patient, which is a feat in itself. Previously, the only way to determine whether a patient had the disease was through autopsy - a posthumous procedure. Now, patients could have the chance to receive an accurate diagnosis while they are still alive, and earlier in their lives.
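To see what that 20% figure implies, here is a quick back-of-the-envelope sketch; the 100-patient cohort is hypothetical, and only the 20% misdiagnosis rate and the 35-patient concordance result come from the article.

```python
# Hypothetical cohort: 100 patients carrying a clinical diagnosis of AD.
clinically_diagnosed = 100
confirmed_at_autopsy = 80   # per the article, ~20% turn out not to have AD at autopsy

positive_predictive_value = confirmed_at_autopsy / clinically_diagnosed
misdiagnosis_rate = 1 - positive_predictive_value
print(f"Clinical diagnosis PPV: {positive_predictive_value:.0%}")          # 80%
print(f"Misdiagnosed (no AD found at autopsy): {misdiagnosis_rate:.0%}")   # 20%

# By contrast, in the 35-patient hospice study described above, every PET result
# reportedly matched the autopsy finding (35/35 concordance).
pet_concordance = 35 / 35
print(f"Reported PET-autopsy concordance: {pet_concordance:.0%}")
```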
In addition to simply detecting plaques, fluorine-18 will also aid in understanding how the disease develops, for plaques were found in patients deemed healthy on memory tests. Currently, people who are not diagnosed with AD early on receive no treatment until the disease has progressed further, and they are unlikely to receive any preventative medicine. With Dr. Skovronsky's PET scans, doctors could detect the disease earlier and administer preventative measures to slow its development. Also, patients who are currently misdiagnosed with AD do not receive the correct treatment for the conditions actually causing their memory loss or dementia, such as depression.
The Vanishing Mind - Promise Seen for Detection of Alzheimer's - NYTimes
The Alzheimer's Disease Neuroimaging Initiative positron emission tomography core - Alzheimer's Dement. 2010
In Vivo Imaging of Amyloid Deposition in Alzheimer Disease Using the Radioligand 18F-AV-45 (Florbetapir F 18) - The Journal of Nuclear Medicine
Opening Eyes to Learning Difficulties
Learning difficulties and disabilities have long been a problem for children, parents, and schoolteachers alike. Conditions such as dyslexia and motor disability hinder the progress of countless adolescents across the country with every passing day. Now, studies suggest that some learning difficulties may originate in the eye rather than in the brain itself.
Researchers at the Norwegian University of Science and Technology are investigating a possible causal link between motor and learning disabilities and dysfunction in visual perception. For example, people who cannot quickly learn a simple motor task, such as catching a ball, may struggle because the cells in their eyes are not perceiving the stimulus properly. The same may hold true for individuals with dyslexia: their eyes may not be correctly processing the visual stimuli of words on the page.
The ocular cells at issue here, known as "magno cells," detect rapid movements in our visual field, creating the movie-like perception we experience on a daily basis. Without them, life would look like a disconnected string of frames - much like a comic book. In one test, the researchers found that individuals with difficulty in mathematics also had difficulty tracking the randomized movement of a dot on a screen with their eyes, suggesting a link between eye-function efficiency, detection of rapid changes in the environment, and learning ability.
In a greater context, this finding may have implications in special education and may change the mindset of those working with individuals with additional learning needs. With this new information, learning disability can be combated from the angle of visual field perception. Techniques aiming to strengthen visual perception and eye efficiency (such as eye movement and tracking exercises) could act as a therapy for learning or motor disability previously thought to be localized in the brain itself.
Source: Science Daily via The Norwegian University of Science and Technology
Toasters With Feelings
Anthropomorphism is the attribution of human characteristics to inanimate objects, animals, or God. It has been a hallmark of faiths and religions worldwide. Humans have a natural tendency to assign intentions and desires to inanimate objects ("my computer isn't feeling well today - he's so slow!"), but they also strip "lower" beings (animals) of those same human characteristics.
We have a history of treating animals unnecessarily cruelly. I don't mean killing for food - that's necessary for our survival; I'm referring to dog fights, hunting, and other violence. We didn't even think that animals could sense pain until quite recently!
Why do we think of lifeless forms as agents with intentions but of actual living creatures as emotionally inferior clumps of cells?
Could it be that the need to rationalize phenomena is simply stronger when the phenomena have absolutely no visible explanation?
And do toasters really have feelings??
Antibodies to Reverse Nervous System Damage
Until now, it was believed that antibodies were proteins created by the immune system solely to protect the body against viruses and bacteria. However, a new study conducted by the Stanford University School of Medicine may give insight into another function of these vital proteins – nerve repair.
In a study conducted on mice, the scientists at Stanford demonstrated that antibodies are able to repair nerve damage to the peripheral nervous system (PNS). The PNS contains all of the nervous tissue outside of the brain and spinal cord.
It has been largely unknown why nerve tissue in the PNS is able to regenerate whereas tissue in the brain and spinal cord cannot. Perhaps antibodies provide an answer: while antibodies have access to peripheral nervous tissue, the blood-brain barrier and the blood-spinal-cord barrier prevent them from reaching the brain and spinal cord.
The antibodies' ability to repair peripheral nervous tissue is believed to stem from their role in clearing degenerating myelin. Myelin, the fatty tissue covering the axons of neurons, remains in place after neuronal death in the brain and spinal cord. In the rest of the nervous system, however, myelin is broken down with the help of antibodies after damage to a neuron. In the laboratory, researchers created mice that cannot make antibodies; in these mice, repair of peripheral nervous tissue was impeded, as was the removal of myelin. After the mice were injected with healthy antibodies, the myelin was removed and the nervous tissue was repaired.
It is scientists' hope that this finding will lead to a way to repair central nervous system tissue damage caused by strokes and spinal cord injury. "One idea," said Barres, one of the researchers, "would be to bypass the blood-brain barrier by delivering anti-degenerating-myelin proteins directly into the spinal fluid. We're hoping that these antibodies might then coat the myelin, signaling to microglia – macrophages' counterparts in the central nervous system – to clear the degenerating myelin." That might, in turn, jump-start the regeneration of damaged nervous tissue, he added.
For the full article, click here.