Glossophobia is the medical term for a strong fear of public speaking. It is one of the most common phobias: as much as 75% of the world's population struggles with this fear to some extent, often as part of social phobia, or social anxiety disorder. According to several surveys, the fear of public speaking even outranks the fear of death.
Those who suffer from glossophobia tend to experience the classic fight-or-flight response when speaking in front of a group, even a group of only a few people. They may tremble, sweat, or freeze. As their brains release adrenaline and stress hormones, their blood sugar levels and heart rates rise. The symptoms are not limited to the speaking event itself; they can also appear beforehand, in anticipation of it. And while glossophobic people may know that the fear is irrational, they often feel powerless to control it.
The exact cause of glossophobia is still unclear, but genetic factors often play a large role. Like many other phobias, glossophobia is more common in people with a family history of it or of similar fears, who may begin to show symptoms when the underlying predisposition is triggered. Traumatic events can also lead to glossophobia: a strong fear of being judged, embarrassed, or rejected can often be traced back to a distressing experience, such as a presentation that went badly. Such events might not seem intense when they occur, but they can have a long-lasting impact. Another possible factor is education. Less educated people are, in general, more likely to feel uncomfortable taking the stage: in one poll, 52% of respondents with a high school diploma or less expressed a fear of public speaking, compared to only 24% of college graduates. Whatever the cause, the consequence is that a glossophobic person is likely to deliberately avoid public speaking altogether.
Writer: Zijing Sang
Editor: Sophia Hon
According to the NIH, about 90 people in the U.S. die from opioid overdoses every day. The rise in opioid use began when prescription opiates such as Vicodin and OxyContin started being prescribed to more patients of different, often younger, ages. In 2000, the average age of drug-related death was 40; now more people in their 20s and 30s are dying from overdoses. The crisis began with prescription opiates but has since expanded to the abuse of both prescription drugs and recreational narcotics like heroin. But how has this epidemic taken so many lives? The answer lies in how the brain develops tolerance and addiction to drugs such as opiates.
What happens to your nerves when you take an opiate? Normally, chemical signals bind to receptors on your nerve cells, including receptors at the axon terminal, and excite electrical impulses that travel through the nerve. Opiate molecules, by contrast, bind to opioid receptors and inhibit those electrical impulses from ever traveling through the nerve cell.
There are three kinds of opioid receptors: mu, kappa, and delta. The mu-opioid receptor is the most important, in that it is responsible for activating the main effects of taking opiates, such as constipation, pain numbing, euphoria, and depression. These receptors are widespread: they affect not only pain pathways but also other parts of the brain, such as the locus coeruleus.
Addiction begins with GABAergic neurons in the midbrain. These neurons normally switch the brain's pleasure and pain networks on or off; opiates shut them off, flooding the reward circuit with dopamine (a chemical associated with relief from anxiety and stress). This surge of dopamine conditions the brain to treat opiates as something good and builds tolerance. Over time, the affected nerve cells adapt to the opiates in your body by producing more cyclic AMP to compensate for the inhibition of their electrical impulses. When the drug is removed, this excess cyclic AMP causes the nerve cells to fire more impulses, producing the symptoms associated with withdrawal, such as diarrhea, dysphoria, and anxiety.
With this knowledge of what causes addiction, new research has emerged from the opioid epidemic. The Scripps Research Institute in Florida has developed a new form of opiate that decreases the risk of overdose while still substantially reducing pain: the drug lessens the respiratory suppression normally associated with opioid analgesics, which is the number one cause of death in overdoses. Another study, from the University of Utah, found that a compound from marine snails blocks pain-pathway signals and could become an alternative to prescription opiates in the near future. A more popular alternative for both pain and withdrawal management is medical marijuana: in a pilot study published in Trends in Neurosciences, cannabinoids were seen to decrease withdrawal symptoms by reducing drug cravings, and even to restore some of the neurological damage from prolonged drug abuse.
While this epidemic has taken hold of a larger portion of the U.S. than expected, it is only with the help of researchers, scientists, and doctors that the damage it has done can be repaired.
Writer: Cindy Wu
Look at me while you’re talking!
If you’re anything like me, you probably have a hard time managing your eyes during conversations. Although eye contact is encouraged to keep the other person engaged, the awkwardness can lead some of us to avert our gaze.
It turns out, according to Kajimura and Nomura (2016), that our attempts to maintain eye contact and to search for the right words compete for the same domain-general cognitive resource. More specifically, maintaining eye contact activates the default mode network, which supports self-awareness and the perception of others' mental states, while verbal processing deactivates the default mode network and activates language networks. This competition over the default mode network explains why eye contact hinders conversation, and it ultimately generates an urge to break eye contact in order to free up verbal processing.

Kajimura and Nomura reached this conclusion through a word-association game. Given a noun, participants had to respond immediately with a related verb; for the word "pie," for example, they might respond with "eat," "bake," and so on. The game was played while participants looked at a face on a computer screen that sometimes maintained eye contact and sometimes looked away. The results showed that participants took longer to respond to challenging words, but their response times were shorter when they broke eye contact.
Likewise, Glenberg et al. (1998) found that people also avert their gaze when a question or topic is especially complex or difficult, and that averting their gaze makes their responses more accurate. The study consisted of a series of experiments. One observed participants' eye movements as they answered questions of varying difficulty; in another, participants answered general-knowledge and mathematical questions while either closing their eyes or looking at the experimenter. The first experiment showed that participants tended to avert their gaze more as the questions grew harder; in the second, participants answered more accurately with their eyes closed.
So if you ever notice someone breaking eye contact frequently, don’t take it too personally; they’re probably just trying to think. After all, what really matters in a conversation are the words being spoken and the thoughts behind them.
Author: Audrey Kim
Editor: Albert Wang
Have you ever felt tingles in the back of your head or neck while listening to the sound of salmon sizzling in a pan, while watching someone fold clothes, or while getting a haircut? If so, you might experience Autonomous Sensory Meridian Response, or ASMR. The truth is, you are not alone. Although exactly how much of the general population experiences ASMR is unclear, a quick search for ASMR on YouTube now turns up more than 1.5 million videos. ASMR is a term for an experience characterized by a variety of tingling physical sensations triggered by different auditory and visual stimuli: chopping, whispering, washing, bubbling, and so on. The most popular source of stimuli is video. For instance, the trendy Korean phenomenon known as mukbang, in which a vlogger eats surprisingly large quantities of food while chatting with the audience, has taken on an ASMR spin: the audible eating sounds, like chewing and crunching, may evoke ASMR sensations in the brains of some viewers.
Stephen Smith, an associate professor of psychology at the University of Winnipeg, notes that only a handful of scientific studies have been done on ASMR, partly because the name itself sounds unscientific. According to Smith, ASMR is best considered a perceptual sensory phenomenon rather than a response or a mental disorder.
ASMR is also associated with specific personality traits. In Smith’s study, 290 individuals with ASMR and 290 matched controls completed the Big Five Personality Inventory (BFI; John et al., 1991); participants with ASMR also completed a questionnaire related to their ASMR phenomenology. Results showed that people who experience ASMR demonstrated significantly higher scores on Openness and Neuroticism, and significantly lower levels of Conscientiousness, Extraversion, and Agreeableness compared to matched controls. Hence, if you experience ASMR, you probably tend to be more creative, have a broad range of interests, and are more likely to experience sadness, anxiety, and mood swings. Being low in Conscientiousness, Extraversion, and Agreeableness means that you are more likely to dislike a set schedule, feel exhausted when socializing, and take little interest in others.
Writer: Zijing Sang
Editor: Sophia Hon
The concept of lucid dreaming has been around since the time of the ancient Greeks and even features in Tibetan Buddhism as part of the path to enlightenment, where it is described as "dream yoga." Scientific research on dreams and sleep is steadily increasing, yet the field remains relatively new to many neuroscientists. Countless articles, videos, and how-tos on the internet explain how to start lucid dreaming, yet the true underlying cause of lucid dreaming, and what exactly it means, is still an area of active research. Recently, more studies have offered insight into how consciousness is biologically connected to dreams.
Lucid dreaming is the act of retaining or regaining consciousness while still asleep, usually during the dreaming phase of sleep: rapid eye movement (REM) sleep. While how-tos and videos show how to lucid dream, several research papers from Germany's J.W. Goethe University and the Max Planck Institute of Psychiatry have shed light on the ways scientists quantify subjective, first-person reports from participants.
Goethe University created the Lucidity & Consciousness in Dreams scale (LuCID), in which participants who had experienced lucid dreaming answered surveys about their lucid dreams. The researchers quantified the survey data on the LuCID scale and concluded that lucid dreamers had greater control over their thoughts and actions within the dream, could think logically, and were better at accessing memories from their waking lives.
The Max Planck Institute also quantified its survey data and found varying differences in participants' states of consciousness. While in a lucid dream, participants experienced better "intention enactment," echoing the Goethe study's finding that participants had greater control over their thoughts and actions in the dream. Lucid dreamers were not, however, shown to have better planning ability than while awake. Together, the two institutes' work shows that dreams and consciousness can be quantified to a certain degree; but there are also ways to measure the brain's biological activity directly during sleep, rather than relying on secondhand accounts that are subject to human error.
Various other institutes and universities have used a range of measures, from MRI scans and EEGs to eye-movement signals. Studies using EEG and MRI have shown that lucid dreaming is accompanied by increased activity in the frontal areas of the brain, which are associated with higher-order cognitive functions, as well as increased gamma-wave activity, the brain-wave activity thought to bind inputs together and consolidate them into memories.
While there is a pioneering effort to draw conclusions about the human brain's consciousness and sleep states, the research conducted so far falls short of a conclusive answer to how our brain functions as a whole. That said, these studies show correlations that take us one step closer to understanding our consciousness.
Writer: Cindy Wu
Editor: Kawtar Bennani
Money. Fame. Power. These are all things that come to mind when the topic of high intelligence is brought up. Possessing high intelligence is usually thought to be advantageous; however, such a gift may also carry negative consequences. According to a study by Karpinski et al., high intelligence may come with a "hyper body," one with altered immune and inflammatory responses. To test this, they surveyed 3,715 members of Mensa, all with IQs of 130 or higher. The questions asked respondents to report their experiences with mood and anxiety disorders, ADHD, and autism spectrum disorder, as well as physiological conditions such as allergies and autoimmune diseases. The survey data was then compared against the national average for each disorder or disease. The Mensa population showed higher rates across the board, with a high IQ making a diagnosis two to four times more likely. The study attributes these effects to "overexcitabilities," intense psychomotor, sensual, emotional, intellectual, or imaginational behaviors that are correlated with higher intelligence. Perhaps having a higher IQ means one's entire body is more responsive to the surrounding environment, bringing an increase in knowledge but an increase in risk as well.
Although high intelligence may raise certain risk factors, it may also raise life expectancy. Professor Deary and his colleagues at the University of Edinburgh carried out a study that took the IQs of 11-year-olds tested in 1932 and checked whether members of the cohort were still alive at age 76. What they found was striking: a 15-point IQ advantage was associated with a 21% greater survival rate. This suggests that certain genes may link IQ and longevity, almost contradicting the previous study. More studies will be needed to pin down the true relationship between IQ and health. Ultimately, a high IQ can be seen as a double-edged sword: you may live a longer life, but you may also be more susceptible to risk factors throughout it.
Writer: Albert Wang
Editor: Sophia Hon
Without a doubt, we all remember the dark, dreary days of high school: waking up at 6:00 in the morning, sitting on the bed, eyes still half-closed, while your mind drifts in and out of light dreams. After finally mustering enough energy to check your phone, it’s 6:30. Your mind snaps awake once you realize it’s only 15 minutes before the bell rings for the first period of class.
As schools get more competitive, many students end up overloading on classes and extracurricular activities, resulting in shorter sleep and poor sleep hygiene. On top of a rigorous schedule, there is the burdensome anxiety and pressure to succeed. As such, it isn't far-fetched to assume that stress and anxiety are major contributors to the rising rates of anxiety disorders and depression among adolescents. According to census data, "in 2015, 3 million teens ages 12 to 17 had had at least one major depressive episode in the past year," and 6.3 million teens experienced some form of anxiety disorder.
Jack S. Peltz, an assistant professor at Daemen College, recently conducted a study that links later school start times with improved mental health and reinforces the idea that good sleep hygiene is crucial to adolescents' daytime functioning. Data was collected from 197 adolescents aged 14 to 17, separated into two groups: those whose school started before 8:30 a.m. and those whose school started after 8:30 a.m. Participants kept a sleep journal recording both the quality and duration of their sleep, along with any signs of depression or anxiety experienced during the day. Sleep hygiene was measured through a self-report survey, the Adolescent Sleep Hygiene Scale, which considers nine factors that can inhibit sleep: "physiological, cognitive, emotional, sleep environment, daytime sleep, substance use, bedtime routine, sleep stability, and bedroom sharing." As expected, students who maintained better sleep hygiene showed fewer symptoms of anxiety and depression. Notably, the relationship between sleep hygiene and mental health was much stronger among students who started school after 8:30 a.m.
Although many studies, like the one described above, offer convincing evidence that longer and better-quality sleep has a positive impact on adolescent mental health, only 14% of schools in the United States have moved their start times to later than 8:30 in the morning. Hopefully, schools will come to recognize the growing mental health problems among adolescents and ultimately change the system to ensure longer sleep and promote mental health.
Writer: Audrey Kim
Editor: Albert Wang
What makes an object beautiful? Is beauty inherent in the object itself, or do we assume an object is beautiful because people say it is? A new study by Zhang et al. used fMRI to see what exactly goes on in our heads when we judge beauty. During the scan, 19 right-handed college students performed two tasks: an aesthetic task, judging whether a pictograph or bone-script character was beautiful or ugly, and a baseline task, judging whether the luminance of a square was high or low. The researchers found that different brain regions were activated when judging the beauty of bone scripts versus pictographs. "Beautiful" judgments activated areas involved in perceptual, cognitive, emotional, and reward processing; for pictographs in particular, these included the orbitofrontal cortex and motor areas. "Beautiful" judgments of bone scripts activated the putamen, suggesting a stronger experience of beauty when looking at the pictographs. Negative or ugly images activated only the visual areas of the brain, indicating less involvement and integration of the stimuli. This suggests the brain is selective not only about the type of stimulus it receives, preferring a pictograph over a bone script, but also prefers positive, beautiful images over negative, ugly ones.
Interestingly, these processes seem to work differently in men and women. Using MEG, Cela-Conde et al. found that brain activity during the processing of beauty is bilateral in women but lateralized to the right hemisphere in men. The images used were unknown paintings by art-school artists as well as natural pictures of various landscapes. While the main activity in both sexes was in the parietal lobe, its localization differed between men and women. It has been suggested that this difference is evolutionary, with spatial abilities shaped by an ancestral division of labor between hunting and gathering. It would be interesting to see whether these areas are still influenced by everyday life, or whether they could become bilaterally localized in both men and women. The question to dwell on is this: is it our psychology that activates these brain regions when we perceive something as beautiful, or are these brain regions controlling the way we perceive beauty?
Writer: Albert Wang
Editor: Sophia Hon
The ability to picture things in our mind seems so natural, and yet certain people, as researchers have discovered, are unable to. Referred to as “aphantasia,” the condition describes an inability to form mental images. What is currently known about the condition comes mostly from the work of neurologist Adam Zeman.
In 2005, Dr. Zeman, at the University of Exeter Medical School, saw a patient referred to as "MX" who had lost the ability to create mental images after a minor surgical procedure. Dr. Zeman could find no description of such a condition in the medical literature, so he gave MX a series of examinations. MX performed well on problem-solving and semantic memory tests, and he could look at the faces of famous people and name them; given only their names, however, he could not picture their faces. Brain scans revealed that the face-recognition regions that would be active in a typical brain during such a test were not active in MX's brain.
More recently, Dr. Zeman and his colleagues studied several people believed to have the same condition. They found similar symptoms across subjects: all could perform general-knowledge tasks, like counting the windows in their house, but could not picture things like a sunrise. Notably, the researchers reported that many of the subjects had not acquired aphantasia from an injury but had had it since birth. The condition is believed to affect as much as 2% of the population; Dr. Zeman now hopes to find more people with it so that he can run a larger scanning study, comparing their brains with those of people who can form mental images and finding out how common the condition really is.
Firefox co-creator Blake Ross described how it feels to have aphantasia since birth and his surprise at his discovery that other people can visualize things. “I can’t ‘see’ my father’s face or a bouncing blue ball, my childhood bedroom or the run I went on ten minutes ago,” he wrote on Facebook. “I thought ‘counting sheep’ was a metaphor. I’m 30 years old and I never knew a human could do any of this. And it is blowing my goddamned mind.”
Writer: Nathaniel Meshberg
Editor: Kawtar Bennani
Starting at conception, your genes lay out a neural map for the nervous system: your cells multiply and migrate to form the primitive beginnings of your brain. Much of what happens during this time is co-determined by your environment, which in the womb is the mother's environment. Sometimes this interplay, or purely genetic factors, results in congenital deafness. At birth, a baby who cannot hear receives no auditory input to the brain. But once out of the womb, neural development continues at a rapid pace, and the brain keeps shaping itself. For deaf infants, this means the brain develops without any sound, likely allowing other functioning areas to encroach on the cortex traditionally reserved for auditory processing.
An interesting question is what this type of plasticity (the brain's ability to change and adapt) means for language learning in children given cochlear implants. A research team at Ohio State University attempted to tease apart this issue by observing parents interacting with congenitally deaf children who use cochlear implants. The parents presented new toys with unique names to the children, and the researchers recorded the entire interaction from different angles and with eye trackers to see what most strongly draws a child's attention.
The official results have not yet been released, but there is hope that studies like this one will help reveal why children with cochlear implants, even those implanted at a very young age, have language delays and appear to learn language differently. In a video accompanying the study description, the parents of one subject describe how narrating daily activities to their son, a tip they learned from participating in the experiment, has made a world of difference in their communication. However, it is possible that this simply reflects exposing the child to more language, or even his developmental stage.
Once the results of this study and others like it are published, parents will hopefully be able to bridge the gap between themselves and their deaf children, better understanding the differences between being born with hearing and being born without it.