Visuocortical changes during a freezing-like state in humans

June 12th, 2018

NeuroImage (in press)
Maria Lojowska, Sam Ling, Karin Roelofs, Erno Hermans

An adaptive response to threat requires optimized detection of critical sensory cues. This optimization is thought to be aided by freezing – an evolutionarily preserved defensive state of immobility characterized by parasympathetically mediated fear bradycardia and regulated by the amygdala-periaqueductal grey (PAG) circuit. Behavioral observations in humans and animals have suggested that freezing is also a state of enhanced visual sensitivity, particularly for coarse visual information, but the underlying neural mechanisms remain unclear. We induced a freezing-like state in healthy volunteers using threat of electrical shock and measured threat-related changes in both stimulus-independent (baseline) and stimulus-evoked visuocortical activity to low- vs. high-spatial-frequency gratings, using functional MRI. As measuring immobility is not feasible in MRI environments, we used fear bradycardia and amygdala-PAG coupling to infer a freezing-like state. An independent functional localizer and retinotopic mapping were used to assess the retinotopic specificity of visuocortical modulations. We found a threat-induced increase in baseline (stimulus-independent) visuocortical activity that was retinotopically nonspecific and was accompanied by increased connectivity with the amygdala. A positive correlation between visuocortical activity and fear bradycardia (while controlling for sympathetic activation), together with a concomitant increase in amygdala-PAG connectivity, suggests that these findings are specific to the parasympathetically dominated freezing-like state. Visuocortical responses to gratings were retinotopically specific but did not differ between threat and safe conditions across participants. However, individuals who exhibited better discrimination of low-spatial-frequency stimuli showed reduced stimulus-evoked V1 responses under threat.
Our findings suggest that a defensive state of freezing involves an integration of preparatory defensive and perceptual changes that is regulated by a common mechanism involving the amygdala.

 Download it here 

Visual memories bypass normalization

December 10th, 2017

Psychological Science (2018)
Ilona Bloem, Yurika Watanabe, Melissa Kibbe & Sam Ling

How distinct are visual memory representations from visual perception? Although evidence suggests that briefly remembered stimuli are represented within early visual cortices, the degree to which these memory traces resemble true visual representations remains something of a mystery. Here, we tested whether both visual memory and perception succumb to a seemingly ubiquitous neural computation: normalization. Participants were asked to remember the contrast of visual stimuli, which were pitted against each other to promote normalization either in perception or in visual memory. Our results revealed robust normalization between visual representations in perception, yet no signature of normalization occurring between working memory stores – neither between representations in memory, nor between memory representations and visual inputs. These results provide unique insight into the nature of visual memory representations, illustrating that visual memory representations follow a different set of computational rules, bypassing normalization, a canonical visual computation.
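For readers unfamiliar with the computation, divisive normalization can be sketched in a few lines: each stimulus's response is divided by the pooled drive of all concurrent stimuli. The toy model below (parameter names and values are illustrative assumptions, not taken from the paper) shows the suppression signature we looked for between representations:

```python
import numpy as np

# Toy divisive normalization: each stimulus's drive is divided by the
# summed drive of all stimuli plus a semisaturation constant.
# sigma and n are illustrative values, not fits from the paper.
def normalized_responses(contrasts, sigma=0.1, n=2.0):
    drive = np.asarray(contrasts, dtype=float) ** n
    return drive / (drive.sum() + sigma ** n)

# A stimulus presented alone vs. paired with an equal-contrast competitor:
alone = normalized_responses([0.5])[0]
paired = normalized_responses([0.5, 0.5])[0]
# Under normalization, the paired response is suppressed relative to the
# lone response; in the study, that suppression appeared in perception
# but not between working memory stores.
```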

 Download it here 

Postdoctoral position in The Ling Lab

October 23rd, 2017

The Ling Lab (http://sites.bu.edu/vision) at Boston University has funding available for a postdoctoral fellow to work on studies investigating the neural computations subserving early visual processing, and how they interact with processes such as attention and interocular suppression. Research methods currently employed in the lab include fMRI, psychophysics, and computational modeling. The laboratory is part of Boston University’s Department of Psychological and Brain Sciences (www.bu.edu/psych), and is affiliated with the Center for Integrated Life Sciences and Engineering and the Center for Systems Neuroscience.

Applicants must have a Ph.D. in neuroscience, psychology, or a related field, and should possess a strong programming background. Prior experience with neuroimaging or advanced psychophysical techniques is highly preferred. The position is available immediately, and applications will be reviewed until the position is filled.

Applications should include a CV, a brief statement of research interests, the expected date of availability, and the names and contact information of three referees. Email applications to: samling@bu.edu

Attentional modulation interacts with orientation anisotropies in contrast perception

September 1st, 2017

Journal of Vision (2017)
Ilona Bloem & Sam Ling

Orientation perception is not comparable across all orientations – a phenomenon commonly referred to as the oblique effect. Here, we first assessed the interaction between stimulus contrast and the oblique effect. Specifically, we examined whether the impairment in behavioral performance for oblique versus cardinal orientations is best explained by a contrast- or response-gain modulation of the contrast psychometric function. Results revealed a robust oblique effect, whereby asymptotic performance for oblique orientations was substantially lower than for cardinal orientations, which we interpret as the result of multiplicative attenuation of contrast responses for oblique orientations. Next, we assessed how orientation anisotropies interact with attention by measuring psychometric functions for orientations under low or high attentional load. Interestingly, attentional load affected performance for cardinal and oblique orientations differently: while attentional load multiplicatively attenuated contrast psychometric functions for both cardinal and oblique orientation conditions, the magnitude of this effect was greater for the obliques. Thus, having fewer attentional resources available seems to impair the response for oblique orientations to a larger degree than for cardinal orientations.
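The contrast-gain vs. response-gain distinction can be illustrated with a Naka-Rushton contrast response function. The parameter values below are assumptions chosen for the sketch, not fits to the data in the paper:

```python
import numpy as np

# Naka-Rushton contrast response function; r_max, c50, and n are
# illustrative values, not parameters estimated in the paper.
def naka_rushton(c, r_max=1.0, c50=0.2, n=2.0):
    c = np.asarray(c, dtype=float)
    return r_max * c ** n / (c ** n + c50 ** n)

contrasts = np.linspace(0.01, 1.0, 50)
baseline = naka_rushton(contrasts)

# Contrast gain: scales the effective contrast, shifting the curve
# laterally on a log-contrast axis while preserving the asymptote.
contrast_gain = naka_rushton(contrasts * 0.5)

# Response gain: multiplicatively scales the output, lowering the
# asymptote, which is the pattern interpreted here for oblique
# relative to cardinal orientations.
response_gain = 0.7 * baseline
```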

Download it here 

Characterizing the effects of feature salience and top-down attention in the early visual system

May 14th, 2017

Journal of Neurophysiology (2017)
Sonia Poltoratski, Sam Ling, Devin McCormack, Frank Tong

The visual system employs a sophisticated balance of attentional mechanisms: salient stimuli are prioritized for visual processing, yet observers can also ignore such stimuli when their goals require directing attention elsewhere. A powerful determinant of visual salience is local feature contrast: if a local region differs from its immediate surround along one or more feature dimensions, it will appear more salient. Here, we used high-resolution fMRI at 7T to characterize the modulatory effects of bottom-up salience and top-down voluntary attention within multiple sites along the early visual pathway, including visual areas V1-V4 and the lateral geniculate nucleus (LGN). Observers viewed arrays of spatially distributed gratings, where one of the gratings immediately to the left or right of fixation differed from all other items in orientation or motion direction, making it salient. To investigate the effects of directed attention, observers were cued to attend to the grating to the left or right of fixation, which was either salient or non-salient. Results revealed reliable additive effects of top-down attention and stimulus-driven salience throughout visual areas V1-hV4. In comparison, the LGN exhibited significant attentional enhancement but was not reliably modulated by orientation- or motion-defined salience. Our findings indicate that top-down effects of spatial attention can influence visual processing at the earliest possible site along the visual pathway, including the LGN, while the processing of orientation- and motion-driven salience primarily involves feature-selective interactions that take place in early cortical visual areas.

Download it here

BU’s new MRI scanner is here!

March 2nd, 2017


Timelapse of the install (courtesy of Louis Vinke)

Elevated arousal levels enhance contrast perception

March 2nd, 2017

Journal of Vision (2017)
Dongho Kim, Savannah Lokey & Sam Ling

Our state of arousal fluctuates from moment to moment—fluctuations that can have profound impacts on behavior. Arousal has been proposed to play a powerful, widespread role in the brain, influencing processes as far ranging as perception, memory, learning, and decision making. Although arousal clearly plays a critical role in modulating behavior, the mechanisms underlying this modulation remain poorly understood. To address this knowledge gap, we examined the modulatory role of arousal on one of the cornerstones of visual perception: contrast perception. Using a reward-driven paradigm to manipulate arousal state, we discovered that elevated arousal state substantially enhances visual sensitivity, incurring a multiplicative modulation of contrast response. Contrast defines vision, determining whether objects appear visible or invisible to us, and these results indicate that one of the consequences of decreased arousal state is an impaired ability to visually process our environment.

Download it here

Best animated gif ever

October 19th, 2016

Nice, Louis.

Perceptual learning increases orientation sampling efficiency

March 15th, 2016

Journal of Vision (2016)
Denise Moerel, Sam Ling & Janneke Jehee


Visual orientation discrimination is known to improve with extensive training, but the mechanisms underlying this behavioral benefit remain poorly understood. Here, we examine the possibility that more reliable task performance could arise in part because observers learn to sample information from a larger portion of the stimulus. We used a variant of the classification image method in combination with a global orientation discrimination task to test whether a change in information sampling underlies training-based benefits in behavioral performance. The results revealed that decreases in orientation thresholds with perceptual learning were accompanied by increases in stimulus sampling. In particular, while stimulus sampling was restricted to the parafoveal, inner portion of the stimulus before training, we observed an outward spread of sampling after training. These results demonstrate that the benefits of perceptual learning may arise, in part, from a strategic increase in the efficiency with which the observer samples information from a visual stimulus.
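The classification-image logic can be sketched with a toy simulation: correlate trial-by-trial stimulus noise with an observer's binary responses, and the stimulus locations the observer actually samples emerge in the resulting image. Everything below (the observer model, stimulus size, and analysis details) is an illustrative assumption, not the paper's actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy classification-image simulation. The simulated observer bases its
# decision only on the inner portion of the stimulus, loosely mirroring
# the restricted pre-training sampling described above.
n_trials, n_pixels = 5000, 32
noise = rng.standard_normal((n_trials, n_pixels))   # per-trial stimulus noise
weights = np.zeros(n_pixels)
weights[:8] = 1.0                       # observer samples only inner pixels
responses = np.sign(noise @ weights)    # binary decision on each trial

# Classification image: mean noise on "positive" trials minus mean noise
# on "negative" trials. Sampled pixels stand out; unsampled pixels
# average to roughly zero.
ci = noise[responses > 0].mean(axis=0) - noise[responses < 0].mean(axis=0)
```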

Download it here

The Occipital Face Area is Causally Involved in Facial Viewpoint Perception

November 15th, 2015

Journal of Neuroscience (2015)
Tim Kietzmann, Sonia Poltoratski, Peter König, Randolph Blake, Frank Tong & Sam Ling


Humans reliably recognize faces across a range of viewpoints, but the neural substrates supporting this ability remain unclear. Recent work suggests that neural selectivity to mirror-symmetric viewpoints of faces, found across a large network of visual areas, may constitute a key computational step in achieving full viewpoint invariance. In this study, we used repetitive transcranial magnetic stimulation (rTMS) to test the hypothesis that the occipital face area (OFA), putatively a key node in the face network, plays a causal role in face viewpoint symmetry perception. Each participant underwent both offline rTMS to the right OFA and sham stimulation, preceding blocks of behavioral trials. After each stimulation period, the participant performed one of two behavioral tasks involving presentation of faces in the peripheral visual field: judging the viewpoint symmetry or judging the angular rotation. rTMS applied to the right OFA significantly impaired performance in both tasks when stimuli were presented in the contralateral, left visual field. Interestingly, however, rTMS had a differential effect on the two tasks performed ipsilaterally. While viewpoint symmetry judgments were significantly disrupted, we observed no impact on the angle judgment task. This interaction, caused by ipsilateral rTMS, provides support for models emphasizing the role of inter-hemispheric crosstalk in the formation of viewpoint-invariant face perception.

Download it here

Farewell, Dongho!

November 10th, 2015

Dongho’s own crystal brain:

Dongho Kim has moved to Seoul to begin his new adventure as MR Scientific Consultant at ASAN Medical Center. We’ll miss having you around, Dongho!

Sam joins BU’s Hariri Institute Junior Faculty Fellows

September 28th, 2015

The Ling Lab is now affiliated with the Hariri Institute for Computing at Boston University!

Sam was selected as a 2015 Hariri Institute Junior Faculty Fellow. The program recognizes junior faculty at Boston University working in diverse areas of computing and the computational sciences. Institute Fellows help connect like-minded researchers at BU and beyond, and provide a focal point for supporting broader collaborative research.

Official announcement here: http://www.bu.edu/hic/2015/09/28/institute-announces-2015-junior-faculty-fellows/

Sam awarded Peter Paul Career Professorship

September 17th, 2015

More info on the Peter Paul Award (and a brief interview) here: http://www.bu.edu/today/2015/four-junior-faculty-awarded-peter-paul-professorships/

Dongho got a position!!

September 14th, 2015

Congrats to Dongho Kim, who just accepted a position as Scientific Consultant at ASAN Medical Center, where he’ll be conducting neuroimaging research in a clinical setting.

We’re all very excited for you, Dongho!

Asan_medical_center_front

New opinion piece by Dongho, Sam and Takeo Watanabe in F1000 Research!

September 10th, 2015

New opinion paper on the relationship between rewards and perceptual learning.

Kim, D., Ling, S., & Watanabe, T. (2015)
Dual mechanisms governing reward-driven perceptual learning.
F1000 Research

Great writeup, Dongho!

Download it here