Conferences
9th Annual Neurophotonics Symposium: New Developments in Imaging Naturalistic, Social, and Freely Moving Behaviors
Graduate student Alexandria Barlowe presented her research project, “A Computational Model of Arousal Dynamics in Autism Spectrum Disorder using the Pupillary Light Response Function,” at BU on 1/23/2026.

Abstract:
Hypo- or hyper-reactivity to sensory input is a hallmark of many developmental conditions such as Autism Spectrum Disorder (ASD), which has become increasingly common in the United States (USDHHS, 2025). Such reactivity, or arousal state, has been reliably measured using pupil dynamics (Joshi and Gold, 2020), which are largely controlled by the locus coeruleus (LC) in both humans (Murphy et al., 2014) and rodents (Privitera et al., 2020). But how the LC processes stimuli in neurodiverse conditions such as ASD has only begun to be uncovered. By applying the modified Naka-Rushton function (Naka and Rushton, 1966; Pan et al., 2022) to an additive shunting network equation (Grossberg, 1993), this project simulates a potential mechanism for the shifted dynamics seen in different developmental models. This project highlights the subcortical contribution of the LC to higher-order sensory interpretation.
Poster: Barlowe Pupil Project
Visual Abstract: PupilAbstractAB
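For readers unfamiliar with the model mentioned in the abstract: the classic Naka-Rushton function maps stimulus intensity onto a saturating, bounded response. The sketch below shows only the standard textbook form; the modified version from Pan et al. (2022) and its coupling to the shunting network may differ, and the parameter values here are illustrative, not taken from the project.

```python
import numpy as np

def naka_rushton(intensity, r_max=1.0, sigma=0.5, n=2.0):
    """Classic Naka-Rushton saturation function.

    R(I) = r_max * I**n / (I**n + sigma**n)

    sigma is the semi-saturation constant (the intensity producing a
    half-maximal response) and n sets the steepness of the curve.
    Parameter values here are placeholders for illustration.
    """
    intensity = np.asarray(intensity, dtype=float)
    return r_max * intensity**n / (intensity**n + sigma**n)

# Response is zero at zero intensity, half-maximal at I = sigma,
# and saturates toward r_max as intensity grows.
print(naka_rushton([0.0, 0.5, 10.0]))
```

Shifting parameters such as sigma or n changes where and how sharply the response saturates, which is one simple way such a function can express altered arousal dynamics across populations.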
2025 NEPA Conference
The Caldwell-Harris Lab presented some of its recent research projects at the 2025 New England Psychological Association Conference in October. A full list of presenters can be found on NEPA’s website, while our contributions are included below.
In what language does your brain talk to itself? And when?
Authors: Dila Bostanci, Angel Lee, Behnoosh Saberinezhad, Cem Eralp, Luna Lenzi, Min Zeng, Sila Acar, Seojin Song, Catherine Caldwell-Harris
Abstract: This study explores the role of bilingualism in inner speech. Inner speech is defined as the subjective experience of language in the absence of overt, audible articulation; it is used to perform executive functioning tasks such as verbal rehearsal and speech monitoring (Alderson-Day and Fernyhough, 2015). Inner speech has also been implicated in emotional self-regulation: structured, positive inner speech helps regulate emotion, while excessive, unchecked inner speech can dysregulate it (Salas et al., 2018). Among bilinguals, inner speech regulation tends to take the form of code-switching, the alternation between two or more languages within a single conversation (Williams et al., 2019). Considering this, we wanted to further explore how having multiple languages affects speakers’ inner speech during emotion regulation. We hypothesized that a learned foreign language can serve as a tool for talking back to negative inner speech in one’s first language (L1), allowing speakers to rationalize in the second language (L2).

Therap-e? AI, Therapy, and the Future of Care
Authors: Samishti Bhatia, Sadiya Buccino, Favour Adiagwai
Abstract: Since 2019, Artificial Intelligence (AI) has increasingly been integrated into diverse sectors of society, including healthcare. As of 2023, only 50% of all adults diagnosed with mental illness receive care, a disparity largely driven by the lack of affordable, accessible, and private care options (Substance Abuse and Mental Health Services Administration [SAMHSA], 2023). Given the evident crisis in mental health care and the projected expansion of AI in healthcare, AI has been positioned as a potential solution. The U.S. Food and Drug Administration had approved over 1,000 AI products for medical use as of July 2025 (U.S. Food and Drug Administration, 2025). Notably, none of these products directly addresses counseling or psychotherapy, raising important questions about the applicability and effectiveness of the AI mental health products on the market across diverse populations.

