{"id":831,"date":"2021-06-07T13:20:15","date_gmt":"2021-06-07T17:20:15","guid":{"rendered":"https:\/\/sites.bu.edu\/denisonlab\/?page_id=831"},"modified":"2025-11-04T12:39:55","modified_gmt":"2025-11-04T17:39:55","slug":"projects","status":"publish","type":"page","link":"https:\/\/sites.bu.edu\/denisonlab\/projects\/","title":{"rendered":"Projects"},"content":{"rendered":"<div class=\"contentbox\">\n<div class=\"column sizeprojectimg\">\n<img src=\"\/denisonlab\/files\/2021\/08\/tempatten-large1.png\">\n  <\/div>\n<div class=\"column sizeprojecttext\">\n<h4>Temporal attention<\/h4>\n<p>The dynamics and limitations of visual attention across short time intervals (~1 s) shape perception and our ability to interact with the world. In this ongoing project, we combine psychophysics, MEG, and computational modeling to understand the continuous interaction of voluntary (goal-directed) and involuntary (stimulus-driven) temporal attention. We have found:<\/p>\n<ul>\n<li style=\"margin:10pt 0\"><a href=\"\/denisonlab\/files\/2021\/07\/Denison_et_al_2017_PBR.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">Voluntary attention to a specific point in time leads to perceptual tradeoffs<\/a>: performance is better at the attended time but worse at unattended times. This finding indicates perceptual limits over short time intervals that can be flexibly managed by attention.<\/li>\n<li style=\"margin:10pt 0\"><a href=\"\/denisonlab\/files\/2021\/06\/DenisonCarrascoHeeger_2021_NHB.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">Our model of temporal attention<\/a> explains behavioral data and predicts that attentional resources are limited across time.<\/li>\n<li style=\"margin:10pt 0\">Attending to a moment in time changes neural activity both <a href=\"\/denisonlab\/files\/2025\/09\/DenisonTianHeegerCarrasco_2024_NatureCommunications.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">leading up to<\/a> and <a href=\"\/denisonlab\/files\/2025\/09\/ZhuTianCarrascoDenison_2024_PNASNexus.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">following<\/a> an attended time, revealing unique neural mechanisms for temporal attention that may be driven by the distinctive demands of temporal processing.<\/li>\n<li style=\"margin:10pt 0\">Small eye movements called <a href=\"\/denisonlab\/files\/2021\/07\/DenisonYuval-GreenbergCarrasco_2019_JNeurosci.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">microsaccades decrease in anticipation of an attended stimulus<\/a>. This stabilization of eye position with voluntary temporal attention may help us see brief stimuli.<\/li>\n<li style=\"margin:10pt 0\">Despite differences in visual processing across space, <a href=\"\/denisonlab\/files\/2021\/07\/FernandezDenisonCarrasco_2019_JOV.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">voluntary temporal attention affects perception similarly across the visual field<\/a>.<\/li>\n<\/ul><\/div>\n<\/div>\n<hr>\n<div class=\"contentbox\">\n<div class=\"column sizeprojectimg\">\n<img src=\"\/denisonlab\/files\/2021\/08\/au-large2.jpg\">\n  <\/div>\n<div class=\"column sizeprojecttext\">\n<h4>Attention and uncertainty in perceptual decision making<\/h4>\n<p>Visual information is always uncertain. Sometimes that uncertainty comes from the external world &#8211; for example, on a foggy night. In this project, we ask how people make perceptual decisions when uncertainty comes not from the external world, but from their own attentional state. 
<hr>
<div class="contentbox">
<div class="column sizeprojectimg">
<img src="/denisonlab/files/2025/09/arc-foho-logo.png">
</div>
<div class="column sizeprojecttext">
<h4>Perceptual awareness</h4>
<p>Our lab is part of two large “adversarial collaborations” to adjudicate theories of perceptual awareness. One project aims to distinguish <a href="http://arc-foho.org/" target="_blank" rel="noopener noreferrer">first-order vs. higher-order theories of consciousness</a>, while another aims to test <a href="http://arc-ethos.org/" target="_blank" rel="noopener noreferrer">multiple higher-order theories</a>.</p>
<p>As part of these projects, we have found that <a href="https://www.biorxiv.org/content/10.1101/2025.07.03.661972" target="_blank" rel="noopener noreferrer">spatial attention strongly decouples subjective awareness from objective perceptual performance</a>, giving rise to the phenomenon of “subjective inflation” across a wide range of stimulus and task conditions.</p>
</div>
</div>
<hr>
<div class="contentbox">
<div class="column sizeprojectimg">
<img src="/denisonlab/files/2021/08/pupil-optim.gif">
</div>
<div class="column sizeprojecttext">
<h4>Pupil modeling</h4>
<p>The pupil dilates not only when light levels increase but also in response to many kinds of perceptual and cognitive events. The dilation is sluggish, however, so when multiple events occur in quick succession, their pupil responses overlap and must be disentangled. We developed and tested a <a href="/denisonlab/files/2021/07/DenisonParkerCarrasco_2020_BRM.pdf" target="_blank" rel="noopener noreferrer">pupil modeling</a> framework to estimate how perceptual and cognitive events affect pupil dilation.</p>
<p><a href="https://github.com/jacobaparker/PRET" target="_blank" rel="noopener noreferrer">Code</a> for our Pupil Response Estimation Toolbox (PRET) is available on GitHub.</p>
</div>
</div>
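<p>The sketch below illustrates the general linear-systems idea behind this kind of framework; it is a Python illustration, not PRET's actual interface (PRET is distributed as MATLAB code). The measured pupil trace is modeled as event impulses convolved with a pupil response function, here the widely used Hoeks and Levelt (1993) form, and each event's amplitude is then recovered by least squares. The sampling rate, event times, and amplitudes are hypothetical.</p>
<pre><code class="language-python">
import numpy as np

FS = 100                      # sampling rate (Hz); hypothetical
N, T_MAX = 10.1, 0.93         # Hoeks and Levelt (1993) puRF parameters

def purf(duration=4.0):
    """Canonical pupil response function h(t) = t**n * exp(-n*t/t_max),
    normalized to unit peak."""
    t = np.arange(0, duration, 1 / FS)
    h = t**N * np.exp(-N * t / T_MAX)
    return h / h.max()

def design_matrix(event_times, n_samples):
    """One column per event: an impulse at the event time convolved
    with the puRF. Overlapping responses are handled by the regression."""
    h = purf()
    X = np.zeros((n_samples, len(event_times)))
    for j, t0 in enumerate(event_times):
        impulse = np.zeros(n_samples)
        impulse[int(t0 * FS)] = 1.0
        X[:, j] = np.convolve(impulse, h)[:n_samples]
    return X

# Simulate a trial with two events in quick succession (e.g., a cue at
# 0.5 s and a target at 1.5 s) with true amplitudes 1.0 and 0.6, plus noise.
rng = np.random.default_rng(0)
n_samples = 6 * FS
X = design_matrix([0.5, 1.5], n_samples)
pupil = X @ np.array([1.0, 0.6]) + 0.05 * rng.standard_normal(n_samples)

# Least squares disentangles each event's contribution even though the
# sluggish responses overlap in the measured trace.
amplitudes, *_ = np.linalg.lstsq(X, pupil, rcond=None)
print(amplitudes)             # approximately [1.0, 0.6]
</code></pre>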
<hr>
<div class="contentbox">
<div class="column sizeprojectimg">
<img src="/denisonlab/files/2021/08/lgnmp-large1.png">
</div>
<div class="column sizeprojecttext">
<h4>Human LGN M/P imaging</h4>
<p>The lateral geniculate nucleus (LGN) is the primary thalamic relay from the retina to the visual cortex. Its magnocellular (M) and parvocellular (P) subdivisions process complementary types of visual information, but they have been difficult to study in humans. We used 3T and 7T fMRI and specialized visual stimuli to <a href="/denisonlab/files/2021/07/Denison_et_al_2014_NeuroImage.pdf" target="_blank" rel="noopener noreferrer">functionally map the M and P subdivisions of the human LGN</a> noninvasively for the first time.</p>
<p><a href="https://github.com/racheldenison/MPLocalizer" target="_blank" rel="noopener noreferrer">Code</a> for the M/P localizer is available on GitHub.</p>
</div>
</div>
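<p>The mapping rests on comparing each voxel's responses to M-type and P-type localizer stimuli. One simple way to summarize such a comparison, sketched below, is a normalized per-voxel selectivity index; the beta weights and threshold here are hypothetical, and the published analysis pipeline may differ.</p>
<pre><code class="language-python">
import numpy as np

def mp_index(beta_m, beta_p):
    """Normalized M vs. P selectivity per voxel:
    +1 means the voxel responds only to the M-type stimulus,
    -1 means it responds only to the P-type stimulus.
    Assumes positive responses to both stimulus types."""
    beta_m = np.asarray(beta_m, dtype=float)
    beta_p = np.asarray(beta_p, dtype=float)
    return (beta_m - beta_p) / (beta_m + beta_p)

# Hypothetical GLM beta weights (percent signal change) for five voxels.
beta_m = np.array([1.2, 0.9, 0.3, 0.2, 0.7])
beta_p = np.array([0.4, 0.5, 1.1, 0.9, 0.6])

idx = mp_index(beta_m, beta_p)
m_like = idx > 0.2        # hypothetical threshold for M-like voxels
p_like = -idx > 0.2       # index below -0.2: P-like voxels
print(idx.round(2))       # [ 0.5   0.29 -0.57 -0.64  0.08]
</code></pre>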
<hr>
<div class="contentbox">
<div class="column sizeprojectimg">
<img src="/denisonlab/files/2021/08/imseq-large1.png">
</div>
<div class="column sizeprojecttext">
<h4>Prediction and perceptual selection</h4>
<p>How do expectations affect what we see? We studied how predictive visual context, which contains information about what is likely to appear next, influences perceptual selection during binocular rivalry. Intriguingly, we found prediction effects (seeing what you expect) with a <a href="/denisonlab/files/2021/07/Denison_et_al_2011_FrontHumNeurosci.pdf" target="_blank" rel="noopener noreferrer">rotation sequence</a> and with <a href="/denisonlab/files/2021/07/PiazzaDenisonSilver_2018_JOV.pdf" target="_blank" rel="noopener noreferrer">newly learned audio-visual pairings</a>, but surprise effects (seeing the unexpected) with <a href="/denisonlab/files/2021/07/Denison_et_al_2016_JOV.pdf" target="_blank" rel="noopener noreferrer">newly learned natural image sequences</a>. In this line of research, we aim to understand when, how, and why perception prioritizes the expected versus the surprising.</p>
</div>
</div>
<hr>
<div class="contentbox">
<div class="column sizeprojectimg">
<img src="/denisonlab/files/2021/08/jumpingpen.png">
</div>
<div class="column sizeprojecttext">
<h4>Perceptual selection of illusory content</h4>
<p>We developed the <b>Jumping Pen Illusion</b>, which you can see for yourself with nothing more than a strip of paper and a pen. In the illusion, a pen held behind a strip of paper appears to jump in front of the strip! In lab experiments, this illusion helped us learn that <a href="/denisonlab/files/2021/07/Chen_et_al_2017_JOV.pdf" target="_blank" rel="noopener noreferrer">classic dynamics of perceptual competition occur even between illusory percepts</a> – here, percepts filled in across the retinal blind spot. Surprisingly, the <a href="/denisonlab/files/2021/07/ChenDenisonWhitneyMaus_2018_SciRep.pdf" target="_blank" rel="noopener noreferrer">filled-in percepts can affect depth perception</a> of other objects in the scene that have unambiguous, disparity-defined depth.</p>
<p><a href="/denisonlab/files/2021/07/Jumping_Pen_Instructions.pdf" target="_blank" rel="noopener noreferrer">Demo instructions</a> are available! If you try it in your classroom, we'd love to hear from you.</p>
</div>
</div>
<hr>