Current Projects

What does joint attention look like in interactions between deaf children and their parents?

Joint attention refers to moments in an interaction when a child and their parent or caregiver are attending to the same thing at the same time. In spoken language, joint attention occurs when parents and children look at an object together while parents label or talk about it. How does this multimodal process adapt when all input (language and information about the world) is perceived visually? We examine interactions between deaf children and their deaf and hearing caregivers to see how joint visual attention is achieved. We do this by recording parents and children during naturalistic play, and then coding these interactions to understand how eye gaze, object handling, and language input are coordinated.


How do deaf children learn new ASL signs?

Early childhood is a time of rapid word learning. How do children map new labels to new objects? We study the process of word learning in ASL. In particular, we investigate how deaf children learn to manage their eye gaze and visual attention so that they can connect language and objects. Our studies use eye-tracking technology, which allows us to monitor children’s gaze as they perceive signs, pictures, and videos on a computer. We also record parents and children interacting with novel objects to determine how new labels are introduced during naturalistic play.


How do we know if deaf children are reaching their language milestones?

Vocabulary development in early childhood is an important predictor of later language outcomes. Yet few measures exist to track the development of ASL in very young children. With our collaborators Dr. Naomi Caselli and Dr. Jennie Pyers, we are developing measures of productive and receptive language for use with infants, toddlers, and children learning ASL.


How do adult ASL-signers process language?

We study ASL production, comprehension, and processing in deaf adults from a range of backgrounds. We are interested in how adults process ASL as they perceive signs; how ASL phonology and semantics influence comprehension; and how signers choose to express various concepts in ASL. We use a range of approaches, primarily eye-tracking, to understand adult ASL processing.