Current Grant Support
- ONR MURI FOA N00014-18-S-F006: Neuro-Autonomy: Neuroscience-Inspired Perception, Navigation, and Spatial Awareness for Autonomous Robots
- NSF 1829398: Functional Organization of Navigational Coding in the Human Brain
- ONR MURI N00014-16-1-2832: Neural circuits underlying symbolic processing in primate cortex and basal ganglia
- ONR DURIP N00014-17-1-2304: High Performance Computing Cluster for Cognitive Neuroscience Analysis and Modeling
- NSF 1625552: MRI: Acquisition of a 3-Tesla Magnetic Resonance Imaging (MRI) Scanner for Cognitive and Systems Neuroscience
The ONR MURI provides funding to explore how humans and animals process sensory stimuli to engage in goal-directed navigation. These biological insights are applied to algorithmic methods that enable autonomous navigation for robots. The implementation is agnostic to specific robot design, enabling validation studies with ground, aerial, and underwater autonomous robots using experimental testbeds at BU, MIT, and Australian institutions. Prof. Yannis Paschalidis (Systems Engineering, CSN) leads the interdisciplinary research team, which includes Prof. John Baillieul (Systems Engineering, CSN), Prof. Margrit Betke (Computer Science, CSN), Prof. Michael Hasselmo (Psychological and Brain Sciences, CSN), Prof. John Leonard (Engineering, MIT), Prof. Nicholas Roy (Aeronautics, MIT), and Prof. Roberto Tron (Systems Engineering).
With funding from NSF, the researchers will study how the human brain represents the dimensions needed for finding our way around in the world. The interdisciplinary approach combines cognitive and visual neuroscience methods, aimed at understanding how the human brain processes spatial navigation information. The studies incorporate behavioral, cognitive, and neuroimaging techniques to examine how the human brain codes for distance, heading direction, speed, and time, which may contribute to higher-level navigation mechanisms such as planning a route to a known destination or finding one’s way home. The results have the potential to impact other fields, including robotics and the spatial sciences. In robotics, autonomous systems have difficulty determining whether they have successfully returned to their origin after an outbound journey, which robotics researchers call the loop closure problem. In contrast, humans and animals solve this problem readily. Understanding how visual information is used to localize and orient could facilitate innovation in mobile robots and self-driving cars, or training for more efficient navigation in humans. Greater knowledge of the basic properties of human navigation could also lead to improved electronic navigation systems, emergency response training, and more effective transportation signage.
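To make the loop closure problem concrete, here is a minimal, hypothetical sketch: an agent dead-reckons its position by integrating noisy self-motion estimates (heading and speed), then checks whether it has returned to its origin. The function names, noise level, and tolerance are illustrative assumptions, not part of the funded work.

```python
import math
import random

def integrate_path(steps, heading_noise=0.05):
    """Integrate (heading_in_radians, speed) steps into an (x, y) position."""
    x = y = 0.0
    for heading, speed in steps:
        noisy_heading = heading + random.gauss(0.0, heading_noise)
        x += speed * math.cos(noisy_heading)
        y += speed * math.sin(noisy_heading)
    return x, y

def closed_loop(position, tolerance=0.5):
    """Declare loop closure if the integrated position lies near the origin."""
    return math.hypot(*position) < tolerance

# A unit square traversed outbound and back: with no heading noise the loop
# closes exactly; with noise, dead-reckoning error accumulates step by step.
square = [(0.0, 1.0), (math.pi / 2, 1.0), (math.pi, 1.0), (3 * math.pi / 2, 1.0)]
print(closed_loop(integrate_path(square, heading_noise=0.0)))  # True
```

The difficulty for real robots lies in deciding "near the origin" under accumulated sensor error, which is why purely odometric loop closure fails and visual localization matters.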
The scientific goals harness the strengths of cognitive neuroscience, visual neuroscience, and the spatial sciences to examine navigation in humans. While much is known about the navigation system in rodents, rats and primates have fundamentally different visual systems. Contributions from the visual system provide critical information for self-motion guided navigation, and the theoretical basis for this proposal stems from computational models positing that perceptual information, including optic flow, speed, and direction signals, is necessary for successful navigation. The researchers propose a framework in which spatial representations transform from a retinotopic to a spatiotopic organization. This framework yields testable hypotheses about the nature of self-motion guided navigational representations in the brain. A series of experiments will examine how the human brain codes lower-level representations, such as distance, heading direction, speed, and time, which may serve as basis functions for generating higher-level navigational representations. The studies will examine how these selective properties are spatially organized in the brain, as well as the higher-level computations that bring this information together to compute path integration. To do so, the proposed studies employ innovative functional MRI paradigms adapted from visual neuroscience, including population receptive-field mapping, phase-encoded analyses, and model-based time-series analyses. The proposed work is critical for extending computational models of navigation to the systems level in humans.
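The population receptive-field (pRF) approach mentioned above can be illustrated with a minimal sketch: each voxel's response is modeled as the overlap between a stimulus aperture and an isotropic 2D Gaussian in visual-field coordinates. The grid size, bar-sweep stimulus, and pRF parameters below are illustrative assumptions, not the study's actual design.

```python
import numpy as np

def gaussian_prf(x0, y0, sigma, grid):
    """2D Gaussian receptive field centred at (x0, y0) on a coordinate grid."""
    xx, yy = grid
    return np.exp(-((xx - x0) ** 2 + (yy - y0) ** 2) / (2 * sigma ** 2))

def predicted_response(prf, apertures):
    """Predicted response per time point: stimulus/pRF overlap, summed."""
    return np.array([np.sum(prf * ap) for ap in apertures])

# Visual field sampled on a 64x64 grid spanning +/-10 degrees.
coords = np.linspace(-10, 10, 64)
grid = np.meshgrid(coords, coords)

# A vertical bar (2 deg wide) sweeping left to right, one frame per time point.
apertures = [(np.abs(grid[0] - x_pos) < 1.0).astype(float)
             for x_pos in np.linspace(-10, 10, 16)]

# A hypothetical voxel with its pRF centred at (3, 0) degrees.
prf = gaussian_prf(x0=3.0, y0=0.0, sigma=2.0, grid=grid)
ts = predicted_response(prf, apertures)
# The predicted time series peaks as the bar crosses the pRF centre near x = 3.
```

In actual pRF mapping, such predicted time series are convolved with a hemodynamic response function and the Gaussian parameters are fit per voxel to the measured fMRI signal.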
The ONR MURI combines a team of researchers from Boston University, the Massachusetts Institute of Technology, and Brown University. The MURI includes neurocomputational modeling and neurophysiological analysis of neural circuit mechanisms for symbolic processing in monkeys and humans across several tasks:
1. Hierarchical rule learning involving location cues that code the application and reversal of rules based on associations between specific visual cues.
2. Hierarchical rule learning guided by cues regulating the focus on different dimensions of stimuli.
3. Tasks requiring temporal translation to make responses based on the timing of individual stimuli.
4. Tasks involving the application of rules in a variant of Raven’s progressive matrices.
This research is performed under the direction of Principal Investigator Prof. Michael Hasselmo, who is overseeing development of computational models of neural circuits involved in symbolic processing. Development of different models of symbolic processing is also performed by Prof. Marc Howard at Boston University and by consultant Prof. Chris Eliasmith. Modeling predictions are tested in experiments using multiple single-neuron recording and field potential recording in monkeys in the lab of Prof. Earl Miller in the Department of Brain and Cognitive Sciences at the Massachusetts Institute of Technology. In addition, predictions of these models are tested in neuroimaging experiments in the laboratories of Prof. David Badre in the Department of Psychology at Brown University and Prof. Chantal Stern in the Center for Systems Neuroscience at Boston University.
This grant provides funding for a high-performance computing cluster for cognitive neuroscience data analysis and modeling. New compute nodes with priority access for CSN/CNC members were installed at the Massachusetts Green High Performance Computing Center through the BU research computing buy-in program. The additional compute nodes enhance the quality and speed of data analysis and modeling work, especially work that examines symbolic processing. Prof. Chantal Stern (Psychological and Brain Sciences, CSN) is the principal investigator, with co-investigators Prof. Michael Hasselmo (Psychological and Brain Sciences, CSN), Prof. Marc Howard (Psychological and Brain Sciences, CSN), Prof. Earl Miller (MIT), and Prof. David Badre (Brown).
This grant provides funding for a Siemens 3T MAGNETOM Prisma MRI scanner for human structural and functional neuroimaging research in the Cognitive Neuroimaging Center (CNC) in the new Kilachand Center for Integrated Life Sciences and Engineering. Development of the research facilities within the Kilachand Center, and the CNC specifically, has provided Boston University neuroscience researchers with a state-of-the-art shared core facility, with the goal of developing an understanding of brain function that bridges multiple scales: from the cellular level, to the systems level, to human cognition and behavior. The new MRI scanner within the multidisciplinary center fosters neuroscience research that crosses the traditional human/animal divide, allowing the CNC to develop models that link the understanding of neural circuits to cognition and behavior.
Previous Grant Support
- NIH/NIMH Silvio O. Conte Center P50 MH094263: Prefrontal and Medial-Temporal Interactions in Memory
- NIH/NEI R01 EY022229: Human Fronto-Parietal Networks for Visual Attention and Memory
- NSF SBE-0354378: Science of Learning Center (CELEST)
- ONR MURI N00014-10-1-0936
- Effects of Parkinson’s Disease on Perception, Cognition, and Gait