Brain-Machine Interfaces for Robotic Control
In a collaboration between the Guenther lab and the Neuromorphics Lab at Boston University, we developed an electroencephalography (EEG)-based brain-machine interface (BMI) for controlling an adaptive mobile agent consisting of an iRobot Create enhanced with a rotatable camera and a robotic arm. Using EEG signals, the user can navigate the mobile robot to a desired location in a room, orient the camera to fixate on a target object, and pick up the attended object with the robotic arm. The video below, created by Sean Lorenz, illustrates the navigation component of this system.
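To give a flavor of how decoded EEG commands might drive such a robot, here is a minimal Python sketch. It assumes an upstream classifier that emits discrete command labels; the label set, velocities, and mapping below are hypothetical illustrations, not the lab's actual implementation.

```python
# Illustrative sketch only: maps hypothetical decoded EEG command labels
# to drive commands for a differential-drive robot such as the iRobot
# Create. The decoder and the specific labels/velocities are stand-ins.

# (linear velocity in m/s, angular velocity in rad/s) per decoded command
COMMAND_MAP: dict[str, tuple[float, float]] = {
    "forward": (0.2, 0.0),
    "left":    (0.0, 0.5),
    "right":   (0.0, -0.5),
    "stop":    (0.0, 0.0),
}

def command_to_velocities(label: str) -> tuple[float, float]:
    """Translate a decoded EEG class label into robot velocities.

    Unrecognized labels default to 'stop', so a misclassification
    never leaves the robot driving blindly.
    """
    return COMMAND_MAP.get(label, COMMAND_MAP["stop"])

if __name__ == "__main__":
    for label in ["forward", "left", "unknown"]:
        lin, ang = command_to_velocities(label)
        print(f"{label!r} -> linear={lin} m/s, angular={ang} rad/s")
```

Defaulting unknown labels to a stop command is one common safety choice in BMI navigation, since decoder output is inherently noisy.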
In a second collaboration, with Dr. Daniela Rus and colleagues at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), we are developing a BMI that uses error-related potentials (ErrPs), detected with EEG in a human observer, to correct the movements of a robotic arm. The video below, created by CSAIL, illustrates this system, which has received widespread coverage in the popular press.
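The sketch below illustrates the general shape of such a closed loop for a binary-choice task: the robot commits to one of two targets, an ErrP classifier examines the EEG epoch time-locked to that movement, and a detected error triggers a switch to the other target. The classifier, epoch handling, and target names are hypothetical stand-ins, not the CSAIL/BU system itself.

```python
# Illustrative sketch only: ErrP-based correction in a binary-choice task.
# detect_errp() stands in for a trained classifier; here it just simulates
# occasional error detections so the loop is runnable.

import random

def detect_errp(eeg_epoch) -> bool:
    """Stand-in for a trained ErrP classifier applied to an EEG epoch
    time-locked to the robot's movement onset. Returns True when the
    observer's brain response indicates the robot chose wrongly."""
    return random.random() < 0.2  # placeholder for real classification

def other_target(current: str, targets: tuple[str, str]) -> str:
    """With only two targets, an ErrP implies the alternative."""
    return targets[1] if current == targets[0] else targets[0]

def control_loop(targets=("left_bin", "right_bin"), trials=5):
    for trial in range(trials):
        choice = random.choice(targets)   # robot's initial reach decision
        print(f"trial {trial}: robot reaches toward {choice}")
        eeg_epoch = None                  # a real system would pass EEG here
        if detect_errp(eeg_epoch):
            choice = other_target(choice, targets)
            print(f"  ErrP detected -> corrected reach to {choice}")

if __name__ == "__main__":
    control_loop()
```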
Publications
- Galbraith, B.V., Guenther, F.H., and Versace, M. (2015). A neural network-based exploratory learning and motor planning system for co-robots. Frontiers in Neurorobotics, 9, article 7.
- Brumberg, J.S., Lorenz, S.D., Galbraith, B.V., and Guenther, F.H. (2012). The Unlock Project: A Python-based framework for practical brain-computer interface communication “app” development. 34th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, San Diego, CA. PMCID: PMC3637898
- Galbraith, B.V. (2016). A brain-machine interface for assistive robotic control. Doctoral dissertation, Boston University, Boston, MA.