The Unlock Project: BCIs for Locked-In Syndrome
The Unlock Project was a research initiative involving researchers from Boston University, Northeastern University, the Massachusetts Institute of Technology, and the University of Kansas whose purpose was to develop brain-computer interface (BCI) technology for individuals suffering from locked-in syndrome (LIS), a condition characterized by complete or near-complete loss of voluntary motor function with intact sensation and cognition. LIS typically results from brain stem stroke or late-stage amyotrophic lateral sclerosis (ALS, also known as Lou Gehrig’s disease). Being locked in has been compared to being buried alive; sufferers of LIS often feel completely isolated from friends and family because they are unable to communicate. A video from Science Nation describing locked-in syndrome and the Unlock Project is included below.
In addition to generating a number of important BCI research contributions (see Publications below), Unlock Project programmers developed a Python-based modular software framework for building “apps” that can be controlled by a wide range of hardware input devices, including commercially available electroencephalography (EEG) and electromyography (EMG) systems. This software package is freely available through GitHub. Further information about the Unlock Project software framework is available in this document.
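To make the modular design concrete, the sketch below shows one way such a framework can decouple signal decoding from the "app" being controlled, so the same app runs unchanged whether the commands come from an EEG classifier, an EMG threshold detector, or a keyboard fallback. All class and method names here are illustrative assumptions, not the actual Unlock Project API.

```python
# A minimal sketch of a modular BCI "app" architecture: the decoder and the
# app communicate only through discrete command strings, so input devices are
# interchangeable. Names (Decoder, GridSpellerApp, etc.) are hypothetical.
from abc import ABC, abstractmethod


class Decoder(ABC):
    """Turns a window of raw samples from an input device into a command."""

    @abstractmethod
    def decode(self, samples):
        """Return a command string such as 'select', or None for no event."""


class ThresholdDecoder(Decoder):
    """Toy stand-in for an SSVEP/EMG classifier: fires 'select' when the
    mean amplitude of the window crosses a fixed threshold."""

    def __init__(self, threshold=1.0):
        self.threshold = threshold

    def decode(self, samples):
        if not samples:
            return None
        mean = sum(samples) / len(samples)
        return "select" if mean >= self.threshold else None


class GridSpellerApp:
    """Minimal communication app: steps through letters on every non-select
    event and emits the current letter on 'select'."""

    def __init__(self, letters="ABC"):
        self.letters = letters
        self.cursor = 0
        self.output = []

    def handle(self, command):
        if command == "select":
            self.output.append(self.letters[self.cursor])
        else:
            self.cursor = (self.cursor + 1) % len(self.letters)


def run(app, decoder, sample_windows):
    """Event loop: decode each sample window and route the result to the app."""
    for window in sample_windows:
        app.handle(decoder.decode(window))
    return "".join(app.output)
```

With this split, swapping EEG for EMG means replacing only the `Decoder`; for example, `run(GridSpellerApp(), ThresholdDecoder(0.5), [[0.1, 0.1], [0.9, 0.9]])` advances the cursor on the sub-threshold window and then selects the letter under it.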
The Unlock Project was funded by CELEST, an NSF Science of Learning Center (NSF SMA-0835976), and ran from 2009 to 2016.
Publications
- Jia, N., Brincat, S.L., Salazar-Gómez, A.F., Panko, M., Guenther, F.H., and Miller, E. (under review). Decoding of intended saccade direction in an oculomotor brain-computer interface.
- Cler, M.J., Nieto-Castanon, A., Guenther, F.H., Fager, S., and Stepp, C.E. (2016). Surface electromyographic control of a novel phonemic interface for speech synthesis. Augmentative and Alternative Communication, 32, pp. 120-130. NIHMSID: NIHMS802522
- Sourati, J., Akcakaya, M., Dy, J.G., Leen, T.K., and Erdogmus, D. (2016). Classification active learning based on mutual information. Entropy, 18, article 51.
- Galbraith, B.V. (2016). A brain-machine interface for assistive robotic control. Boston University Thesis. Boston, MA: Boston University.
- Galbraith, B.V., Guenther, F.H., and Versace, M. (2015). A neural network-based exploratory learning and motor planning system for co-robots. Frontiers in Neurorobotics, 9, article 7.
- Akcakaya, M., Peters, B., Moghadamfalahi, M., Mooney, A., Oken, B., Erdogmus, D., and Fried-Oken, M. (2014). Noninvasive brain-computer interfaces for augmentative and alternative communication. IEEE Reviews in Biomedical Engineering, 7, pp. 31-49.
- Ahani, A., Wiegand, K., Orhan, U., Akcakaya, M., Moghadamfalahi, M., Nezamfar, H., Patel, R., and Erdogmus, D. (2014). RSVP IconMessenger: Icon-based brain-interfaced alternative and augmentative communication. Brain-Computer Interfaces, 1, pp. 192-203.
- Smith, D.J., Varghese, L.A., Stepp, C.E., and Guenther, F.H. (2014). Comparison of steady-state visual and somatosensory evoked potentials for brain-computer interface control. Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, pp. 1234-1237.
- Brincat, S., Salazar-Gomez, A.F., Jia, N., Panko, M., Miller, E.K., and Guenther, F.H. (2013). Development of an intracortical eye movement-based brain-computer interface. Proceedings of the 2013 International BCI Meeting, Pacific Grove, CA.
- Brincat, S., Jia, N., Salazar-Gomez, A.F., Panko, M., Miller, E.K., and Guenther, F.H. (2013). Which neural signals are optimal for brain-computer interface control? Proceedings of the 2013 International BCI Meeting, Pacific Grove, CA.
- Lorenz, S. (2013). Development of a practical and mobile brain-computer communication device for profoundly paralyzed individuals. Boston University Thesis. Boston, MA: Boston University.
- Brumberg, J.S., Lorenz, S.D., Galbraith, B.V., and Guenther, F.H. (2012). The Unlock Project: A Python-based framework for practical brain-computer interface communication “app” development. 34th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, San Diego, CA. PMCID: PMC3637898
- Nezamfar, H., Orhan, U., Purwar, S., Hild, K., Oken, B., and Erdogmus, D. (2011). Decoding of multichannel EEG activity from the visual cortex in response to pseudorandom binary sequences of visual stimuli. International Journal of Imaging Systems and Technology, 21, pp. 139-147.
- Brumberg, J.S., and Guenther, F.H. (2010). Development of speech prostheses: current status and recent advances. Expert Review of Medical Devices, 7 (5), pp. 667-679.
- Guenther, F.H., Brumberg, J.S., Wright, E.J., Nieto-Castanon, A., Tourville, J.A., Panko, M., Law, R., Siebert, S.A., Bartels, J.L., Andreasen, D.S., Ehirim, P., Mao, H., and Kennedy, P.R. (2009). A wireless brain-machine interface for real-time speech synthesis. PLoS ONE, 4 (12), e8218. PMCID: PMC2784218