{"id":1507,"date":"2016-10-29T19:22:23","date_gmt":"2016-10-29T23:22:23","guid":{"rendered":"https:\/\/sites.bu.edu\/guentherlab\/?page_id=1507"},"modified":"2016-10-29T21:15:12","modified_gmt":"2016-10-30T01:15:12","slug":"the-first-speech-brain-computer-interface","status":"publish","type":"page","link":"https:\/\/sites.bu.edu\/guentherlab\/miscellaneous-videos-and-oddities\/the-first-speech-brain-computer-interface\/","title":{"rendered":"The First Speech Brain-Computer Interface"},"content":{"rendered":"<p>In a collaboration with Dr. Philip Kennedy and colleagues at <a href=\"http:\/\/www.neuralsignals.com\/\" target=\"_blank\">Neural Signals, Inc.<\/a>, we developed the first brain-computer interface (BCI) capable of translating neural signals from speech motor cortex into synthetic speech output in real time. The video below shows the participant (second from right, facing the camera), who suffers from complete paralysis due to\u00a0<a href=\"https:\/\/en.wikipedia.org\/wiki\/Locked-in_syndrome\" target=\"_blank\">locked-in syndrome<\/a>, producing vowels with the BCI. The computer first says &#8220;listen&#8221;, then provides the participant with an example of the vowel sound to produce. Then the computer says &#8220;speak&#8221;, and the locked-in participant generates the vowel using the BCI. The computer screen at the bottom of the video indicates the target vowel in green, and the participant&#8217;s production is indicated by the moving cursor. A bell rings three times when the vowel target is reached.
Further details about this project and subsequent research\u00a0are available on our <a href=\"https:\/\/sites.bu.edu\/guentherlab\/research-projects\/neural-prosthetics-for-speech-restoration\/\" target=\"_blank\">Neural Prosthetics for Speech Restoration<\/a> page.<\/p>\n<div style=\"width: 640px;\" class=\"wp-video\"><!--[if lt IE 9]><script>document.createElement('video');<\/script><![endif]-->\n<video class=\"wp-video-shortcode\" id=\"video-1507-1\" width=\"640\" height=\"480\" preload=\"metadata\" controls=\"controls\"><source type=\"video\/mp4\" src=\"\/guentherlab\/files\/2016\/10\/0528_Full_NEW.m4v?_=1\" \/><a href=\"\/guentherlab\/files\/2016\/10\/0528_Full_NEW.m4v\">\/guentherlab\/files\/2016\/10\/0528_Full_NEW.m4v<\/a><\/video><\/div>\n<p>&nbsp;<\/p>\n","protected":false},"excerpt":{"rendered":"<p>In a collaboration with Dr. Philip Kennedy and colleagues at Neural Signals, Inc., we developed the first brain-computer interface (BCI) capable of translating neural signals from speech motor cortex into synthetic speech output in real time. 
The video below shows the participant (second from right, facing the camera), who suffers from complete paralysis due to\u00a0locked-in [&hellip;]<\/p>\n","protected":false},"author":3849,"featured_media":0,"parent":1408,"menu_order":11,"comment_status":"closed","ping_status":"closed","template":"","meta":[],"_links":{"self":[{"href":"https:\/\/sites.bu.edu\/guentherlab\/wp-json\/wp\/v2\/pages\/1507"}],"collection":[{"href":"https:\/\/sites.bu.edu\/guentherlab\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/sites.bu.edu\/guentherlab\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/sites.bu.edu\/guentherlab\/wp-json\/wp\/v2\/users\/3849"}],"replies":[{"embeddable":true,"href":"https:\/\/sites.bu.edu\/guentherlab\/wp-json\/wp\/v2\/comments?post=1507"}],"version-history":[{"count":9,"href":"https:\/\/sites.bu.edu\/guentherlab\/wp-json\/wp\/v2\/pages\/1507\/revisions"}],"predecessor-version":[{"id":1510,"href":"https:\/\/sites.bu.edu\/guentherlab\/wp-json\/wp\/v2\/pages\/1507\/revisions\/1510"}],"up":[{"embeddable":true,"href":"https:\/\/sites.bu.edu\/guentherlab\/wp-json\/wp\/v2\/pages\/1408"}],"wp:attachment":[{"href":"https:\/\/sites.bu.edu\/guentherlab\/wp-json\/wp\/v2\/media?parent=1507"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}