Towards a whole body brain-machine interface system for decoding expressive movement intent: Challenges and Opportunities

J. Contreras-Vidal, Jesus G. Cruz-Garza, Anastasiya E. Kopteva

2017 5th International Winter Conference on Brain-Computer Interface (BCI), published 2017-02-16. DOI: 10.1109/IWW-BCI.2017.7858142
The restoration and rehabilitation of human bipedal locomotion represent major goals for brain-machine interfaces (BMIs), i.e., devices that translate neural activity into motor commands to control wearable robots, enabling locomotive and non-locomotive tasks by individuals with gait disabilities. Prior BMI efforts based on scalp electroencephalography (EEG) have revealed that fluctuations in the amplitude of slow cortical potentials in the delta band contain information that can be used to infer motor intent, and more specifically, the kinematics of walking and of non-locomotive tasks such as sitting and standing. However, little is known about the extent to which EEG can be used to discern the expressive qualities that influence such functional movements. Here, we discuss how novel experimental approaches integrated with machine learning techniques can be deployed to decode expressive qualities of movement. Applications to artistic brain-computer interfaces (BCIs), movement aesthetics, and gait neuroprostheses endowed with expressive qualities are discussed.
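The decoding approach the abstract describes, inferring movement kinematics from delta-band EEG amplitude fluctuations with a learned model, can be illustrated with a minimal sketch. This is not the authors' pipeline; the sampling rate, channel count, synthetic signals, and least-squares decoder below are all illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch of delta-band EEG kinematics decoding (assumed
# parameters throughout; not the pipeline from the paper).
rng = np.random.default_rng(0)
fs = 100                      # assumed sampling rate, Hz
t = np.arange(0, 60, 1 / fs)

# Synthetic "kinematics": a slow gait-like oscillation (~1 Hz, delta range).
kin = np.sin(2 * np.pi * 1.0 * t)

# Synthetic 8-channel "EEG": each channel linearly mixes the slow signal
# with broadband noise.
n_ch = 8
mix = rng.normal(size=n_ch)
eeg = np.outer(mix, kin) + 0.5 * rng.normal(size=(n_ch, t.size))

def lowpass(x, cutoff, fs, ntaps=201):
    """Windowed-sinc FIR low-pass; isolates the slow (delta) component."""
    n = np.arange(ntaps) - (ntaps - 1) / 2
    h = np.sinc(2 * cutoff / fs * n) * np.hamming(ntaps)
    h /= h.sum()
    return np.convolve(x, h, mode="same")

# Keep only the delta band (here: below a 4 Hz cutoff) on every channel.
delta = np.array([lowpass(ch, 4.0, fs) for ch in eeg])

# Linear decoder: least-squares map from delta-band channels to kinematics.
X = delta.T                                   # (samples, channels)
w, *_ = np.linalg.lstsq(X, kin, rcond=None)
pred = X @ w
r = np.corrcoef(pred, kin)[0, 1]
print(f"decoding correlation r = {r:.2f}")
```

Because the slow signal is embedded linearly in the synthetic channels, the decoder recovers it well; real EEG decoding of expressive qualities would face far lower signal-to-noise ratios and nonstationarity.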