Semi-Autonomous Continuous Robotic Arm Control Using an Augmented Reality Brain-Computer Interface

Kirill Kokorin; Syeda R. Zehra; Jing Mu; Peter Yoo; David B. Grayden; Sam E. John

IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 32, pp. 4098-4108. Published 2024-11-18. DOI: 10.1109/TNSRE.2024.3500217. PDF: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10755142
Noninvasive augmented-reality (AR) brain-computer interfaces (BCIs) that use steady-state visually evoked potentials (SSVEPs) typically adopt a fully autonomous goal-selection framework to control a robot, where automation compensates for the low information transfer rate of the BCI. This scheme improves task performance, but users may prefer direct control (DC) of robot motion. To provide users with a balance of autonomous assistance and manual control, we developed a shared control (SC) system for continuous control of robot translation using an SSVEP AR-BCI, which we tested in a 3D reaching task. The SC system used the BCI input and robot sensor data to continuously predict which object the user wanted to reach, generated an assistance signal, and regulated the level of assistance based on prediction confidence. Eighteen healthy participants took part in our study and each completed 24 reaching trials using DC and SC. Compared to DC, SC significantly improved (paired two-tailed t-test, Holm-corrected α < 0.05) mean task success rate (p < 0.0001, μ = 36.1%, 95% CI [25.3%, 46.9%]), normalised reaching trajectory length (p < 0.0001, μ = −26.8%, 95% CI [−36.0%, −17.7%]), and participant workload (p = 0.02, μ = −11.6, 95% CI [−21.1, −2.0]) measured with the NASA Task Load Index. Therefore, users of SC can control the robot effectively while experiencing increased agency. Our system can personalise assistive technology by providing users with the ability to select their preferred level of autonomous assistance.
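The abstract describes the SC loop at a high level: predict the intended goal, generate an assistance signal, and weight that assistance by prediction confidence. A minimal sketch of this kind of confidence-regulated blending is shown below; the goal-scoring rule, the softmax confidence, and the linear blending law are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def shared_control_step(user_cmd, robot_pos, goals, confidence_gain=1.0):
    """Blend a decoded user velocity command with autonomous assistance.

    Illustrative sketch: the assistance vector points at the most likely
    goal, and its weight is the softmax prediction confidence. The paper's
    actual predictor and blending law may differ.
    """
    # Score each candidate goal by how well the user's command aligns
    # with the direction from the robot to that goal.
    dirs = goals - robot_pos
    dirs = dirs / np.linalg.norm(dirs, axis=1, keepdims=True)
    u = user_cmd / (np.linalg.norm(user_cmd) + 1e-9)
    scores = dirs @ u

    # Softmax over alignment scores gives a goal probability distribution.
    probs = np.exp(confidence_gain * scores)
    probs = probs / probs.sum()

    best = int(np.argmax(probs))
    conf = float(probs[best])  # prediction confidence in [0, 1]

    # Confidence-weighted blend of the manual command and the autonomous
    # assistance signal (same speed, aimed at the predicted goal).
    v = (1.0 - conf) * user_cmd + conf * np.linalg.norm(user_cmd) * dirs[best]
    return v, best, conf
```

With this blending law, low-confidence predictions leave the user's command nearly untouched, while high-confidence predictions steer the robot toward the predicted object, which matches the paper's stated aim of balancing manual control and autonomous assistance.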
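The three outcome measures were compared with paired t-tests under a Holm correction at family-wise α < 0.05. A minimal sketch of the Holm step-down procedure, applied here to illustrative p-values in the range the abstract reports (treating "p < 0.0001" as 0.0001 for the example):

```python
def holm_correct(p_values, alpha=0.05):
    """Holm-Bonferroni step-down procedure.

    Returns a list of booleans indicating whether each hypothesis is
    rejected while controlling the family-wise error rate at alpha.
    """
    m = len(p_values)
    # Test p-values from smallest to largest against shrinking thresholds.
    order = sorted(range(m), key=lambda i: p_values[i])
    reject = [False] * m
    for rank, i in enumerate(order):
        if p_values[i] <= alpha / (m - rank):
            reject[i] = True
        else:
            break  # step-down: once one test fails, all larger p-values fail
    return reject

# Illustrative: the three comparisons reported in the abstract all survive
# the correction (thresholds 0.05/3, 0.05/2, 0.05/1 in sorted order).
print(holm_correct([0.0001, 0.0001, 0.02]))
```

The step-down structure is what makes Holm uniformly more powerful than a plain Bonferroni correction while still controlling the family-wise error rate.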
Journal Description:
IEEE Transactions on Neural Systems and Rehabilitation Engineering covers rehabilitative and neural aspects of biomedical engineering, including functional electrical stimulation, acoustic dynamics, human performance measurement and analysis, nerve stimulation, electromyography, motor control and stimulation, and hardware and software applications for rehabilitation engineering and assistive devices.