Low-intensity focused ultrasound stimulation (LIFUS) has proved effective in eliciting vibrotactile sensations, in addition to warm, cold, and nociceptive pain sensations, when applied to human peripheral nerve endings. However, whether it can evoke fine tactile sensations has so far been rarely investigated, despite the importance of fine tactile feedback in motor control. To explore this issue, 14 healthy volunteers were recruited in this study. A psychophysical experiment was first conducted to determine the appropriate range of pulse repetition frequency (PRF) and acoustic intensity (AI). Participants were then asked to perceive and discriminate different tactile stimulations under LIFUS, to evaluate whether multiple fine tactile sensations could be reliably elicited by modulating the PRF and AI. For objective assessment, the local blood perfusion volume (BPV) response beneath the stimulated fingertip was recorded and characterized. Our results showed that four types of tactile sensations (tapping, vibrating, electrical, and pressure) could be reliably elicited by modulating the PRF and AI within a specific range, and that PRF and AI had a significant impact on both participants' tactile discrimination and the amplitude features of the BPV response. This study should facilitate the application of LIFUS to human-machine interaction scenarios and sheds valuable insight into the physiological mechanisms of peripherally applied ultrasound stimulation.
Low-Intensity Focused Ultrasound Stimulation on Fingertip Can Evoke Fine Tactile Sensations and Different Local Hemodynamic Responses
Liuni Qin;Mingyang Dou;Lili Niu;Laixin Huang;Fei Li;Shichun Bao;Xinping Deng;Guanglin Li;Yanjuan Geng
IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 32, pp. 4086-4097
Pub Date: 2024-11-18 | DOI: 10.1109/TNSRE.2024.3493925
Pub Date: 2024-11-18 | DOI: 10.1109/TNSRE.2024.3500217
Kirill Kokorin;Syeda R. Zehra;Jing Mu;Peter Yoo;David B. Grayden;Sam E. John
Noninvasive augmented-reality (AR) brain-computer interfaces (BCIs) that use steady-state visually evoked potentials (SSVEPs) typically adopt a fully autonomous goal-selection framework to control a robot, where automation compensates for the low information transfer rate of the BCI. This scheme improves task performance, but users may prefer direct control (DC) of robot motion. To provide users with a balance of autonomous assistance and manual control, we developed a shared control (SC) system for continuous control of robot translation using an SSVEP AR-BCI, which we tested in a 3D reaching task. The SC system used the BCI input and robot sensor data to continuously predict which object the user wanted to reach, generated an assistance signal, and regulated the level of assistance based on prediction confidence. Eighteen healthy participants took part in our study and each completed 24 reaching trials using DC and SC. Compared to DC, SC significantly improved (paired two-tailed t-test, Holm-corrected $\alpha < 0.05$
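The shared-control scheme described above regulates assistance by prediction confidence. A common way to realize this is a linear blend between the user's command and the autonomous assistance command, weighted by confidence; the sketch below assumes this simple blending rule as an illustration (the paper's exact arbitration law is not given in the abstract, and the function name and signature are hypothetical).

```python
import numpy as np

def blend_control(user_cmd, assist_cmd, confidence):
    """Confidence-weighted shared control of a translation command.

    user_cmd:   3D velocity command decoded from the BCI (direct control)
    assist_cmd: 3D velocity command toward the predicted goal object
    confidence: goal-prediction confidence in [0, 1];
                0 -> pure direct control, 1 -> full assistance
    """
    c = float(np.clip(confidence, 0.0, 1.0))
    return (1.0 - c) * np.asarray(user_cmd, float) + c * np.asarray(assist_cmd, float)

# With 25% confidence, the blended command stays close to the user's input
cmd = blend_control([1.0, 0.0, 0.0], [0.0, 1.0, 0.0], 0.25)
```

Because the weight varies continuously with confidence, assistance ramps up smoothly as the goal predictor becomes more certain, rather than switching abruptly between manual and autonomous modes.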