{"title":"通过人体阻抗调制改善视觉-触觉感知","authors":"Xiaoxiao Cheng, Shixian Shen, Ekaterina Ivanova, Gerolamo Carboni, Atsushi Takagi, Etienne Burdet","doi":"arxiv-2409.06124","DOIUrl":null,"url":null,"abstract":"Humans activate muscles to shape the mechanical interaction with their\nenvironment, but can they harness this control mechanism to best sense the\nenvironment? We investigated how participants adapt their muscle activation to\nvisual and haptic information when tracking a randomly moving target with a\nrobotic interface. The results exhibit a differentiated effect of these sensory\nmodalities, where participants' muscle cocontraction increases with the haptic\nnoise and decreases with the visual noise, in apparent contradiction to\nprevious results. These results can be explained, and reconciled with previous\nfindings, when considering muscle spring like mechanics, where stiffness\nincreases with cocontraction to regulate motion guidance. Increasing\ncocontraction to more closely follow the motion plan favors accurate visual\nover haptic information, while decreasing it avoids injecting visual noise and\nrelies on accurate haptic information. We formulated this active sensing\nmechanism as the optimization of visuo-haptic information and effort. 
This OIE\nmodel can explain the adaptation of muscle activity to unimodal and multimodal\nsensory information when interacting with fixed or dynamic environments, or\nwith another human, and can be used to optimize human-robot interaction.","PeriodicalId":501541,"journal":{"name":"arXiv - CS - Human-Computer Interaction","volume":"10 5 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-09-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Human Impedance Modulation to Improve Visuo-Haptic Perception\",\"authors\":\"Xiaoxiao Cheng, Shixian Shen, Ekaterina Ivanova, Gerolamo Carboni, Atsushi Takagi, Etienne Burdet\",\"doi\":\"arxiv-2409.06124\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Humans activate muscles to shape the mechanical interaction with their\\nenvironment, but can they harness this control mechanism to best sense the\\nenvironment? We investigated how participants adapt their muscle activation to\\nvisual and haptic information when tracking a randomly moving target with a\\nrobotic interface. The results exhibit a differentiated effect of these sensory\\nmodalities, where participants' muscle cocontraction increases with the haptic\\nnoise and decreases with the visual noise, in apparent contradiction to\\nprevious results. These results can be explained, and reconciled with previous\\nfindings, when considering muscle spring like mechanics, where stiffness\\nincreases with cocontraction to regulate motion guidance. Increasing\\ncocontraction to more closely follow the motion plan favors accurate visual\\nover haptic information, while decreasing it avoids injecting visual noise and\\nrelies on accurate haptic information. We formulated this active sensing\\nmechanism as the optimization of visuo-haptic information and effort. 
This OIE\\nmodel can explain the adaptation of muscle activity to unimodal and multimodal\\nsensory information when interacting with fixed or dynamic environments, or\\nwith another human, and can be used to optimize human-robot interaction.\",\"PeriodicalId\":501541,\"journal\":{\"name\":\"arXiv - CS - Human-Computer Interaction\",\"volume\":\"10 5 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-09-10\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - CS - Human-Computer Interaction\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2409.06124\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Human-Computer Interaction","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.06124","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Human Impedance Modulation to Improve Visuo-Haptic Perception
Humans activate muscles to shape the mechanical interaction with their
environment, but can they harness this control mechanism to best sense the
environment? We investigated how participants adapt their muscle activation to
visual and haptic information when tracking a randomly moving target with a
robotic interface. The results exhibit a differentiated effect of these sensory
modalities, where participants' muscle cocontraction increases with the haptic
noise and decreases with the visual noise, in apparent contradiction to
previous results. These results can be explained, and reconciled with previous
findings, when considering the muscles' spring-like mechanics, where stiffness
increases with cocontraction to regulate motion guidance. Increasing
cocontraction to more closely follow the motion plan favors accurate visual
over haptic information, while decreasing it avoids injecting visual noise and
relies on accurate haptic information. We formulated this active sensing
mechanism as the optimization of visuo-haptic information and effort (OIE). This
model can explain the adaptation of muscle activity to unimodal and multimodal
sensory information when interacting with fixed or dynamic environments, or
with another human, and can be used to optimize human-robot interaction.
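The trade-off the abstract describes can be illustrated with a toy sketch (not the authors' actual OIE model): assume cocontraction u raises limb stiffness and thereby shifts reliance from the haptic channel (variance sigma_h^2) toward the internal, visually driven motion plan (variance sigma_v^2), at a quadratic effort cost. The weighting function w(u) = u/(u+1) and the effort weight alpha are hypothetical choices for illustration only. Minimizing variance plus effort then reproduces the reported pattern: optimal cocontraction rises with haptic noise and falls with visual noise.

```python
def optimal_cocontraction(sigma_v2, sigma_h2, alpha=0.01):
    """Grid-search the cocontraction level u that minimizes a toy
    information-plus-effort cost (a hypothetical stand-in for OIE).

    sigma_v2 / sigma_h2: variances of the visual and haptic channels.
    alpha: weight of the quadratic effort penalty (assumed value).
    """
    best_u, best_cost = 0.0, float("inf")
    for i in range(1001):
        u = i * 0.01                 # cocontraction swept over [0, 10]
        w = u / (u + 1.0)            # assumed: reliance on the visual plan grows with u
        variance = w**2 * sigma_v2 + (1.0 - w)**2 * sigma_h2  # tracking variance
        cost = variance + alpha * u**2                        # information + effort
        if cost < best_cost:
            best_u, best_cost = u, cost
    return best_u

# Qualitative check against the abstract's finding:
# more haptic noise -> higher optimal cocontraction,
# more visual noise -> lower optimal cocontraction.
u_base = optimal_cocontraction(1.0, 1.0)
u_haptic_noisy = optimal_cocontraction(1.0, 4.0)
u_visual_noisy = optimal_cocontraction(4.0, 1.0)
```

With equal, unit-variance channels the optimum sits near u ≈ 1 (equal weighting); quadrupling the haptic variance pushes the optimum up, while quadrupling the visual variance pushes it down, matching the direction of adaptation reported in the abstract.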