Electronic skins with multimodal sensing and perception

J. Tu, Ming Wang, Wenlong Li, Jiangtao Su, Yanzhen Li, Zhisheng Lv, Haicheng Li, Xue Feng, Xiaodong Chen

Soft Science, published 2023-07-11. DOI: 10.20517/ss.2023.15
Citations: 1
Abstract
In biological cognition, multiple types of sensory information are detected and integrated to improve perceptual accuracy and sensitivity. Current studies of electronic skin (e-skin) systems, however, have mainly focused on optimizing modality-specific data acquisition and processing. Endowing e-skins with multimodal sensing, and even with perception capable of high-level perceptual behaviors, remains insufficiently explored. Moreover, progress toward perception in multisensory e-skin systems faces challenges at both the device and software levels. Here, we provide a perspective on multisensory fusion in e-skins. We review recent progress in e-skins that realize multimodal sensing, followed by bottom-up and top-down approaches to multimodal perception. As the understanding of neuroscience deepens and novel algorithms and devices rapidly advance, multimodal perception functions become feasible and will promote the development of highly intelligent e-skin systems.