WiOpen: A Robust Wi-Fi-Based Open-Set Gesture Recognition Framework

IEEE Transactions on Human-Machine Systems, vol. 55, no. 2, pp. 234-245 · Published: 2025-02-21 · DOI: 10.1109/THMS.2025.3532910
Impact Factor 4.4 · JCR Q2 (Computer Science, Artificial Intelligence) · CAS Zone 3 (Computer Science)
Authors: Xiang Zhang; Jinyang Huang; Huan Yan; Yuanhao Feng; Peng Zhao; Guohang Zhuang; Zhi Liu; Bin Liu
Citations: 0

Abstract

Recent years have witnessed a growing interest in Wi-Fi-based gesture recognition. However, existing works have predominantly focused on closed-set paradigms, where all testing gestures are predefined during training. This poses a significant challenge in real-world applications, as unseen gestures might be misclassified as a known class during testing. To address this issue, we propose WiOpen, a robust Wi-Fi-based open-set gesture recognition (OSGR) framework. Implementing OSGR requires addressing challenges caused by the unique uncertainty in Wi-Fi sensing. This uncertainty, resulting from noise and domains, leads to widely scattered and irregular data distributions in collected Wi-Fi sensing data. Consequently, data ambiguity arises between classes, and defining appropriate decision boundaries to identify unknowns becomes challenging. To tackle these challenges, WiOpen adopts a twofold approach to eliminate uncertainty and define precise decision boundaries. First, it addresses noise-induced uncertainty during data preprocessing by utilizing the channel state information (CSI) ratio. Next, it designs the OSGR network based on an uncertainty quantification method. Throughout the learning process, this network effectively mitigates uncertainty stemming from domains. Ultimately, the network leverages relationships among samples' neighbors to dynamically define open-set decision boundaries, successfully realizing OSGR. Comprehensive experiments on publicly accessible datasets confirm WiOpen's effectiveness.
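The abstract stays at a high level, so the following minimal Python sketch only illustrates the two ingredients it names: CSI-ratio preprocessing and neighbor-based open-set decision boundaries. The function names, the fixed radius_scale threshold, and the static per-class radius are illustrative assumptions, not the paper's actual method; WiOpen's network quantifies uncertainty during training and sets its boundaries dynamically.

```python
import numpy as np

def csi_ratio(csi_ant_a: np.ndarray, csi_ant_b: np.ndarray) -> np.ndarray:
    """Complex CSI ratio between two receive antennas of the same NIC.

    Dividing one antenna's CSI by another's cancels the random phase
    offsets (CFO, SFO, packet-detection delay) shared by both RF chains,
    which is the general noise-removal idea behind CSI-ratio preprocessing.
    """
    eps = 1e-12  # avoid division by zero on deeply faded subcarriers
    return csi_ant_a / (csi_ant_b + eps)


def open_set_predict(query_feat: np.ndarray,
                     train_feats: np.ndarray,
                     train_labels: np.ndarray,
                     k: int = 5,
                     radius_scale: float = 1.0) -> int:
    """Toy neighbor-based open-set decision rule (illustrative only).

    The query gets the majority label of its k nearest training features,
    unless its mean neighbor distance exceeds a per-class radius estimated
    from the training data, in which case it is rejected as unknown (-1).
    """
    dists = np.linalg.norm(train_feats - query_feat, axis=1)
    nn_idx = np.argsort(dists)[:k]
    nn_dists, nn_labels = dists[nn_idx], train_labels[nn_idx]

    # Majority vote among the k nearest neighbors (labels are ints >= 0).
    pred = int(np.bincount(nn_labels).argmax())

    # Per-class radius: typical spread of that class around its centroid.
    cls_feats = train_feats[train_labels == pred]
    radius = radius_scale * np.mean(
        np.linalg.norm(cls_feats - cls_feats.mean(axis=0), axis=1)
    )
    return pred if nn_dists.mean() <= radius else -1
```

In this toy rule, a query whose neighbors lie farther away than the typical spread of the predicted class is rejected as an unseen gesture, which captures the basic intuition of distance-based open-set recognition described in the abstract.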
Source journal
IEEE Transactions on Human-Machine Systems (Computer Science, Artificial Intelligence; Computer Science, Cybernetics)
CiteScore: 7.10 · Self-citation rate: 11.10% · Annual article count: 136
Journal scope: The IEEE Transactions on Human-Machine Systems covers human-machine systems, including human-system and human-organizational interactions such as cognitive ergonomics, system test and evaluation, and human information processing in systems and organizations.