{"title":"GesPlayer: Using Augmented Gestures to Empower Video Players","authors":"Xiang Li, Yuzheng Chen, Xiaohang Tang","doi":"10.1145/3532104.3571456","DOIUrl":null,"url":null,"abstract":"In this paper, we introduce GesPlayer, a gesture-based empowered video player that explores how users can experience their hands as an interface through gestures. We provide three semantic gestures based on the camera of a computer or other smart device to detect and adjust the progress of video playback, volume, and screen brightness, respectively. Our goal is to enable users to control video playback simply by their gestures in the air, without the need to use a mouse or keyboard, especially when it is not convenient to do so. Ultimately, we hope to expand our understanding of gesture-based interaction by understanding the inclusiveness of designing the hand as an interactive interface, and further broaden the state of semantic gestures in an interactive environment through computational interaction methods.","PeriodicalId":431929,"journal":{"name":"Companion Proceedings of the 2022 Conference on Interactive Surfaces and Spaces","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2022-10-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Companion Proceedings of the 2022 Conference on Interactive Surfaces and Spaces","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3532104.3571456","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 3
Abstract
In this paper, we introduce GesPlayer, a gesture-empowered video player that explores how users can experience their hands as an interface. We provide three semantic gestures, detected through the camera of a computer or other smart device, to adjust video playback progress, volume, and screen brightness, respectively. Our goal is to enable users to control video playback simply by gesturing in the air, without a mouse or keyboard, especially when using one is inconvenient. Ultimately, we hope to expand our understanding of gesture-based interaction by examining the inclusiveness of designing the hand as an interactive interface, and to further broaden the use of semantic gestures in interactive environments through computational interaction methods.
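The abstract does not describe the implementation, so the following is only a minimal sketch of how camera-based semantic gestures can drive a video player: it uses MediaPipe Hands and OpenCV to track one hand from the webcam and maps an assumed finger-count gesture to illustrative media-key presses via pyautogui. The gesture vocabulary, key bindings, and finger-counting heuristic are assumptions for illustration, not GesPlayer's actual method.

```python
# Minimal sketch (not the authors' implementation): webcam-based hand tracking
# with MediaPipe Hands, mapping an assumed finger-count gesture to media keys.
import cv2
import mediapipe as mp
import pyautogui  # used here only to simulate key presses for the active player

mp_hands = mp.solutions.hands

def count_extended_fingers(landmarks):
    # Rough heuristic: a fingertip counts as extended if it sits above its PIP
    # joint (smaller y means higher in the image). The thumb is ignored.
    tips, pips = (8, 12, 16, 20), (6, 10, 14, 18)
    return sum(landmarks[t].y < landmarks[p].y for t, p in zip(tips, pips))

cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            fingers = count_extended_fingers(results.multi_hand_landmarks[0].landmark)
            if fingers == 1:
                pyautogui.press("right")     # assumed binding: seek forward
            elif fingers == 2:
                pyautogui.press("volumeup")  # assumed binding: raise volume
            # A third gesture for screen brightness would need an OS-specific API.
        cv2.imshow("gesture preview", frame)
        if cv2.waitKey(1) & 0xFF == 27:      # press Esc to quit
            break
cap.release()
cv2.destroyAllWindows()
```

In practice a system like the one described would also need debouncing (so a held gesture does not fire repeatedly) and continuous gestures for fine-grained seeking and volume, rather than the discrete key presses shown here.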