ARPilot

Yu-An Chen, Te-Yen Wu, Tim Chang, Jun You Liu, Yuan-Chang Hsieh, L. Hsu, Ming-Wei Hsu, Paul Taele, Neng-Hao Yu, Mike Y. Chen

Proceedings of the 20th International Conference on Human-Computer Interaction with Mobile Devices and Services

DOI: 10.1145/3229434.3229475
Published: 2018-09-03
Citations: 1
Abstract
Drones offer camera angles that are not possible with traditional cameras and are becoming increasingly popular for videography. However, flying a drone while controlling its camera requires simultaneously manipulating 5-6 degrees of freedom (DOF), which demands significant training. We present ARPilot, a direct-manipulation interface that lets users plan an aerial video by physically moving their mobile devices around a miniature 3D model of the scene, shown via Augmented Reality (AR). The mobile device acts as the viewfinder, making it intuitive to explore the scene and frame shots. We leveraged AR technology to explore three 6-DOF video-shooting interfaces on mobile devices: AR keyframe, AR continuous, and AR hybrid, and compared them against a traditional touch interface in a user study. The results show that AR hybrid was the most preferred by participants and required the least effort among all the techniques, while user feedback suggests that AR continuous enables more creative shots. We discuss several distinct usage patterns and report insights for further design.