Constraint-based bare-hand immersive 3D modelling

i-com (Q1, Social Sciences). Published 2023-05-31. DOI: 10.1515/icom-2023-0013
Thomas Jung, J. H. Israel, Ruben Ahlhelm, Patrick Bauer
Abstract

Three-dimensional user interfaces that are controlled by the user's bare hands are mostly based on purely gesture-based interaction techniques. However, these interfaces are often slow and error-prone. Especially in the field of immersive 3D modelling, gestures are unsuitable because they complicate and delay the modelling process. To address these problems, we present a new gesture-free 3D modelling technique called "3D touch-and-drag", which allows users to select vertices by approaching them and to terminate operations by moving the 3D cursor (e.g. the forefinger) away from the constraint geometry (e.g. a straight line or a plane). Our proposed technique makes it possible to transfer existing 3D modelling concepts ("3D widgets") to virtual environments, as shown by an experimental 3D modelling tool. The gesture-free bare-hand interaction also improves the possibility of tactile feedback during 3D manipulation. We compared different modelling techniques for controlling the 3D widgets and found that controller-based techniques are significantly faster than finger-tracking-based techniques. The 3D touch-and-drag technique is about as fast as gesture-based interactions, and mouse interaction in a two-dimensional GUI is only slightly faster than the 3D modelling techniques. Since our proposed technique has proven to be at least equivalent to gesture-based interaction techniques in terms of accuracy and efficiency, its further development using more accurate tracking techniques seems promising to exploit the advantages of hands-free and gesture-free interaction for immersive 3D modelling.

Published in i-com, pp. 125-141.
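The selection and termination rule described in the abstract can be sketched in a few lines: a vertex is grabbed when the tracked fingertip comes within a small radius, the dragged vertex follows the cursor's projection onto the constraint geometry (here a straight line), and the operation ends once the cursor moves beyond a release distance from that geometry. This is an illustrative sketch only; the function names and thresholds are assumptions, not the paper's implementation.

```python
import math

# Illustrative thresholds (assumed values, in metres):
GRAB_RADIUS = 0.02       # cursor within 2 cm of the line "touches" a vertex
RELEASE_DISTANCE = 0.05  # moving more than 5 cm off the line ends the drag

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def add(a, b): return tuple(x + y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def scale(a, s): return tuple(x * s for x in a)
def dist(a, b): return math.sqrt(dot(sub(a, b), sub(a, b)))

def project_onto_line(point, origin, direction):
    """Closest point to `point` on the line origin + t * direction."""
    t = dot(sub(point, origin), direction) / dot(direction, direction)
    return add(origin, scale(direction, t))

def drag_step(cursor, origin, direction, dragging):
    """One frame of gesture-free touch-and-drag along a line constraint.

    Returns (dragging, constrained_position):
    - not dragging: selection starts when the cursor touches the line
      (approach to select, no gesture needed);
    - dragging: the vertex follows the cursor's projection onto the line,
      and the drag terminates once the cursor leaves the release distance
      (move away to terminate, again no gesture).
    """
    on_line = project_onto_line(cursor, origin, direction)
    d = dist(cursor, on_line)
    if not dragging:
        return (d <= GRAB_RADIUS, None)
    if d > RELEASE_DISTANCE:
        return (False, None)
    return (True, on_line)
```

For example, with a line constraint along the x-axis, `drag_step((0.5, 0.01, 0.0), (0.0, 0.0, 0.0), (1.0, 0.0, 0.0), True)` keeps the drag active and snaps the vertex to `(0.5, 0.0, 0.0)`, while lifting the finger 10 cm off the line terminates the operation.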