Pub Date: 2025-03-10 | DOI: 10.1109/TOH.2025.3549677
Tactile-Thermal Interactions: Cooperation and Competition
Lynette A Jones, Hsin-Ni Ho
This review focuses on the interactions between the cutaneous senses, in particular touch and temperature, as these are the most relevant for developing skin-based display technologies for use in virtual reality (VR) and for designing multimodal haptic devices. A broad spectrum of research is reviewed, ranging from studies that have examined the mechanisms involved in thermal intensification and tactile masking to more applied work focused on implementing thermal-tactile illusions such as thermal referral and illusory wetness in VR environments. Research on these tactile-thermal illusions has identified differences between the senses of cold and warmth in terms of their effects on the perception of object properties and the prevalence of the perceptual experiences elicited. These studies have also underscored the fundamental spatial and temporal differences between the tactile and thermal senses. The wide-ranging body of research on compound sensations such as wetness and stickiness has highlighted the mechanisms involved in sensing moisture and provided a framework for measuring these sensations in a variety of contexts. Although the interactions between the two senses are complex, it is clear that adding thermal inputs to a tactile display both enhances the user experience and enables novel sensory experiences.
Pub Date: 2025-03-07 | DOI: 10.1109/TOH.2025.3549036
The CHAT System: A Wearable Haptic System for Facilitating Tactile Communication
Bryan A MacGavin, Jennifer L Tennison, Terra Edwards, Jenna L Gorlewicz
Despite the richness of the human tactile capacity, remote communication practices often lack touch-based interactions. This overtaxes our visual and auditory channels, weakens connection and engagement, and creates inaccessibility for diverse sensory groups. In this paper, we learn from the haptic intuitions of the blind and low-vision (BLV) and Protactile DeafBlind (PT-DB) communities to investigate how core functions of communication can be routed through tactile channels. We investigate this re-routing by designing the Conversational Haptic Technology (CHAT) system, a wearable haptic system for exploring the feasibility of recreating language through core functions of communication and emotional expression via touch. We contribute the design evolution of an input (sensing) pad and an output (actuation) pad, which together form a bidirectional, wireless system supporting remote, touch-based communication. These systems were iteratively evaluated through a series of user studies with sighted-hearing (N=20), BLV (N=4), and PT-DB (N=7) participants to uncover touch profiles for relaying specific communication functions and emotional responses. Results indicate trends and similarities in the touch-based cues organically employed across the diverse groups and provide an initial framework demonstrating the feasibility of communicating core functions through touch in a wearable form factor.
Pub Date: 2025-03-07 | DOI: 10.1109/TOH.2025.3548880
Reconfigurable Flexible Haptic Interface Using Localized Friction Modulation
Romain Le Magueresse, Fabrice Casset, Frederic Giraud, Munique Kazar Mendes, Daniel Mermin, Remi Franiatte, Anis Kaci, Mikael Colin
Current flexible haptic technologies struggle to render textures as effectively as rigid friction-reduction surfaces because elastic waves propagate poorly in flexible substrates. Alternative solutions using different actuators have been explored, but their low actuator density hampers fine rendering, and hence texture rendering. To overcome these limits, this paper presents the development, characterization, and evaluation of an innovative haptic solution enabling localized or continuous texture rendering on a flexible surface. Building on previous work, the developed surface is composed of several haptic resonators vibrating at an ultrasonic frequency, driven by piezoelectric actuators, and embedded in a polymer matrix. The solution combines the advantages of a rigid haptic surface, using friction modulation to produce texture stimulation, with the conformability of a 75 µm thick polymer sheet. By selectively powering the actuators, simple tactile shapes can be displayed. Tribological measurements confirm that the friction reduction matches the desired shape. Two studies demonstrated the device's effectiveness: participants identified simple geometric shapes with a 96% success rate and a 14 s detection time, and two users simultaneously recognized independent tactile patterns with 89% accuracy. This flexible device supports simple geometric shape display with texture rendering, multi-touch, and multi-user interaction, offering potential for various applications.
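The shape display idea, switching resonators on only where the shape should be felt, can be sketched in a few lines. This is an illustrative sketch under stated assumptions, not the device firmware: the grid layout, the `friction_at` helper, and the friction coefficients are all hypothetical.

```python
# Hypothetical sketch: render a tactile shape by activating ultrasonic
# resonators only inside the shape, so the finger feels reduced friction
# there and normal friction elsewhere. Values are illustrative.

ACTIVE_MU, PASSIVE_MU = 0.3, 1.0  # assumed friction coefficients (on / off)

def friction_at(shape_mask, row, col):
    """shape_mask: 2D grid of booleans, one entry per resonator cell.
    Returns the friction coefficient the finger would feel at that cell."""
    return ACTIVE_MU if shape_mask[row][col] else PASSIVE_MU

# A 5x5 grid displaying a plus sign: the middle row and column are active.
plus = [[(r == 2 or c == 2) for c in range(5)] for r in range(5)]
print(friction_at(plus, 2, 4), friction_at(plus, 0, 0))  # 0.3 1.0
```

Sweeping a finger across such a grid would cross low-friction cells only along the plus sign, which is the sensation the tribological measurements in the abstract verify.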
Pub Date: 2025-03-05 | DOI: 10.1109/TOH.2025.3548478
The Snail: A Wearable Actuated Prop to Simulate Grasp of Rigid and Soft Objects in Virtual Reality
Justine Saint-Aubert
The Snail is a wearable haptic interface that enables users to experience force feedback when grasping objects in Virtual Reality. It consists of a 3D-printed prop attached to the tip of the thumb that is rotated by a small actuator. The prop is shaped like a snail to display different grasping sizes, ranging from to , according to its orientation. The prop itself displays the force feedback, so forces over can be displayed between fingers using small, low-power actuation. Very rigid objects can be rendered when the prop remains static, while rotating the prop as users grasp it allows soft objects to be simulated. The Snail is portable, low-cost, and easy to reproduce because it is made of 3D-printed parts. The design and performance of the device were evaluated through technical evaluations and three user experiments, which show that participants can discriminate different grasping sizes and levels of softness with the interface. The Snail also enhances user experience and performance in Virtual Reality compared to standard vibration feedback.
Pub Date: 2025-03-03 | DOI: 10.1109/TOH.2025.3546979
LARIAT: Predictive Haptic Feedback to Improve Semi-Autonomous UGV Safety in a Case Study
Chandler Stubbs, Kathleen Steadman, David M Bevly, Chad G Rose
While much work is being done to advance the autonomous capabilities of mobile robots, specifically unmanned ground vehicles (UGVs), some applications may currently be too complex or undesirable for full autonomy. Maintaining a human in the loop has proven to be a reliable strategy in these applications, yet there are limits to the efficacy of human operators. Haptic feedback has been proposed as a method of addressing these limitations and aiding UGV operators in safe and effective operation. This manuscript presents the experimental validation of LARIAT (Lowering Attention Requirements in semi-Autonomous Teleoperation), a portable haptic device for teleoperated semi-autonomous UGVs. The device uses an adapted predictive form of the Zero-Moment Point (ZMP) rollover index to drive haptic squeeze cues that provide human-on-the-loop notifications to the UGV operator. First, a brief design overview of LARIAT, the implemented haptic control, and the ZMP index is presented. In addition to experimental characterization of the device's just noticeable difference, we present a case study demonstrating LARIAT's ability to improve teleoperation performance. In an experiment simulating walking behind a semi-autonomous UGV, LARIAT reduced the number of UGV rollovers by up to 50%, with comparable or better performance on a concurrent secondary task.
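To make the ZMP rollover index concrete, the textbook planar form can be sketched as below. This is a hedged sketch, not the paper's adapted predictive formulation: the `rollover_warning` helper, its margin, and the example numbers are assumptions for illustration only.

```python
# Illustrative sketch of a standard planar zero-moment point (ZMP) rollover
# check. The paper uses an adapted *predictive* form; this shows only the
# basic index. Function names and the 0.5 margin are hypothetical.

G = 9.81  # gravitational acceleration, m/s^2

def zmp_offset(y_com, z_com, y_ddot, g=G):
    """Lateral ZMP of a point-mass model: y_zmp = y_com - (z_com / g) * y_ddot."""
    return y_com - (z_com / g) * y_ddot

def rollover_warning(y_com, z_com, y_ddot, half_track, margin=0.5):
    """True when the ZMP leaves the central `margin` fraction of the support
    polygon (half_track = half the wheel track, in metres): cue the operator."""
    return abs(zmp_offset(y_com, z_com, y_ddot)) > margin * half_track

# A hard lateral acceleration pushes the ZMP toward the wheel line.
print(rollover_warning(y_com=0.0, z_com=0.8, y_ddot=6.0, half_track=0.6))  # True
```

In a device like LARIAT, a warning of this kind would be mapped to the intensity or onset of the squeeze cue delivered to the operator.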
Pub Date: 2025-02-27 | DOI: 10.1109/TOH.2025.3546522
Teleoperator Coupling Dynamics Impact Human Motor Control Across Pursuit Tracking Speeds
Jacob Carducci, Noah J Cowan, Jeremy D Brown
Robotic teleoperators introduce novel electromechanical dynamics between the user and the environment. While considerable effort has focused on minimizing these dynamics, we lack a robust understanding of their impact on user task performance across the range of human motor control ability. Here, we use a 1-DoF teleoperator testbed with interchangeable mechanical and electromechanical couplings between the leader and follower to investigate to what extent, if any, the dynamics of the teleoperator influence performance in a visual-motor pursuit tracking task. We recruited N=30 participants to perform the task at frequencies ranging from 0.55 to 2.35 Hz, with the testbed in Mechanical, Unilateral, and Bilateral configurations. Results show that tracking performance at the follower was similar across configurations. However, participants' adjustment at the leader differed between the Mechanical, Unilateral, and Bilateral configurations. In addition, participants applied different grip forces in the Mechanical and Unilateral configurations. Finally, participants' ability to compensate for coupling dynamics diminished significantly as execution speed increased. Overall, these findings support the argument that humans can incorporate teleoperator dynamics into their motor control scheme and produce compensatory control strategies to account for them; however, this compensation is significantly affected by the leader-follower coupling dynamics and the speed of task execution.
Pub Date: 2025-02-26 | DOI: 10.1109/TOH.2025.3546270
Measurement of Airborne Ultrasound Focus on Skin Surface Using Thermal Imaging
Ryoya Onishi, Sota Iwabuchi, Shun Suzuki, Takaaki Kamigaki, Yasutoshi Makino, Hiroyuki Shinoda
In recent years, tactile presentation technology using airborne ultrasound has attracted attention. To achieve an ideal tactile presentation using ultrasound, the acoustic field on the user's skin surface must be determined, particularly the location of the focal point. Previous studies have suggested that thermal images can be used to immediately visualize sound pressure patterns on finger surfaces. In this study, we comprehensively investigated the performance of thermal imaging for measuring the ultrasound focus on the skin. First, using a silicone that mimics the skin, we confirmed that the sound pressure peak at the focus coincided with the peak of the temperature change. In addition, we confirmed that when human skin was irradiated, a temperature increase was observed at sound pressures above 4.0 kPa in 9 out of 10 participants. Moreover, a 5.5 kPa focus could be used to track the focal position if the moving velocity was less than 100 mm/s and to detect its orbit if the velocity was less than 2000 mm/s. These results clarify the conditions under which the focus can be measured using thermal images and provide guidelines for practical use.
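The core measurement step, finding the focus as the hottest point of a thermal-difference frame, can be sketched as follows. This is a hypothetical helper, not the authors' code; the threshold value and data are illustrative.

```python
# Illustrative sketch: locate an ultrasound focus as the peak of a
# thermal-difference image, accepting it only if the temperature rise
# exceeds a detection threshold. All values are assumptions.

def locate_focus(delta_t, threshold=0.2):
    """delta_t: 2D list of temperature rises (K) relative to a baseline frame.
    Returns (row, col) of the peak pixel, or None if nothing exceeds
    `threshold` (i.e., no detectable focus in this frame)."""
    peak, peak_rc = 0.0, None
    for r, row in enumerate(delta_t):
        for c, v in enumerate(row):
            if v > peak:
                peak, peak_rc = v, (r, c)
    return peak_rc if peak > threshold else None

frame = [[0.0, 0.1, 0.0],
         [0.1, 0.9, 0.2],
         [0.0, 0.1, 0.0]]
print(locate_focus(frame))  # (1, 1)
```

Tracking a moving focus, as in the velocity limits the abstract reports, would amount to running this per frame and chaining the detected peaks into a trajectory.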
Pub Date: 2025-02-25 | DOI: 10.1109/TOH.2025.3545473
Stretching Time With Velvet: How Affective Materials Shape Our Perception of Time
Muge Cavdan, Knut Drewing
Research has shown that affective visual and auditory events (e.g., a crying baby) are perceived as lasting longer than neutral ones. However, the impact of affective haptic experiences on time perception has hardly been studied. This study investigates the influence of interacting with affective materials on time perception. We selected three materials known to evoke pleasant (velvet), unpleasant (sandpaper), and neutral (paper) affective responses. Participants completed a temporal bisection task to assess how each material influenced their perception of time. The materials were presented for intervals from 1000 to 2200 ms in 200 ms increments. In each trial, a participant stroked one of the materials, with the duration delimited by two vibrotactile cues, and judged whether the duration felt closer to a previously learned short or long interval. As expected, velvet yielded lower bisection points than paper. Contrary to expectations, bisection points for sandpaper - despite it being an unpleasant material - did not differ significantly from those for the control material, paper. These findings suggest that while pleasant haptic material experiences can extend perceived time, unpleasant materials may have no effect. This result is partially consistent with the time lengthening observed during affective auditory and visual events.
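In a temporal bisection task, the key statistic is the bisection point: the duration at which "long" responses reach 50%. A minimal sketch of that computation follows; the linear-interpolation approach and the example data are illustrative, not the authors' analysis pipeline (which would typically fit a full psychometric function).

```python
# Illustrative sketch: estimate the bisection point from the proportion of
# "long" judgements at each tested duration, by linearly interpolating the
# 50% crossing between adjacent durations. Data below are made up.

def bisection_point(durations, p_long):
    """durations: tested intervals (ms), ascending; p_long: proportion of
    'long' judgements at each duration. Returns the 50% crossing, or None
    if the responses never cross 0.5."""
    pairs = list(zip(durations, p_long))
    for (d0, p0), (d1, p1) in zip(pairs, pairs[1:]):
        if p0 <= 0.5 <= p1:
            if p1 == p0:
                return (d0 + d1) / 2
            return d0 + (0.5 - p0) * (d1 - d0) / (p1 - p0)
    return None

durs = [1000, 1200, 1400, 1600, 1800, 2000, 2200]  # the abstract's grid
p = [0.05, 0.15, 0.35, 0.55, 0.75, 0.90, 0.98]    # hypothetical responses
print(bisection_point(durs, p))  # 1550.0
```

A *lower* bisection point for velvet means shorter physical durations already felt "long", i.e., time with velvet was subjectively stretched.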
Pub Date: 2025-02-20 | DOI: 10.1109/TOH.2025.3544134
Closed-Loop Manual Control With Tactile or Visual Feedback Under Wireless Link Impairments
Suraj Suman, Pranav Mamidanna, Jimmy Jessen Nielsen, Federico Chiariotti, Cedomir Stefanovic, Strahinja Dosen, Petar Popovski
The emergence of low-latency wireless connectivity has opened significant new possibilities for closed-loop human-machine interaction (HMI) systems. However, data transmission, particularly over wireless links, suffers from impairments such as random latency fluctuations and packet loss, which affect the overall control performance of HMI systems. In this study, we evaluated the impact of wireless impairments on a closed-loop joystick-controlled trajectory tracking task, using two types of feedback: visual and tactile. Wireless links were used both for the uplink transmission of the command signal and for the downlink transmission of the feedback signal. The effects of wireless impairments were incorporated by artificially introducing latency, jitter, and packet loss recorded in real-life deployment scenarios, in both the uplink and the downlink. The results obtained across 12 able-bodied participants showed that tracking performance was better with visual feedback than with tactile feedback across all impairment conditions. The average latency significantly affected performance, while random latency fluctuations did not. Interestingly, the performance degradation with increasing impairments for tactile feedback was similar to that observed for visual feedback. A main novelty of this study is the quantification of the impact of wireless impairments on closed-loop teleoperation tasks with tactile feedback. The results provide valuable insights for designing wireless infrastructure for tactile internet applications.
Pub Date: 2025-02-17
DOI: 10.1109/TOH.2025.3542471
Ido Gurevich, Shani Arusi, Ilana Nisky
During interactions with elastic objects, we integrate haptic and visual information to form a perception of stiffness. In many practical applications, either the haptic or the visual feedback may be delayed. Previous studies have investigated stiffness perception with delayed force or visual feedback in vertical interactions using the right hand. However, most daily interactions are bimanual and may be performed horizontally. Here, we studied the effect of visual delay size on stiffness perception during horizontal right-hand unimanual and bimanual interactions. We designed two experiments with a forced-choice paradigm: right-handed participants interacted with pairs of elastic objects using either their right hand or both hands and determined which object felt stiffer, while the visual information for one of the objects was delayed. In both right-hand unimanual and bimanual interactions, consistent with previous studies, the visual delay caused an overestimation of stiffness that increased with delay size. Interestingly, the participants' sensitivity to small differences in stiffness deteriorated due to the delay only in right-hand unimanual interactions, not in bimanual ones. The sensitivity advantage of bimanual interactions over right-hand unimanual interactions could be considered when designing visual-haptic interfaces with delayed feedback. However, future studies are needed to determine the sensory mechanism responsible for this result.
"Stiffness Perception With Delayed Visual Feedback During Unimanual and Bimanual Interactions," by Ido Gurevich, Shani Arusi, and Ilana Nisky. IEEE Transactions on Haptics, published 2025-02-17. DOI: 10.1109/TOH.2025.3542471.