Tactile animation illusions are used to display dynamic information with haptic cues. In this study, we investigate two forms of tactile animation illusion that leverage the funneling effect and Apparent Haptic Motion (AHM) on a one-dimensional circular tactile display. We define new parameters for the description of AHM that capture both the temporal and spatial aspects of these animations: Angle per Actuator (APA) and Revolution Duration (RD). We present three user studies on the perception of angular animations produced with these effects. Our results show that people can interpret AHM animations regardless of the APA value, and that they can interpret animations moving as slowly as one degree per second. We also show that participants' ability to discriminate angular animations improves proportionally with the angle presented.
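The two parameters can be related to a concrete actuation schedule. The sketch below is an illustration, not the authors' implementation: it assumes actuators spaced APA degrees apart on the circle, so one revolution visits 360/APA actuators, and RD fixes the time between successive onsets.

```python
def ahm_schedule(apa_deg, rd_s):
    """Onset schedule for an AHM animation on a circular array.

    Assumed model (hypothetical): actuators sit APA degrees apart, so a
    revolution visits 360/APA of them; RD is the revolution duration.
    Returns a list of (angle_deg, onset_time_s) pairs.
    """
    steps = round(360 / apa_deg)      # actuators visited per revolution
    step_dt = rd_s / steps            # time between successive onsets
    return [(k * apa_deg, k * step_dt) for k in range(steps)]

def angular_speed(rd_s):
    """Angular speed implied by RD, in degrees per second."""
    return 360.0 / rd_s

# Example: APA = 45 degrees and RD = 8 s gives 8 onsets, one per second.
schedule = ahm_schedule(apa_deg=45, rd_s=8.0)
```

With this model, the "one degree per second" condition in the abstract corresponds to RD = 360 s, independently of APA.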
This review focuses on the interactions between the cutaneous senses, in particular touch and temperature, as these are the most relevant for developing skin-based display technologies for use in virtual reality (VR) and for designing multimodal haptic devices. A broad spectrum of research is reviewed, ranging from studies that have examined the mechanisms involved in thermal intensification and tactile masking to more applied work that has focused on implementing thermal-tactile illusions such as thermal referral and illusory wetness in VR environments. Research on these tactile-thermal illusions has identified the differences between the senses of cold and warmth in terms of their effects on the perception of object properties and the prevalence of the perceptual experiences elicited. These studies have also underscored the fundamental spatial and temporal differences between the tactile and thermal senses. The wide-ranging body of research on compound sensations such as wetness and stickiness has highlighted the mechanisms involved in sensing moisture and provided a framework for measuring these sensations in a variety of contexts. Although the interactions between the two senses are complex, it is clear that the addition of thermal inputs to a tactile display both enhances user experience and enables novel sensory experiences.
Despite the richness of the human tactile capacity, remote communication practices often lack touch-based interactions. This leads to overtaxing our visual and auditory channels, a lack of connection and engagement, and inaccessibility for diverse sensory groups. In this paper, we learn from the haptic intuitions of the blind and low vision (BLV) and Protactile DeafBlind (PT-DB) communities to investigate how core functions of communication can be routed through tactile channels. We investigate this re-routing by designing the Conversational Haptic Technology (CHAT) system, a wearable haptic system for exploring the feasibility of recreating core functions of communication and emotional expression via touch. We contribute the design evolution of an input (sensing) pad and an output (actuation) pad, which together enable a bidirectional, wireless system to support remote, touch-based communication. The system was iteratively evaluated through a series of user studies with sighted-hearing (N=20), BLV (N=4), and PT-DB (N=7) participants to uncover touch profiles for relaying specific communication functions and emotional responses. Results indicate trends and similarities in the touch-based cues organically employed across the diverse groups and provide an initial framework demonstrating the feasibility of communicating core functions through touch in a wearable form factor.
Current flexible haptic technologies struggle to render textures as effectively as rigid friction-reduction surfaces because elastic waves propagate poorly in flexible substrates. Alternative solutions using different actuators have been explored, but their low density hampers fine rendering, and thus texture rendering. To overcome these limits, we present in this paper the development, characterization, and evaluation of an innovative haptic solution enabling localized or continuous texture rendering on a flexible surface. Building on previous work, the developed surface is composed of several haptic resonators vibrating at an ultrasonic frequency, driven by piezoelectric actuators and embedded in a polymer matrix. The solution combines the advantages of a rigid haptic surface, which implements friction modulation to obtain texture stimulation, with the conformability of a 75 µm thick polymer sheet. By selectively powering the actuators, it is possible to display simple tactile shapes. Tribological measurements confirm that the friction reduction matches the desired shape. Two studies demonstrated the device's effectiveness: participants identified simple geometric shapes with a 96% success rate and a 14 s detection time, and two users simultaneously recognized independent tactile patterns with 89% accuracy. This flexible device supports simple geometric shape display with texture rendering, multi-touch, and multi-user interaction, offering potential for various applications.
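The shape-display principle described above can be sketched in a few lines. This is an illustrative model only (layout and dimensions are assumptions, not the paper's): each resonator lowers friction when driven ultrasonically, so a shape is "drawn" by powering exactly the resonators that fall inside it.

```python
# Hypothetical sketch: a powered resonator means low friction (inside the
# shape); an unpowered one means high friction (outside the shape).
def resonator_states(centers, inside_shape):
    """centers: list of (x, y) resonator positions in mm (assumed layout).
    inside_shape: predicate that is True if a point lies in the shape.
    Returns one on/off state per resonator."""
    return [inside_shape(x, y) for (x, y) in centers]

# Example: a 7 x 7 grid of resonators and a 20 mm radius disc at the origin.
grid = [(x, y) for x in range(-30, 31, 10) for y in range(-30, 31, 10)]
disc = lambda x, y: x * x + y * y <= 20 * 20
states = resonator_states(grid, disc)
```

The actuator density directly bounds how fine a shape boundary can be rendered, which is why low-density alternatives struggle with texture rendering.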
The Snail is a wearable haptic interface that enables users to experience force feedback when grasping objects in Virtual Reality. It consists of a 3D-printed prop attached to the tip of the thumb that can rotate thanks to a small actuator. The prop is shaped like a snail so that different grasping sizes are displayed according to its orientation. Because the prop itself bears the grasping forces, high forces can be displayed between the fingers using small, low-power actuation. Very rigid objects can be rendered when the prop remains static, while rotating the prop as users grasp it allows for the simulation of soft objects. The Snail is portable, low-cost, and easy to reproduce because it is made of 3D-printed parts. The design and performance of the device were evaluated through technical evaluations and three user experiments. They show that participants can discriminate different grasping sizes and levels of softness with the interface. The Snail also enhances user experience and performance in Virtual Reality compared to standard vibration feedback.
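One way to picture the snail-shaped profile (an assumption for illustration, not the paper's actual geometry or constants) is a logarithmic spiral, whose radius grows smoothly with rotation so that each prop orientation presents a different thumb-to-finger aperture:

```python
import math

# Hypothetical snail profile: r(theta) = a * exp(b * theta).
# The constants a (base radius, mm) and b (growth rate) are illustrative.
def aperture(theta_rad, a=10.0, b=0.08):
    """Grasping aperture (mm) displayed at prop rotation theta."""
    return a * math.exp(b * theta_rad)

def rotation_for_aperture(target_mm, a=10.0, b=0.08):
    """Prop rotation (rad) the actuator must command for a target size."""
    return math.log(target_mm / a) / b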
While much work is being done to advance the autonomous capabilities of mobile robotics, specifically unmanned ground vehicles (UGVs), some applications may currently be too complex or undesirable for full autonomy. Maintaining a human in the loop has proven to be a reliable strategy in these applications, yet there are currently limitations to the efficacy of human operators. Haptic feedback has been proposed as a method of addressing these limitations and aiding UGV operators in safe and effective operation. This manuscript presents the experimental validation of LARIAT (Lowering Attention Requirements in semi-Autonomous Teleoperation), a portable haptic device for teleoperated semi-autonomous UGVs. This device utilizes an adapted predictive form of the Zero-Moment Point (ZMP) rollover index to inform haptic squeeze cues provided to the UGV operator for human-on-the-loop notifications. First, a brief design overview of LARIAT, the implemented haptic control, and the ZMP index are presented. In addition to experimental device characterization of the just noticeable difference, we present a case study that demonstrates LARIAT's ability to improve teleoperation performance. In an experiment involving a simulation of walking behind a semi-autonomous UGV, LARIAT reduced the number of UGV rollovers by up to 50%, with comparable or increased performance in a concurrent secondary task.
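The ZMP-to-squeeze-cue pipeline can be sketched as follows. This is a simplified illustration, not LARIAT's published controller: the geometry, gains, and alert threshold are assumptions, and the lateral ZMP offset uses the standard flat-ground approximation (CG height / g) x lateral acceleration.

```python
# Illustrative ZMP-style rollover index mapped to a squeeze-cue intensity.
def zmp_rollover_index(lat_accel, cg_height, half_track):
    """|y_zmp| / half_track: 0 = stable, >= 1 = predicted wheel lift.

    Flat-ground approximation for a rigid vehicle:
    y_zmp ~= (cg_height / g) * lateral_acceleration.
    """
    g = 9.81
    y_zmp = (cg_height / g) * lat_accel
    return abs(y_zmp) / half_track

def squeeze_intensity(index, threshold=0.5):
    """Haptic squeeze command in [0, 1]; silent below the alert threshold."""
    if index <= threshold:
        return 0.0
    return min(1.0, (index - threshold) / (1.0 - threshold))
```

Driving the cue from a *predicted* index (e.g., from the planned trajectory) rather than the instantaneous one is what lets the operator react before a rollover develops.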
In recent years, tactile presentation technology using airborne ultrasound has attracted attention. To achieve an ideal tactile presentation using ultrasound, the acoustic field on the user's skin surface must be determined, particularly the location of the focal point. Previous studies have suggested that thermal images can be used to immediately visualize sound pressure patterns on finger surfaces. In this study, we comprehensively investigated the performance of thermal imaging for measuring the ultrasound focus on the skin. First, using a silicone that mimicked the skin, we confirmed that the sound pressure peak at the focus coincided with the peak of the temperature change. In addition, we confirmed that when human skin was irradiated, a temperature increase was observed at focal pressures above 4.0 kPa in 9 out of 10 participants. Moreover, a 5.5 kPa focus could be employed to track the focal position if the moving velocity was less than 100 mm/s, and to detect its orbit if the velocity was less than 2000 mm/s. These results clarify the conditions under which the focus can be measured using thermal images and provide guidelines for practical use.
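The focus-tracking idea can be sketched as locating the hottest pixel in each thermal frame and checking that the frame-to-frame displacement stays within a velocity limit. The frame format, pixel pitch, and thresholds below are illustrative assumptions, not the study's measurement setup:

```python
# Hypothetical sketch: find the temperature peak in a thermal frame and
# test whether the focus moved slowly enough to track between frames.
def hottest_pixel(frame):
    """frame: 2D list of temperatures; returns (row, col) of the peak."""
    best, pos = float("-inf"), (0, 0)
    for r, row in enumerate(frame):
        for c, t in enumerate(row):
            if t > best:
                best, pos = t, (r, c)
    return pos

def trackable(p0, p1, dt_s, mm_per_px, limit_mm_s=100.0):
    """True if the peak moved slowly enough to track between two frames.

    limit_mm_s defaults to the ~100 mm/s position-tracking limit reported
    in the abstract; the ~2000 mm/s orbit-detection limit would use a
    different criterion over the whole trajectory.
    """
    dist_mm = mm_per_px * ((p0[0] - p1[0]) ** 2 + (p0[1] - p1[1]) ** 2) ** 0.5
    return dist_mm / dt_s <= limit_mm_s
```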