We developed a new aerial push-button with tactile feedback using focused airborne ultrasound. This study offers two significant novelties compared with past related studies: 1) the ultrasound emitters are placed behind the user's finger, and the emitted ultrasound, reflected and focused just above a solid plane placed under the finger, presents tactile feedback to the finger pad; and 2) tactile feedback is presented at two stages of the pressing motion: when the finger pushes the button and when it withdraws from it. The former offers a significant implementation advantage: the input surface of the device can be a generic thin plane, including a touch panel, potentially capable of presenting touch feedback only when the user touches objects on the screen. We experimentally found that the two-stage tactile presentation is much more effective than a conventional single-stage method in strengthening both the perceived tactile stimulation and the believability of input completion. This study thus proposes an aerial push-button composition that is far more practical than previous designs, and the proposed system is expected to be one of the simplest frameworks for airborne ultrasound tactile interfaces.
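The two-stage presentation described above can be sketched as a small state machine that fires one feedback event on press and one on withdrawal. This is an illustrative sketch, not the authors' implementation; the class name, depth thresholds, and event labels are all assumptions.

```python
# Minimal sketch of two-stage tactile feedback for an aerial push-button.
# Stage 1 fires when the finger crosses the press threshold moving down;
# stage 2 fires when it crosses the release threshold moving back up.
class TwoStageButton:
    def __init__(self, press_depth_mm=2.0, release_depth_mm=1.0):
        # Hypothetical thresholds; the hysteresis gap between press and
        # release depths prevents double-triggering near the boundary.
        self.press_depth = press_depth_mm
        self.release_depth = release_depth_mm
        self.pressed = False

    def update(self, depth_mm):
        """Return the feedback event for the current finger depth, if any."""
        if not self.pressed and depth_mm >= self.press_depth:
            self.pressed = True
            return "press_feedback"    # stage 1: focused ultrasound burst
        if self.pressed and depth_mm <= self.release_depth:
            self.pressed = False
            return "release_feedback"  # stage 2: burst on withdrawal
        return None
```

Feeding a sampled finger-depth trajectory through `update` yields exactly one event per stage, matching the press-then-withdraw sequence of the pressing motion.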
Drilling is an essential procedure in many orthopedic surgeries. In femoral neck fracture surgery in particular, any loss of control can lead to secondary injury to the patient. Consequently, it is imperative for orthopedic surgeons to undergo extensive training to master the appropriate haptic interaction. In this work, a virtual drilling surgery training system with high-fidelity haptic feedback is developed for femoral neck fracture. First, to achieve realistic visual rendering, the external boundary of the femur is reconstructed using the triple-dexel representation, allowing for fast and realistic surface reconstruction. Next, the heterogeneous and hierarchical structure of the femur is reconstructed using a multi-morphology triply periodic minimal surface (TPMS) design approach. Then, a haptic interaction algorithm based on the triple-dexel representation and TPMS is proposed. This algorithm uses the TPMS equations to quickly compute the densities of the bone chips produced by triple-dexel Boolean operations, thus simulating the variation in porosity at different positions in the bone and providing precise haptic feedback during the drilling process. Finally, a robotic experimental platform is established, and the proposed force model is verified using bovine bones. The results show that the model's predicted values are consistent with the experimental measurements. Furthermore, an evaluation of the training system with medical students as participants shows that fine haptic rendering improves users' hand-eye coordination skills.
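The idea of using a TPMS equation to assign a density to removed material can be illustrated with the gyroid, a standard TPMS. The sketch below is an assumption-laden toy (the threshold, density values, and cutting coefficient are invented for illustration), not the paper's force model.

```python
import math

def gyroid(x, y, z, cell_size=1.0):
    """Gyroid TPMS field; its zero level set is the classic gyroid surface."""
    k = 2.0 * math.pi / cell_size  # spatial frequency of one unit cell
    return (math.sin(k * x) * math.cos(k * y)
            + math.sin(k * y) * math.cos(k * z)
            + math.sin(k * z) * math.cos(k * x))

def chip_density(x, y, z, threshold=0.3, solid=1.0, porous=0.2):
    """Map the field to a bone-chip density: treat points where the field
    magnitude exceeds the threshold as solid strut material, the rest as
    porous. The threshold controls local porosity (illustrative values)."""
    return solid if abs(gyroid(x, y, z)) > threshold else porous

def drilling_force(removed_positions, k_cut=0.8):
    """Hypothetical cutting coefficient times summed chip density over the
    dexel cells removed by one Boolean operation in a simulation step."""
    return k_cut * sum(chip_density(*p) for p in removed_positions)
```

Because the density is evaluated analytically from the TPMS equation at each removed cell, no volumetric density map needs to be stored, which is what makes the per-step force computation fast.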
Rendering affective touch through haptic interfaces has gathered significant interest due to its ability to elicit emotional responses. Among various forms of affective touch, this study focuses on stroke stimuli. An illusory stroke stimulus is rendered using eight discrete Pneumatic Unit Cell (PUC) actuators on the left forearm. The study systematically investigates how rendering parameters (the traveling speed of the illusory stroke, the stimulus onset asynchrony (SOA) of consecutive indentations, and the indentation pressure) affect the perceived pleasantness and continuity of the stimulus. Results reveal that higher speeds significantly improve both pleasantness and continuity, with speed emerging as the most influential factor. In contrast, SOA has no significant effect on either perceived pleasantness or continuity. Indentation pressure has a moderate impact on pleasantness, with high pressures reducing pleasantness, but no significant effect on continuity. Additionally, a positive correlation is observed between perceived pleasantness and continuity, underscoring the relevance of the continuity illusion created by sequential indentations with discrete actuators in evoking pleasant sensations. These findings demonstrate the potential of PUC actuators for creating affective touch stimuli and provide preliminary insights into the influence of rendering parameters on affective touch in human-machine and human-robot interactions.
Online retail is still mostly limited to the visual channel despite advances in haptic interface technology. One potential strategy for overcoming the lack of touch in online retail is pseudo-haptics: illusory haptic sensations resulting from manipulating the visual feedback of mouse or touchscreen interactions. Previous research used computer-generated graphics for pseudo-haptic experiences, while online retailers rely heavily on accurate photos of their products. Therefore, our study proposes a novel approach to designing pseudo-haptics using interactive photograph series together with mouse cursor gain modulations, called Pseudo-Haptic Photograph Interaction (PHPI). Unlike prior approaches that rely on simulated or stylized imagery, PHPI introduces pseudo-haptic effects through real photographic sequences of fabric motion, bridging the gap between visual realism and interactive haptic simulation. We conducted user studies on the perception of stiffness and weight to validate our approach. In Experiment 1, we investigated the relation between the perception of weight and stiffness and increased or decreased mouse-movement gain. The results reveal a strong relation between mouse gain and perception. To test whether this corresponds to pseudo-haptic sensations, we performed Experiment 2, in which actual fabrics had to be matched with those displayed through PHPI. We found a correlation between the haptically perceived weight and stiffness of the fabrics and those of their digital surrogates mediated by visual cues, confirming the potential of PHPI for multimodal experiences in online retail and other photographic presentations.
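The core mechanism of coupling mouse gain to a photograph sequence can be sketched as a mapping from drag distance to frame index. Everything here (function name, pixels-per-frame constant, clamping) is a hypothetical illustration of the idea, not the PHPI implementation.

```python
def frame_from_drag(drag_px, gain, n_frames, px_per_frame=20.0):
    """Map a mouse drag distance (pixels) to a photo-sequence frame index.

    With gain < 1 the fabric deforms less per unit of mouse movement,
    which the pseudo-haptics literature associates with feeling stiffer
    or heavier; gain > 1 yields more deformation per movement and tends
    to feel more compliant or lighter.
    """
    frame = int(drag_px * gain / px_per_frame)
    return max(0, min(frame, n_frames - 1))  # clamp to the sequence
```

Because only the frame-selection gain changes, the very same photographic sequence can be made to feel like different fabrics, which is what the matching task in Experiment 2 probes.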
Kinesthetic illusions, which arise when muscle spindles are activated by vibration, provide a compact means of presenting kinesthetic sensations. Because muscle spindles contribute not only to sensing body movement but also to perceiving heaviness, vibration-induced illusions could potentially modulate weight perception. While prior studies have primarily focused on conveying virtual movement, the modulation of perceived heaviness has received little attention. Presenting a sense of heaviness is essential for enriching haptic interactions with virtual objects. This study investigates whether multi-point tendon vibration can increase or decrease perceived heaviness (Experiment 1) and how the magnitude of the effect can be systematically controlled (Experiment 2). The results show that tendon vibration significantly increases perceived heaviness but does not significantly decrease it, although a decreasing trend was observed. Moreover, the increase can be adjusted across at least three levels within the range from 350 g to 450 g. Finally, we discuss plausible mechanisms underlying this vibration-induced modulation of weight perception.
Multimodal haptic feedback that combines electrical muscle stimulation (EMS) and vibrotactile signals can create richer, more immersive experiences than those using a single modality. EMS delivers kinesthetic feedback by inducing muscle contractions, simulating force sensations that complement tactile stimuli from mechanical vibrations. However, presenting these stimuli concurrently can lead to perceptual interference, where one modality masks or alters the perception of the other. Temporal alignment between stimuli is also critical, as asynchrony can affect the perceived quality of haptic sensations. To investigate these phenomena, we conducted three user studies with a total of 40 participants (12, 12, and 16, respectively), focusing on mutual masking effects and temporal order perception between EMS and vibration. Our findings suggest that vibration can alleviate the tingling and discomfort commonly associated with EMS, effectively mitigating these unwanted sensations. Conversely, the presence of EMS increases the Just Noticeable Difference (JND) in vibration frequency discrimination, indicating a decrease in sensitivity to vibratory changes. Additionally, participants generally perceived the stimuli as simultaneous when EMS preceded vibration by 100 to 200 milliseconds. We discuss these findings and present four design guidelines for multimodal haptic rendering with EMS and vibrations in user applications.
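The reported simultaneity window (EMS leading vibration by 100 to 200 ms) translates directly into a scheduling rule for multimodal rendering. The sketch below encodes that finding; the 150 ms default is simply the midpoint of the reported range, chosen here as an illustrative value.

```python
def schedule_multimodal(t_vibration_ms, ems_lead_ms=150.0):
    """Schedule EMS onset ahead of vibration onset so the pair is
    perceived as simultaneous. Per the study's finding, leads of
    100-200 ms yield perceived simultaneity; 150 ms is the midpoint
    (an illustrative default, not a prescribed value)."""
    if not 100.0 <= ems_lead_ms <= 200.0:
        raise ValueError("lead outside the reported simultaneity window")
    return {"ems_on_ms": t_vibration_ms - ems_lead_ms,
            "vibration_on_ms": t_vibration_ms}
```

A renderer that triggers both modalities at the same clock time would, by this finding, be perceived as vibration-first; compensating for the slower EMS response by pre-triggering it restores perceived synchrony.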
This paper investigates the notion of "Persuasive Vibrations": the finding that augmenting a person's speech with vibrotactile feedback can artificially increase persuasion. However, while the initial paper demonstrated the effect, the underlying reasons why vibrations enhance persuasion remain unknown. Through two user studies, this paper examines how the parameters of the vibratory feedback (e.g., frequency, amplitude, or audio-vibration synchronization) influence persuasion. The first study aimed to identify the parameters of vibrotactile feedback that can positively influence persuasion. The second study evaluated vibrotactile feedback that might impair the persuasive effect. In a nutshell, the first experiment suggests that isolating different properties of the vibratory signal tends to yield higher persuasion than no vibratory feedback, with a low frequency of 100 Hz appearing to be the most effective at generating a persuasive effect. In contrast, the second experiment suggests that some alterations of the vibratory signal (e.g., latency) do not decrease persuasion compared with the no-vibration condition. All in all, the results suggest that lower frequencies have a stronger effect on persuasion. These results could serve as a basis for haptic design in applications such as videoconferencing, virtual meetings, and training systems where supporting user speech is essential.
This paper proposes the use of reaction wheels in parallel mechanisms for physical human-robot interaction during the co-manipulation of large payloads. The concept combines the advantages of a mechanically backdrivable robot (for hands-on-payload interaction) with the reactiveness of flywheels for the compensation of inertial loads, thereby leading to smooth, low-inertia rendering. In the proposed approach, gravity compensation and dynamic compensation are partitioned and assigned to two subsets of actuators, namely the backdrivable joint actuators and the flywheel actuators, the latter being smaller, suitably geared actuators that benefit from faster dynamics for interaction stability. Simulation results of a human interacting with a planar robot to displace a payload show that the desired dynamic behaviour of the moving platform is correctly rendered, while indicating that the inertia compensation torques may vary more quickly than the gravity torques, which supports the proposed idea. Experiments are also conducted to validate the rendering of the desired virtual dynamics to the user.
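The partitioning idea (slow gravity terms to the backdrivable joint actuators, fast inertial terms to the geared flywheels) can be sketched as a simple torque split. The model callbacks `gravity_fn` and `inertia_fn` are placeholders standing in for the robot's dynamic model; none of these names come from the paper.

```python
import numpy as np

def partitioned_torques(q, dq, ddq_des, gravity_fn, inertia_fn):
    """Split compensation between two actuator subsets.

    Quasi-static gravity loads, which vary slowly with configuration q,
    go to the backdrivable joint actuators; inertial terms, which vary
    quickly with the desired acceleration ddq_des, go to the flywheel
    actuators (smaller, geared, faster dynamics). Illustrative sketch
    of the partitioning principle only; Coriolis/centrifugal terms and
    the flywheel gear mapping are omitted for brevity.
    """
    tau_joints = gravity_fn(q)               # slow: gravity compensation
    tau_flywheels = inertia_fn(q) @ ddq_des  # fast: inertial compensation
    return tau_joints, tau_flywheels
```

The split mirrors the simulation observation quoted above: the flywheel channel must track a faster-varying signal than the joint channel, which motivates assigning it the actuators with the faster dynamics.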

