How Different Tangible and Virtual Objects Can Be While Still Feeling the Same?
Pub Date: 2019-07-01 | DOI: 10.1109/WHC.2019.8816164 | 2019 IEEE World Haptics Conference (WHC), pp. 580-585
Xavier de Tinguy, C. Pacchierotti, Mathieu Emily, Mathilde Chevalier, Aurelie Guignardat, Morgan Guillaudeux, Chloe Six, A. Lécuyer, M. Marchal
Tangible objects are used in Virtual Reality to provide human users with distributed haptic sensations when grasping virtual objects. To achieve a compelling illusion, there should be a good correspondence between the haptic features of the tangible object and those of the corresponding virtual one, i.e., what users see in the virtual environment should match as closely as possible what they touch in the real world. This paper aims at quantifying how similar tangible and virtual objects need to be, in terms of haptic perception, to still feel the same. As it is often not possible to create tangible replicas of all the virtual objects in the scene, it is important to understand how different tangible and virtual objects can be without the user noticing. This paper reports on the just-noticeable difference (JND) when grasping, with a thumb-index pinch, a tangible object that differs from a seen virtual one on three important haptic features: width, local orientation, and curvature. Results show JND values of 5.75%, 43.8%, and 66.66% of the reference shape for the width, local orientation, and local curvature features, respectively. These results will enable researchers in the field of Virtual Reality to use a reduced number of tangible objects to render multiple virtual ones.
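As a rough illustration of how such JND values could be used when assigning a tangible proxy to a virtual object, the Python sketch below checks whether the relative mismatch on each feature stays below the reported thresholds. It is an assumed example, not part of the paper: the feature values and the acceptance rule are hypothetical.

```python
# Hypothetical sketch: decide whether a tangible proxy is "close enough" to a
# virtual object, using the JND values reported above (as fractions of the
# reference feature value). Object data and the acceptance rule are illustrative.

JND = {"width": 0.0575, "orientation": 0.438, "curvature": 0.6666}

def mismatch_below_jnd(tangible: dict, virtual: dict) -> bool:
    """Return True if every feature mismatch stays below the reported JND."""
    for feature, jnd in JND.items():
        reference = virtual[feature]
        if reference == 0:
            continue  # skip zero-valued reference features to avoid division by zero
        relative_error = abs(tangible[feature] - virtual[feature]) / abs(reference)
        if relative_error >= jnd:
            return False
    return True

# Example: a 4 cm wide virtual object rendered with a slightly narrower tangible proxy.
virtual_obj = {"width": 4.0, "orientation": 10.0, "curvature": 0.5}
tangible_obj = {"width": 3.85, "orientation": 12.0, "curvature": 0.6}
print(mismatch_below_jnd(tangible_obj, virtual_obj))  # True: all mismatches under JND
```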
{"title":"How Different Tangible and Virtual Objects Can Be While Still Feeling the Same?","authors":"Xavier de Tinguy, C. Pacchierotti, Mathieu Emily, Mathilde Chevalier, Aurelie Guignardat, Morgan Guillaudeux, Chloe Six, A. Lécuyer, M. Marchal","doi":"10.1109/WHC.2019.8816164","DOIUrl":"https://doi.org/10.1109/WHC.2019.8816164","url":null,"abstract":"Tangible objects are used in Virtual Reality to provide human users with distributed haptic sensations when grasping virtual objects. To achieve a compelling illusion, there should be a good correspondence between the haptic features of the tangible object and those of the corresponding virtual one, i.e., what users see in the virtual environment should match as much as possible what they touch in the real world. This paper aims at quantifying how similar tangible and virtual objects need to be, in terms of haptic perception, to still feel the same. As it is often not possible to create tangible replicas of all the virtual objects in the scene, it is important to understand how different tangible and virtual objects can be without the user noticing. This paper reports on the just-noticeable difference (JND) when grasping, with a thumb-index pinch, a tangible object which differ from a seen virtual one on three important haptic features: width, local orientation, and curvature. Results show JND values of 5.75%, 43.8%, and 66.66% of the reference shape for the width, local orientation, and local curvature features, respectively. These results will enable researchers in the field of Virtual Reality to use a reduced number of tangible objects to render multiple virtual ones.","PeriodicalId":6702,"journal":{"name":"2019 IEEE World Haptics Conference (WHC)","volume":"81 1","pages":"580-585"},"PeriodicalIF":0.0,"publicationDate":"2019-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"91383251","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Localized Rendering of Button Click Sensation via Active Lateral Force Feedback
Pub Date: 2019-07-01 | DOI: 10.1109/WHC.2019.8816158 | 2019 IEEE World Haptics Conference (WHC), pp. 509-514
Heng Xu, R. Klatzky, M. Peshkin, J. Colgate
We have developed a novel button click rendering mechanism based on active lateral force feedback. The effect can be localized because electroadhesion between a finger and a surface can be localized. We conducted psychophysical experiments to evaluate the quality of the rendered button click, which subjects judged to be acceptable. We can thus generate a button click on a flat surface without macroscopic motion of the surface in the lateral or normal direction, and we can localize this haptic effect to an individual finger. This mechanism is promising for touch-typing keyboard rendering (“multi-click”).
{"title":"Localized Rendering of Button Click Sensation via Active Lateral Force Feedback","authors":"Heng Xu, R. Klatzky, M. Peshkin, J. Colgate","doi":"10.1109/WHC.2019.8816158","DOIUrl":"https://doi.org/10.1109/WHC.2019.8816158","url":null,"abstract":"We have developed a novel button click rendering mechanism based on active lateral force feedback. The effect can be localized because electroadhesion between a finger and a surface can be localized. We did psychophysical experiments to evaluate the quality of a rendered button click, which subjects judged to be acceptable. We can thus generate a button click on a flat surface without macroscopic motion of the surface in the lateral or normal direction, and we can localize this haptic effect to an individual finger. This mechanism is promising for touch-typing keyboard rendering (“multi-click”).","PeriodicalId":6702,"journal":{"name":"2019 IEEE World Haptics Conference (WHC)","volume":"2016 1","pages":"509-514"},"PeriodicalIF":0.0,"publicationDate":"2019-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"90848386","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Synchronized Running: Running Support System for Guide Runners by Haptic Sharing in Blind Marathon
Pub Date: 2019-07-01 | DOI: 10.1109/WHC.2019.8816134 | 2019 IEEE World Haptics Conference (WHC), pp. 25-30
Tomohisa Hirano, Junichi Kanebako, M. Y. Saraiji, R. Peiris, K. Minamizawa
Blind marathon is a sport in which visually impaired people run in pairs with sighted guides. In this paper, we present an assistance system for blind marathon runners called "Synchronized Running" that improves the guidance experience for the runners. Our proposed system allows both runners to match their running tempo and synchronize with each other, similar to a three-legged race, without any direct physical attachment. Two modules, one on each runner's ankle, measure the acceleration of the visually impaired runner and provide haptic feedback to the guide's ankle according to the tempo of the running pace. This synchronization allows the guide to find a running pace comfortable for the visually impaired runner, allowing seamless running communication between both runners. The evaluation results indicate that our system encourages runners (primarily novice guides) to achieve a comfortable guided-running experience with blind runners.
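For illustration only, the sketch below shows one generic way such a system might detect foot strikes from the visually impaired runner's ankle acceleration and replay the tempo as vibration pulses on the guide's side. The threshold values, the synthetic trace, and the pulse output are assumptions, not the authors' implementation.

```python
# Illustrative sketch (not the authors' implementation): estimate the runner's
# cadence from ankle acceleration peaks and emit one pulse per detected step.
# STEP_THRESHOLD, MIN_STEP_GAP, and the synthetic trace are assumed values.

STEP_THRESHOLD = 1.5   # assumed acceleration magnitude (in g) marking a foot strike
MIN_STEP_GAP = 0.25    # assumed minimum time between consecutive foot strikes (s)

def detect_step_times(samples, dt):
    """Return foot-strike times from an acceleration-magnitude trace."""
    step_times, last = [], -MIN_STEP_GAP
    for i, a in enumerate(samples):
        t = i * dt
        if a > STEP_THRESHOLD and t - last >= MIN_STEP_GAP:
            step_times.append(t)
            last = t
    return step_times

def vibrate_at_tempo(step_times):
    """Placeholder for the guide-side module: one vibration pulse per detected step."""
    for t0, t1 in zip(step_times, step_times[1:]):
        print(f"pulse at t={t1:.2f}s (step period {t1 - t0:.2f}s)")

# Synthetic trace: an acceleration spike every ~0.4 s over a low background level.
dt = 0.02
trace = [2.0 if i % 20 == 0 else 0.3 for i in range(200)]
vibrate_at_tempo(detect_step_times(trace, dt))
```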
{"title":"Synchronized Running: Running Support System for Guide Runners by Haptic Sharing in Blind Marathon","authors":"Tomohisa Hirano, Junichi Kanebako, M. Y. Saraiji, R. Peiris, K. Minamizawa","doi":"10.1109/WHC.2019.8816134","DOIUrl":"https://doi.org/10.1109/WHC.2019.8816134","url":null,"abstract":"Blind marathon is a sport where visually impaired people can run with sighted guides in pairs. In this paper, we present an assistant system for blind marathon runners called \"Synchronized Running\" that improves the guidance experience for the runners. Our proposed system allows both runners to match their running tempo and synchronize with each other, similar to a three-legged race case, without any direct physical attachment. Two modules are located on the ankle of both runners, that measure the acceleration of the visually impaired runner, and provide haptic feedback to the guide’s ankle according the tempo of the running pace. This synchronization allows the guide to grasp a comfortable running pace towards the visually impaired person, allowing seamless running communication between both runners. The evaluation results indicate that our system encourage runners (primarily novice guides) to achieve comfortable guidance running experience toward the blind runners.","PeriodicalId":6702,"journal":{"name":"2019 IEEE World Haptics Conference (WHC)","volume":"129 7","pages":"25-30"},"PeriodicalIF":0.0,"publicationDate":"2019-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"91402891","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A Vibrotactile Vest for Remote Human-Dog Communication
Pub Date: 2019-07-01 | DOI: 10.1109/WHC.2019.8816079 | 2019 IEEE World Haptics Conference (WHC), pp. 556-561
Yoav Golan, Ben Serota, Amir Shapiro, O. Shriki, I. Nisky
Dogs have been helping humans in different ways since prehistoric times. Modern working dogs perform tasks ranging from search-and-rescue to bomb detection, but relatively little work has been done on the use of technology with working dogs. As a result, communication with working dogs is still predominantly visual and auditory. In this paper, we introduce a vest with four embedded vibration motors in specially designed motor housings. The vest applies vibrotactile cues to the dog wearing it, and the dog is trained to associate the cues with practical commands. The commands are issued to the vest by a handler with a wireless remote. We demonstrate the vest with a test subject: a six-year-old male Labrador Retriever/German Shepherd Dog crossbreed. We test the subject's perception threshold for haptic cues and its proficiency in understanding several haptic cues that differ in location and/or waveform. Our case study shows that the dog was able to successfully learn haptic commands in this way. This apparatus may prove beneficial for search-and-rescue purposes, working dog operation, training deaf dogs, and training by handlers with speech impairments.
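A hypothetical sketch of a command-to-cue mapping that such a remote could use is given below. The command names, motor positions, and waveforms are illustrative assumptions, not the cue set trained in the study.

```python
# Hypothetical mapping from handler commands to vibrotactile cues on a four-motor
# vest, where cues differ in location and/or waveform. All entries are assumed
# examples for illustration, not the commands used in the paper.

CUES = {
    # command: (motor position on the vest, waveform description)
    "forward": ("back",  "continuous 1 s"),
    "left":    ("left",  "continuous 1 s"),
    "right":   ("right", "continuous 1 s"),
    "come":    ("chest", "three short pulses"),
}

def send_command(command: str) -> None:
    """Placeholder for the wireless remote: trigger the cue mapped to one command."""
    motor, waveform = CUES[command]
    print(f"activating {motor} motor with {waveform}")

send_command("left")
```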
{"title":"A Vibrotactile Vest for Remote Human-Dog Communication","authors":"Yoav Golan, Ben Serota, Amir Shapiro, O. Shriki, I. Nisky","doi":"10.1109/WHC.2019.8816079","DOIUrl":"https://doi.org/10.1109/WHC.2019.8816079","url":null,"abstract":"Dogs have been helping humans in different ways since prehistoric times. Modern working dogs perform tasks ranging from search-and-rescue to bomb detection, but relatively little work has been done on the use of technology with working dogs. Therefore, communication with working dogs is still predominantly visual and audial. In this paper, we introduce a vest with four embedded vibration motors in specially designed motor housings. The vest applies vibrotactile cues to the dog that wears it, and the dog is trained to associate the cues with practical commands. The commands are issued to the vest from a handler with a wireless remote. We demonstrate the vest using a test subject: a six year old male Labrador Retriever/German Shepherd Dog crossbreed. We test the perception threshold of the test subject to haptic cues, and its proficiency in understanding several haptic cues. These cues differ in location and/or waveform. Our case study shows that the dog was able to successfully learn haptic commands in this way. This apparatus may prove beneficial for search and rescue purposes, working dog operation, training deaf dogs, and training by handlers with speech impairments.","PeriodicalId":6702,"journal":{"name":"2019 IEEE World Haptics Conference (WHC)","volume":"105 1","pages":"556-561"},"PeriodicalIF":0.0,"publicationDate":"2019-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"89530074","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A Compact Skin-Shear Device using a Lead-Screw Mechanism
Pub Date: 2019-07-01 | DOI: 10.1109/WHC.2019.8816105 | 2019 IEEE World Haptics Conference (WHC), pp. 527-532
P. Sreetharan, A. Israr, Priyanshu Agarwal
We present a skin-shear actuator based on the lead-screw mechanism. The lead-screw mechanism is simple and reliable, requires few components, and fits into compact form factors. We present the mechanical design of a single assembly unit and implement multiple units in a single handheld device. We evaluate the actuator in one instrumentation-based test and one preliminary user study. Tests show that the actuator performance matches the open-loop control scheme when no load is placed on the actuator. The performance deteriorates with loading, particularly when quicker, higher-amplitude strokes are required. The user study shows that information throughput with the skin-shear cues is comparable to that of vibrations delivered through three digits on the hand. We show that small, compact actuators (~5 g) with efficient mechanisms can render displacements (>3 mm) and forces (>1 N) that produce easily distinguishable skin-shear cues.
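The basic relation that makes a lead screw suitable for millimetre-scale strokes is that the nut (and any tactor attached to it) translates by one screw lead per motor revolution. The sketch below illustrates that relation with an assumed lead value, not one taken from the paper.

```python
# Back-of-the-envelope lead-screw kinematics for a skin-shear actuator:
# axial travel = screw lead * number of revolutions. LEAD_MM is an assumption.

LEAD_MM = 1.0  # assumed axial travel per motor revolution (mm)

def motor_angle_for_stroke(stroke_mm: float) -> float:
    """Motor rotation (degrees) needed to displace the tactor by `stroke_mm`."""
    return 360.0 * stroke_mm / LEAD_MM

# A 3 mm stroke, matching the displacement range reported above, needs three turns here.
print(motor_angle_for_stroke(3.0))  # -> 1080.0 degrees
```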
{"title":"A Compact Skin-Shear Device using a Lead-Screw Mechanism","authors":"P. Sreetharan, A. Israr, Priyanshu Agarwal","doi":"10.1109/WHC.2019.8816105","DOIUrl":"https://doi.org/10.1109/WHC.2019.8816105","url":null,"abstract":"We present a skin-shear actuator based on the lead screw mechanism. The lead screw mechanism is simple, reliable, offers fewer components, and accommodates into compact form- factors. We show mechanical design of a single assembly unit and implement multiple units in a single handheld device. We evaluate the actuator in one instrumentation-based test and one preliminary user study. Tests show that the actuator performance matches with the open-loop control scheme when no load is placed on the actuator. The performance deteriorates with loading, particularly when quicker and high amplitude stroke are required. The user study shows that information throughput with the skin-shear is comparable to vibrations through three digits on the hand. It is shown that small compact actuators (~5g) with efficient mechanisms can render displacements (>3mm) and forces (>1N) for easily differentiating skin-shear cues.","PeriodicalId":6702,"journal":{"name":"2019 IEEE World Haptics Conference (WHC)","volume":"104 1","pages":"527-532"},"PeriodicalIF":0.0,"publicationDate":"2019-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"82707122","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Enhancing a robot gripper with haptic perception for risk mitigation in physical human robot interaction
Pub Date: 2019-07-01 | DOI: 10.1109/WHC.2019.8816109 | 2019 IEEE World Haptics Conference (WHC), pp. 253-258
Christoph Hellmann, A. Bajrami, W. Kraus
Using a two-finger robot gripper in physical human-robot interaction carries the risk of clamping human fingers in the gripper. In this paper, we formulate a new grasp strategy that aborts a grasp if a human body part is grasped instead of a workpiece. The strategy integrates a pressure-based haptic exploratory procedure seamlessly into the grasp process. It uses force and deformation data gathered in the exploratory procedure to distinguish human body parts from workpieces. We compare a support vector machine (SVM) and a random forest classifier for this task. We validate the grasp strategy in grasping experiments with a two-finger gripper using a dummy hand and real human hands. With this strategy, grasps can be aborted without exceeding the maximum permissible grasp force for collisions with humans. The SVM classifier achieves an accuracy of 99.06% and a recall of 99.997% on our experimental data. Classification takes only 3.65 ms on embedded hardware. The SVM outperforms the random forest classifier.
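As a minimal sketch of the classification step, one could compare the two classifiers on feature vectors built from probing force and measured deformation, as below. The synthetic data only mimics the intuition that soft tissue deforms more than a rigid workpiece at a given force; it is not the authors' dataset or pipeline.

```python
# Illustrative comparison of an SVM and a random forest for labelling a grasp as
# "human" or "workpiece" from force/deformation features. Synthetic data only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 500
force = rng.uniform(5, 40, size=2 * n)                     # probing force (N)
stiff_deform = force[:n] / 200 + rng.normal(0, 0.01, n)    # workpiece: little deformation
soft_deform = force[n:] / 20 + rng.normal(0, 0.05, n)      # human tissue: large deformation
X = np.column_stack([force, np.concatenate([stiff_deform, soft_deform])])
y = np.array([0] * n + [1] * n)                            # 0 = workpiece, 1 = human

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
for clf in (SVC(kernel="rbf"), RandomForestClassifier(n_estimators=100)):
    clf.fit(X_train, y_train)
    print(type(clf).__name__, clf.score(X_test, y_test))
```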
{"title":"Enhancing a robot gripper with haptic perception for risk mitigation in physical human robot interaction","authors":"Christoph Hellmann, A. Bajrami, W. Kraus","doi":"10.1109/WHC.2019.8816109","DOIUrl":"https://doi.org/10.1109/WHC.2019.8816109","url":null,"abstract":"Utilising a two finger robot gripper in physical human robot interaction bears the risk of clamping fingers in the gripper. In this paper, we formulate a new grasp strategy which aborts grasps if a human body part is grasped instead of a workpiece. The strategy integrates a pressure-based haptic exploratory procedure seamlessly into the grasp process. It uses force and deformation data gathered in the exploratory procedure to distinguish human body parts from workpieces. We compare a support vector machine (SVM) and a random forest classifier for this task. The validation of the grasp strategy is carried out by grasping experiments with a two finger gripper in which a dummy hand and real human hands are used. Using this strategy grasps can be aborted without exceeding the maximum permissible grasp force for collisions with humans. The SVM classifier achieves an accuracy of 99.06% and a recall of 99.997% on our experimental data. Classification only takes 3.65 ms on embedded hardware. The SVM outperforms the random forest classifier.","PeriodicalId":6702,"journal":{"name":"2019 IEEE World Haptics Conference (WHC)","volume":"14 1","pages":"253-258"},"PeriodicalIF":0.0,"publicationDate":"2019-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"83695350","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Fundamental Acoustical Finger Force Calculation for Out-of-Plane Ultrasonic Vibration and its Correlation with Friction Reduction
Pub Date: 2019-07-01 | DOI: 10.1109/WHC.2019.8816168 | 2019 IEEE World Haptics Conference (WHC), pp. 413-418
Anis Kaci, Angelica Torres, F. Giraud, C. Giraud-Audine, M. Amberg, B. Lemaire-Semail
When a finger touches an ultrasonically vibrating plate, a non-sinusoidal contact force appears, termed the acoustical finger force. In this paper, we present a method to observe its fundamental component in the case of a friction-reduction haptic interface. We establish that the method can run online on a small microcontroller. We show a correlation between this measurement and the friction experienced when sliding the finger. We lay down a model that predicts the friction coefficient and the friction contrast; it gives consistent output for 10 out of 12 participants with different biomechanical skin parameters.
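The abstract states only that the fundamental component can be observed online on a small microcontroller. One generic, lightweight way to extract a signal's component at a known drive frequency is synchronous (lock-in style) demodulation, sketched below. This illustrates that general idea, not the authors' method; the drive frequency, sampling rate, and test signal are assumptions.

```python
import math

# Generic sketch: amplitude of the component of a sampled force signal at a known
# ultrasonic drive frequency, via in-phase/quadrature correlation. Parameters and
# the synthetic test signal are assumed for illustration only.

f0 = 30000.0   # assumed ultrasonic drive frequency (Hz)
fs = 200000.0  # assumed sampling rate (Hz)
N = 2000       # samples per estimation window

def fundamental_amplitude(samples):
    """Return the amplitude of the component of `samples` at frequency f0."""
    i_sum = q_sum = 0.0
    for n, x in enumerate(samples):
        phase = 2 * math.pi * f0 * n / fs
        i_sum += x * math.cos(phase)
        q_sum += x * math.sin(phase)
    return 2 * math.hypot(i_sum, q_sum) / len(samples)

# Test: a 0.2 N component at f0 buried under an offset and a low-frequency term.
signal = [0.5 + 0.2 * math.sin(2 * math.pi * f0 * n / fs) +
          0.1 * math.sin(2 * math.pi * 50 * n / fs) for n in range(N)]
print(fundamental_amplitude(signal))  # ~0.2
```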
{"title":"Fundamental Acoustical Finger Force Calculation for Out-of-Plane Ultrasonic Vibration and its Correlation with Friction Reduction","authors":"Anis Kaci, Angelica Torres, F. Giraud, C. Giraud-Audine, M. Amberg, B. Lemaire-Semail","doi":"10.1109/WHC.2019.8816168","DOIUrl":"https://doi.org/10.1109/WHC.2019.8816168","url":null,"abstract":"When a finger touches an ultrasonic vibrating plate, non-sinusoidal contact force appears, named acoustical finger force. In this paper, we present a method to observe its fundamental in the case of a friction reduction haptic interface. The capability of the method to be achieved on-line, in a small micro-controller, is established. We show a correlation between this measurement and the friction when sliding the finger. A model that predicts the friction coefficient and the friction contrast is laid down; it gives consistent output for 10 participants out of 12 having different biomechanical skin parameters of the skin.","PeriodicalId":6702,"journal":{"name":"2019 IEEE World Haptics Conference (WHC)","volume":"70 1","pages":"413-418"},"PeriodicalIF":0.0,"publicationDate":"2019-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"89311947","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Uncoupled Stability of a Haptic System with Position-Velocity Sampling
Pub Date: 2019-07-01 | DOI: 10.1109/WHC.2019.8816076 | 2019 IEEE World Haptics Conference (WHC), pp. 473-478
Victor A. Luna Laija, Daniel Cleveland, K. Hashtrudi-Zaad
In typical haptic simulation systems, sampled position from encoders is used to compute the virtual environment force. For dynamic environments, velocity is numerically approximated from the sampled position. In this paper, we analytically studied the uncoupled stability of a haptic simulation system when the discrete velocity needed to implement a damper-spring virtual environment came from sampling an analog velocity signal. Since typical analog velocity sensors add inertia to the system or contain ripple, we implemented a high-pass filter to estimate the analog velocity from the analog position output of a potentiometer. We analytically and experimentally assessed the uncoupled stability of this system for various filter cut-off frequencies and sampling rates.
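A first-order high-pass (washout) filter with transfer function s/(τs + 1) behaves as a differentiator well below its cut-off, which is one standard way to obtain a velocity estimate from an analog position signal. The sketch below discretizes such a filter with a backward difference; the cut-off frequency, sampling rate, and sinusoidal test trajectory are assumptions for illustration, not the parameters studied in the paper.

```python
import math

# Sketch of a discretized first-order high-pass filter used as a velocity estimator
# from sampled position: tau*dv/dt + v = dx/dt, discretized with a backward
# difference. fc, fs, and the test trajectory are assumed values.

fc = 50.0    # assumed filter cut-off frequency (Hz)
fs = 1000.0  # assumed sampling rate (Hz)
T = 1.0 / fs
tau = 1.0 / (2 * math.pi * fc)

def estimate_velocity(positions):
    """Return per-sample velocity estimates from a position trace."""
    v, x_prev, out = 0.0, positions[0], []
    for x in positions:
        v = (tau * v + (x - x_prev)) / (tau + T)
        x_prev = x
        out.append(v)
    return out

# Test: 2 Hz, 10 mm sine; true peak velocity is 2*pi*2*10 ~ 126 mm/s.
pos = [10.0 * math.sin(2 * math.pi * 2.0 * k * T) for k in range(1000)]
print(max(estimate_velocity(pos)))
```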
{"title":"Uncoupled Stability of a Haptic System with Position-Velocity Sampling","authors":"Victor A. Luna Laija, Daniel Cleveland, K. Hashtrudi-Zaad","doi":"10.1109/WHC.2019.8816076","DOIUrl":"https://doi.org/10.1109/WHC.2019.8816076","url":null,"abstract":"In typical haptic simulation systems, sampled position from encoders is utilized to compute the virtual environment force. For dynamic environments, velocity is numerically approximated using sampled position. In this paper, we analytically studied the uncoupled stability of a haptic simulation system when the discrete velocity needed to implement a damper-spring virtual environment came from sampling analog velocity. Since typical analog velocity sensors add inertia to the system or contain ripple, we implemented a high-pass filter to estimate the analog velocity from a potentiometer analog position output. We analytically and experimentally assessed the uncoupled stability for this system for various filter cut-off frequencies and sampling rates.","PeriodicalId":6702,"journal":{"name":"2019 IEEE World Haptics Conference (WHC)","volume":"17 1","pages":"473-478"},"PeriodicalIF":0.0,"publicationDate":"2019-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"88442563","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Dynamics of exploration in haptic search
Pub Date: 2019-07-01 | DOI: 10.1109/WHC.2019.8816174 | 2019 IEEE World Haptics Conference (WHC), pp. 277-282
Anna Metzger, M. Toscani, Matteo Valsecchi, K. Drewing
Haptic search is a common everyday task. Here we characterize the movement dynamics of haptic search. Participants searched for a particular configuration of symbols on a tactile display. We compared the exploratory behavior of the fingers in proximity to potential targets: when any finger encountered a potential target, there was a higher probability that subsequent exploration was performed by the index or the middle finger. At the same time, the middle and index fingers dramatically slowed down. While in contact with the potential target, the index and middle fingers moved within a smaller area than the other fingers, which instead seemed to move away to leave them space. Our results corroborate a previous hypothesis [1] that haptic search consists of two phases: a process of target search using all fingers, and a target analysis using the index and middle fingers, which might be specialized for fine analysis.
{"title":"Dynamics of exploration in haptic search*","authors":"Anna Metzger, M. Toscani, Matteo Valsecchi, K. Drewing","doi":"10.1109/WHC.2019.8816174","DOIUrl":"https://doi.org/10.1109/WHC.2019.8816174","url":null,"abstract":"Haptic search is a common every day task. Here we characterize the movement dynamics in haptic search. Participants searched for a particular configuration of symbols on a tactile display. We compared the exploratory behavior of the fingers in proximity to potential targets: when any of the fingers encountered a potential target, there was higher probability that subsequent exploration was performed by the index or the middle finger. At the same time, the middle and the index fingers dramatically slowed down. Being in contact with the potential target, the index and the middle finger moved in around a smaller area than the other fingers, which rather seemed to move away to leave them space. Our results corroborate a previous hypothesis [1] that haptic search consists of two phases: a process of target search using all fingers, and a target analysis using the middle and the index finger, which might be specialized for fine analysis.","PeriodicalId":6702,"journal":{"name":"2019 IEEE World Haptics Conference (WHC)","volume":"36 1","pages":"277-282"},"PeriodicalIF":0.0,"publicationDate":"2019-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"75512503","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}