Pub Date : 2004-03-27DOI: 10.1109/HAPTIC.2004.1287190
M. Salada, J. Colgate, Peter M. Vishton, E. Frankel
This paper describes the results of two experiments that investigate relative motion between a surface and the fingertip (slip) as part of a larger program of research on "fingertip haptics." The primary intent of both experiments is to evaluate the importance of relative motion with respect to perceiving surface velocity. The perception of surface velocity is crucial to dexterous control and object recognition. The first experiment determines the just noticeable difference (JND) in the speed and direction of slip using the "method of adjustment" difference threshold technique over a range of 80 to 240 mm/sec on two different surface textures. Forty-two subjects participated in the first experiment. We find slip speed perception to be highly dependent on surface texture, with Weber fractions between 0.04 and 0.25. Slip direction difference thresholds range between 3.6 and 11.7 degrees, also highly dependent on surface texture. The second experiment establishes the relative importance of proprioceptive feedback versus slip feedback in perceiving surface velocity. For both surface textures, we use an alternative forced-choice technique with 40 subjects. We find that slip feedback at the fingertip only subtly influences the perception of speed over proprioceptive feedback from the hand and arm for the two base speeds studied (250 and 400 mm/sec).
Title: "Two experiments on the perception of slip at the fingertip"
Published in: Proceedings of the 12th International Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems (HAPTICS '04).
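The Weber fractions reported above follow directly from method-of-adjustment data: the JND is estimated from the spread of subjects' matched speeds around the reference, and the Weber fraction is that JND divided by the reference speed. A minimal sketch, with illustrative names and data that are not from the paper:

```python
# Hypothetical sketch of the Weber-fraction computation; function name and
# sample data are assumptions for illustration, not the authors' analysis code.

def weber_fraction(reference_speed, matched_speeds):
    """Estimate the JND as the mean absolute adjustment error, then
    normalize by the reference speed to get the Weber fraction."""
    jnd = sum(abs(m - reference_speed) for m in matched_speeds) / len(matched_speeds)
    return jnd / reference_speed

# Example: subjects adjusting a comparison slip speed toward a 160 mm/s reference
matches = [150.0, 172.0, 158.0, 166.0]
wf = weber_fraction(160.0, matches)   # falls in the 0.04-0.25 range reported
```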
Pub Date : 2004-03-27DOI: 10.1109/HAPTIC.2004.1287192
J. Weisenberger, G. Poling
Previous studies of multisensory texture perception have addressed the relative contributions of different modalities, examining visual/haptic and auditory/haptic interactions. In the present study, the ability of observers to use information from three sensory modalities (visual, auditory, and haptic) was examined in a virtual texture discrimination task. Results indicated better performance for two- and three-modality conditions for some stimuli but not for others, suggesting that the interactions of haptic, auditory, and visual inputs are complex and dependent on the specifics of the stimulus condition. Viewed in this manner, the results are consistent with the modality appropriateness hypothesis. These findings are discussed in view of current formulations of multisensory interaction.
Title: "Multisensory roughness perception of virtual surfaces: effects of correlated cues"
Published in: Proceedings of the 12th International Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems (HAPTICS '04).
Pub Date : 2004-03-27DOI: 10.1109/HAPTIC.2004.1287183
Weihang Zhu, Yuan-Shin Lee
This paper presents an analytical methodology for virtual sculpting of complex surfaces with a developed 5-DOF (degree of freedom) force-torque haptic interface. In the proposed methodology, 5-axis tool motion and analytical tool swept volume are formulated for updating the virtual stock material, which is represented with dexel volume model. Based on the tool motion analysis, a dexel-based collision detection method and a force-torque feedback algorithm are proposed for virtual sculpting. Different from the traditional ways of calculating haptic forces based on depth of penetration or impulses, the proposed method determines the haptic force by finding the material removal rate of dexels. A lab-built 5-DOF haptic interface system is developed for the proposed haptic sculpting system. From the haptic sculpting system, both corresponding tool motion of the creative process and the sculpted model can be output. The tool motion can be recorded and output as NC (numerically-controlled) commands. The output STL model of the haptic sculpting system can be processed for machining planning.
Title: "Product prototyping and manufacturing planning with 5-DOF haptic sculpting and dexel volume updating"
Published in: Proceedings of the 12th International Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems (HAPTICS '04).
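The core idea above, a force derived from the material removal rate of dexels rather than from penetration depth, can be sketched in a few lines. This is a generic illustration under assumed names and an assumed force constant, not the authors' implementation:

```python
# Illustrative dexel update: each dexel stores the material top height along a
# vertical ray; where the tool bottom cuts below that height, material is
# removed. The force magnitude is taken proportional to the removal rate.
# carve(), k_force, and dt are hypothetical names/values for this sketch.

def carve(dexel_heights, tool_bottom, k_force=1.0, dt=0.001):
    """Lower each dexel covered by the tool and return (new_heights, force)."""
    removed = 0.0
    new_heights = []
    for h, tb in zip(dexel_heights, tool_bottom):
        if tb < h:               # tool penetrates this dexel
            removed += h - tb    # removed volume (unit cross-section per dexel)
            new_heights.append(tb)
        else:
            new_heights.append(h)
    force = k_force * removed / dt   # haptic force from material removal rate
    return new_heights, force
```

A tool pass that removes no material thus produces zero force, matching the paper's contrast with depth-of-penetration schemes.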
Pub Date : 2004-03-27DOI: 10.1109/HAPTIC.2004.1287204
Kristin C. Potter, David E. Johnson, E. Cohen
We present a system for haptically rendering large height field datasets. Height fields map naturally to piecewise bilinear patches, and we develop algorithms for intersection, penetration depth, and closest point tracking using such patches. In contrast to many common haptic rendering schemes for polygonal models, this approach requires no preprocessing or additional storage, making it particularly suitable for the large-scale datasets found in geographic and reverse engineering applications.
Title: "Height field haptics"
Published in: Proceedings of the 12th International Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems (HAPTICS '04).
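The bilinear-patch mapping works because each height-field cell's four corner samples define a surface z(u, v) by bilinear interpolation, against which penetration depth is a direct evaluation. A minimal sketch under assumed function names (not the authors' code):

```python
# One height-field cell as a bilinear patch z(u, v), (u, v) in [0, 1]^2,
# plus the penetration depth of a probe point below that surface.
# Function names are hypothetical, for illustration only.

def bilinear_height(z00, z10, z01, z11, u, v):
    """Bilinearly interpolate the four corner heights at (u, v)."""
    return (z00 * (1 - u) * (1 - v) + z10 * u * (1 - v)
            + z01 * (1 - u) * v + z11 * u * v)

def penetration_depth(z_corners, u, v, probe_z):
    """Positive when the probe point lies below the patch surface."""
    surface_z = bilinear_height(*z_corners, u, v)
    return max(0.0, surface_z - probe_z)
```

Because the patch is evaluated directly from the stored samples, no auxiliary acceleration structure is needed, which is the storage advantage the abstract claims.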
Pub Date : 2004-03-27DOI: 10.1109/HAPTIC.2004.1287225
George Pava, Karon E Maclean
In this paper we present the RealTime Platform Middleware (RTPM), an architecture for prototyping real-time multimodal I/O projects. Multimodal applications often require a distributed implementation to meet disparate temporal and platform needs. RTPM provides an extensible, device-independent, network-transparent interface to a set of user I/O devices, which eases application integration across different operating systems. RTPM consists of a framework based on the Common Object Request Broker Architecture (CORBA) and a custom virtual device abstraction that exports real devices' functionality to user processes. It offers two mechanisms (client/server and consumer/supplier) for communication between user processes. This paper describes the architecture's objectives and implementation, provides examples of its use, and analyzes its performance in some typical haptic application configurations.
Title: "Real time platform middleware for transparent prototyping of haptic applications"
Published in: Proceedings of the 12th International Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems (HAPTICS '04).
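The two communication mechanisms named above differ in fan-out: a client/server call returns a reply to one requester, while a consumer/supplier channel pushes each event to every subscribed consumer. A generic illustration of the distinction (plain Python, not RTPM or CORBA code; all names are assumptions):

```python
# Consumer/supplier style: suppliers push events into a channel and every
# subscribed consumer receives them (publish-subscribe fan-out).

class EventChannel:
    def __init__(self):
        self.consumers = []

    def subscribe(self, callback):
        self.consumers.append(callback)

    def push(self, event):            # supplier side
        for cb in self.consumers:     # each consumer sees every event
            cb(event)

# Client/server style: a direct request that returns a reply to one caller.
def position_server(request):
    # Hypothetical device query returning a fixed pose for illustration.
    return {"device": request["device"], "position": (0.0, 0.0, 0.0)}
```

In a haptic setting, the channel suits streaming device state at high rates, while the call/reply form suits one-off configuration queries.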
Pub Date : 2004-03-27DOI: 10.1109/HAPTIC.2004.1287222
P. G. Griffiths, R. Gillespie
In this paper, a paradigm for shared control is described in which a machine's manual control interface is motorized to allow a human and an automatic controller to exert control simultaneously. The manual interface becomes a haptic display, relaying information to the human about the intentions of the automatic controller while retaining its role as a manual control interface. The human may express his or her control intentions in a way that either overrides the automation or conforms to it. The automatic controller, by design, aims to create images in the mind of the human of fixtures in the shared workspace that can be incorporated into efficient task-completion strategies. The fixtures are animated under the guidance of an algorithm designed to automate part of the human/machine task. Results are presented from two experiments in which 11 subjects completed a path-following task using a motorized steering wheel on a fixed-base driving simulator. These results indicate that the haptic assist through the steering wheel improves lane keeping by at least 30%, reduces visual demand by 29% (p<0.0001), and improves reaction time by 18 ms (p=0.0009).
Title: "Shared control between human and machine: haptic display of automation during manual control of vehicle heading"
Published in: Proceedings of the 12th International Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems (HAPTICS '04).
Pub Date : 2004-03-27DOI: 10.1109/HAPTIC.2004.1287196
Y. Wei, J. Patton
Previous studies on reaching movements have shown that people can adapt to distortions that are either visuomotor (e.g., prism glasses) or mechanical (e.g., force fields) through repetitive training. Other work has shown that these two types of adaptation may share similar neural resources. One effective test of this sharing hypothesis would be to show that one could teach one using the other. This study investigated whether training with a specialized force field could benefit the learning of a visual distortion. Two groups of subjects volunteered to participate in this study. One group of subjects trained directly on a visual rotation. The other group of subjects trained in a "mixed field" condition. The mixed field was primarily a force field that was specially designed so that, after adapting to its characteristics, the subject would make the appropriate movement in the visual rotation condition. The mixed field condition also contained intermittent test movements that evaluated performance in the visual rotation condition. Results showed that errors reduced more rapidly in the mixed field condition. We also found that subjects were able to generalize what they learned to movement directions that were not part of the training, but there was no detectable difference between the two groups. Finally, we found no difference in the rate these training effects washed out after subjects returned to normal conditions. This study shows that training with robotic forces can facilitate the learning of visual rotations. The learning may be enhanced in the mixed condition by the addition of cutaneous and proprioceptive force sensors. Moreover, this study can be applied to telerobotics and the rehabilitation of brain injured individuals, where there is often a distortion in hand-eye coordination.
Title: "Force field training to facilitate learning visual distortions: a \"sensory crossover\" experiment"
Published in: Proceedings of the 12th International Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems (HAPTICS '04).
Pub Date : 2004-03-27DOI: 10.1109/HAPTIC.2004.1287185
J. Glassmire, M. O'Malley, W. Bluethmann, R. Ambrose
Robonaut is a humanoid robot designed by the Robotic Systems Technology Branch at NASA's Johnson Space Center in a collaborative effort with DARPA. This paper describes the implementation of haptic feedback into Robonaut. We conducted a cooperative manipulation task, inserting a flexible beam into an instrumented receptacle. This task was performed while both a human at the worksite and the teleoperated robot grasped the flexible beam simultaneously. Peak forces in the receptacle were consistently lower when the human operator was provided with kinesthetic force feedback in addition to other modalities of feedback such as gestures and voice commands. These findings are encouraging as the Dexterous Robotics Lab continues to implement force feedback into its teleoperator hardware architecture.
Title: "Cooperative manipulation between humans and teleoperated agents"
Published in: Proceedings of the 12th International Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems (HAPTICS '04).
Pub Date : 2004-03-27DOI: 10.1109/HAPTIC.2004.1287182
Yasutoshi Makino, N. Asamura, H. Shinoda
In this paper we propose a new method for displaying touch sensation by controlling suction pressure. We discovered a tactile illusion that pulling skin through a hole with suction pressure causes a feeling as if a stick is pushing the skin. This illusion is considered to be caused by the insensitivity of our mechanoreceptors to signs of stress (negative or positive) that are sensitive to the strain energy. Our tactile display is based on the key concept of this illusion and that of "multi-primitive stimulation." We show that a simple structure of a sparse stimulator array produces various tactile sensations from a sharp edge to a smooth plane surface.
Title: "Multi primitive tactile display based on suction pressure control"
Published in: Proceedings of the 12th International Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems (HAPTICS '04).
Pub Date : 2004-03-27DOI: 10.1109/HAPTIC.2004.1287178
M. Ueberle, Nico Mock, M. Buss
This paper presents and discusses the design of a novel hyper-redundant haptic interface with 10 degrees of freedom (DOF). The use of additional joints allows a significantly larger workspace while reducing the overall device size. Moreover, an increase in a variety of dexterity measures and a singularity-robust redundancy resolution can be achieved. Numerical simulations of standard methods for inverse kinematics resolution, namely pseudo-inverse control and the projection of a secondary criterion onto the nullspace of the Jacobian, are compared and evaluated.
Title: "VISHARD10, a novel hyper-redundant haptic interface"
Published in: Proceedings of the 12th International Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems (HAPTICS '04).
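The two redundancy-resolution schemes the paper compares are textbook formulas: plain pseudo-inverse control, qdot = J+ xdot, and the nullspace-projection variant, qdot = J+ xdot + (I - J+ J) z, where the secondary criterion z reshapes joint motion without disturbing the end-effector velocity. A minimal NumPy sketch for an assumed Jacobian (illustrative, not the paper's simulation code):

```python
import numpy as np

# Standard redundancy resolution: pseudo-inverse control with an optional
# secondary criterion z projected onto the Jacobian's nullspace.
# The function name and example Jacobian are assumptions for illustration.

def redundancy_resolution(J, xdot, z=None):
    """Return joint velocities qdot achieving task velocity xdot."""
    J_pinv = np.linalg.pinv(J)
    qdot = J_pinv @ xdot
    if z is not None:
        n = J.shape[1]
        # (I - J+ J) projects z onto the nullspace: no end-effector motion.
        qdot = qdot + (np.eye(n) - J_pinv @ J) @ z
    return qdot
```

For a redundant arm (more joints than task dimensions), the nullspace term is what permits singularity avoidance or dexterity optimization as a side objective.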