Fast computation of contact points for robotic simulations based on CAD models without tessellation
Pub Date: 2016-10-09 | DOI: 10.1109/IROS.2016.7759162
S. Crozet, J. Léon, Xavier Merlhiot
Computing multiple contact points between geometric models evolving in a virtual 3D environment is central to many robotic simulation applications. While this task can be performed efficiently and robustly between complex polyhedra, using the exact analytic geometric models produced by CAD modelers still suffers from efficiency limitations. Yet models composed of smooth surfaces are required to ensure smooth contact constraints, thus avoiding numerical artifacts which may dramatically affect the behavior of the system in the case of functional contacts. This paper builds on the observation that industrial CAD models are mostly composed of simple surfaces: it performs an off-line identification of similar features and builds a bounding volume hierarchy to locate potential contacts. These contacts are then computed either by dedicated analytic methods or by an iterative root-finder, depending on the actual geometric representations of the features. In the context of dynamic simulation of robotic tasks, our method exhibits interactive computation times while naturally providing better accuracy than existing polyhedron-specific algorithms.
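A minimal sketch of the iterative branch of such a contact query, assuming two illustrative parametric surfaces (a sphere and a cylinder) and a generic quasi-Newton minimizer in place of the dedicated analytic methods and root-finder described in the paper; the broad phase (the bounding volume hierarchy) is assumed to have already supplied the candidate pair and a starting guess:
```python
import numpy as np
from scipy.optimize import minimize

# Illustrative parameterizations (not taken from the paper).
def sphere(u, v, r=1.0, c=np.zeros(3)):
    return c + r * np.array([np.cos(u) * np.cos(v),
                             np.sin(u) * np.cos(v),
                             np.sin(v)])

def cylinder(u, v, r=0.5, c=np.array([3.0, 0.0, 0.0])):
    return c + np.array([r * np.cos(u), r * np.sin(u), v])

def squared_distance(x):
    # x packs the two surface parameter pairs: (u1, v1, u2, v2).
    return np.sum((sphere(x[0], x[1]) - cylinder(x[2], x[3])) ** 2)

# The broad phase would supply the candidate feature pair and a starting
# guess; here the guess is simply hard-coded for the example.
res = minimize(squared_distance, x0=[0.1, 0.1, 3.0, 0.0], method="BFGS")
p = sphere(res.x[0], res.x[1])
q = cylinder(res.x[2], res.x[3])
print("closest points:", p, q, "gap:", np.linalg.norm(p - q))
```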
{"title":"Fast computation of contact points for robotic simulations based on CAD models without tessellation","authors":"S. Crozet, J. Léon, Xavier Merlhiot","doi":"10.1109/IROS.2016.7759162","DOIUrl":"https://doi.org/10.1109/IROS.2016.7759162","url":null,"abstract":"Computing multiple contact points between geometric models evolved in a virtual 3D environment is central to many robotic simulation applications. While this task can be performed efficiently and robustly between complex polyhedra, using the exact analytic geometric models issued by CAD modelers still suffers from efficiency limitations. Yet models composed of smooth surfaces are required to ensure smooth contact constraints, thus avoiding possible numerical artifacts which may dramatically affect the behavior of the system in the case of functional contacts. This paper builds on the observation that industrial CAD models are mostly composed of simple surfaces to perform an off-line identification of similar features and build a bounding volume hierarchy in order to locate potential contacts. Those are then computed by dedicated analytic methods, or an iterative root-finder, depending on the actual geometric representations of the features. In the context of dynamic simulation of robotic tasks, our method exhibits interactive computation times while naturally providing better result accuracy than existing polyhedron-specific algorithms.","PeriodicalId":296337,"journal":{"name":"2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123930397","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
RAFCON: A graphical tool for engineering complex, robotic tasks
Pub Date: 2016-10-09 | DOI: 10.1109/IROS.2016.7759506
Sebastian G. Brunner, Franz Steinmetz, Rico Belder, Andreas Dömel
Robotic tasks are becoming increasingly complex, and so are the robotic systems that perform them. This calls for new tools to manage this complexity and to orchestrate the systems so that they can fulfill demanding autonomous tasks. For this purpose, we developed RAFCON, a new graphical tool for the creation and execution of robotic tasks. These tasks are described as hierarchical state machines supporting concurrency, and a formal notation of this concept is given. The tool provides many debugging mechanisms and a GUI with a graphical editor, allowing for intuitive visual programming and fast prototyping. The application of RAFCON to an autonomous mobile robot in the SpaceBotCamp competition has already proved successful.
{"title":"RAFCON: A graphical tool for engineering complex, robotic tasks","authors":"Sebastian G. Brunner, Franz Steinmetz, Rico Belder, Andreas Dömel","doi":"10.1109/IROS.2016.7759506","DOIUrl":"https://doi.org/10.1109/IROS.2016.7759506","url":null,"abstract":"Robotic tasks are becoming increasingly complex, and with this also the robotic systems. This requires new tools to manage this complexity and to orchestrate the systems to fulfill demanding autonomous tasks. For this purpose, we developed a new graphical tool targeting at the creation and execution of robotic tasks, called RAFCON. These tasks are described in hierarchical state machines supporting concurrency. A formal notation of this concept is given. The tool provides many debugging mechanisms and a GUI with a graphical editor, allowing for intuitive visual programming and fast prototyping. The application of RAFCON for an autonomous mobile robot in the SpaceBotCamp competition has already proved to be successful.","PeriodicalId":296337,"journal":{"name":"2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)","volume":"19 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"120972545","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
High accuracy visual servoing for aerial manipulation using a 7 degrees of freedom industrial manipulator
Pub Date: 2016-10-09 | DOI: 10.1109/IROS.2016.7759263
M. Laiacker, F. Huber, K. Kondak
This paper is devoted to the performance optimization of an aerial manipulation system composed of a Flettner helicopter and a 7-DoF manipulator. Through experiments we demonstrate that the time delays in signal propagation between the perception and actuation modules play an important role in the overall performance of an aerial manipulation system using visual servoing. We present an approach for estimating the perception-action time delay and actively compensating for it based on the predicted motion of the manipulator end-effector.
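As a rough illustration of the idea (not the authors' method), a constant perception-action delay can be estimated by cross-correlating a commanded signal with its measured counterpart and then compensated by extrapolating the delayed measurement forward over that delay; the signals and the first-order extrapolation below are synthetic stand-ins:
```python
import numpy as np

def estimate_delay(commanded, measured, dt):
    """Estimate a constant delay (s) by cross-correlating two signal traces."""
    c = commanded - commanded.mean()
    m = measured - measured.mean()
    corr = np.correlate(m, c, mode="full")      # lag of measured w.r.t. commanded
    lag = np.argmax(corr) - (len(c) - 1)
    return max(lag, 0) * dt

def compensate(delayed_pose, delayed_velocity, delay):
    """First-order extrapolation of a delayed measurement to 'now'."""
    return delayed_pose + delayed_velocity * delay

# Synthetic example: the measured signal is the command shifted by 80 ms.
dt = 0.01
t = np.arange(0.0, 5.0, dt)
commanded = np.sin(2 * np.pi * 0.5 * t)
true_delay = 0.08
measured = np.sin(2 * np.pi * 0.5 * (t - true_delay))

print("estimated delay [s]:", estimate_delay(commanded, measured, dt))
```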
{"title":"High accuracy visual servoing for aerial manipulation using a 7 degrees of freedom industrial manipulator","authors":"M. Laiacker, F. Huber, K. Kondak","doi":"10.1109/IROS.2016.7759263","DOIUrl":"https://doi.org/10.1109/IROS.2016.7759263","url":null,"abstract":"This paper is devoted to the performance optimization of an aerial manipulation system composed of a Flettner-helicopter and 7 DoF manipulator. With experiments we demonstrate that the time delays in signal propagation between perception and actuation modules play an important role for the overall performance of an aerial manipulator system using visual servoing. We present an approach for estimation of the perception-action time delay and its active compensation based on the predicted motion of the manipulator end-effector.","PeriodicalId":296337,"journal":{"name":"2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)","volume":"45 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125839207","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Particle filter framework for 6D seam tracking under large external forces using 2D laser sensors
Pub Date: 2016-10-09 | DOI: 10.1109/IROS.2016.7759549
Fredrik Bagge Carlson, M. Karlsson, A. Robertsson, Rolf Johansson
We provide a framework for 6 DOF pose estimation in seam-tracking applications using particle filtering. The particle filter algorithm developed incorporates measurements from both a 2 DOF laser seam tracker and the robot forward kinematics under an assumed external force. Special attention is paid to modeling of disturbances in the respective measurements, and methods are developed to assist the selection of sensor configurations for optimal estimation performance. The developed estimation algorithm and simulation environment are provided as an open-source, extendable package, written with an intended balance between readability and performance.
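A generic bootstrap particle filter skeleton, shown here for a scalar state for brevity; the paper's filter tracks a full 6 DOF pose and fuses 2 DOF laser seam-tracker measurements with the robot forward kinematics, which this sketch does not attempt to reproduce:
```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, weights, control, measurement,
                         process_std=0.05, meas_std=0.1):
    """One predict/update/resample cycle of a bootstrap particle filter."""
    # Predict: propagate each particle through the motion model plus noise.
    particles = particles + control + rng.normal(0.0, process_std, particles.shape)
    # Update: weight particles by the measurement likelihood.
    weights = weights * np.exp(-0.5 * ((measurement - particles) / meas_std) ** 2)
    weights += 1e-300                      # avoid an all-zero weight vector
    weights /= weights.sum()
    # Resample (systematic) when the effective sample size drops too low.
    if 1.0 / np.sum(weights ** 2) < 0.5 * len(particles):
        positions = (np.arange(len(particles)) + rng.random()) / len(particles)
        idx = np.searchsorted(np.cumsum(weights), positions)
        particles = particles[idx]
        weights = np.full(len(particles), 1.0 / len(particles))
    return particles, weights

# Toy run: the true offset drifts while we observe it with noise.
particles = rng.normal(0.0, 1.0, 500)
weights = np.full(500, 1.0 / 500)
true_state = 0.0
for _ in range(50):
    true_state += 0.02
    z = true_state + rng.normal(0.0, 0.1)
    particles, weights = particle_filter_step(particles, weights, 0.02, z)
print("estimate:", np.average(particles, weights=weights), "truth:", true_state)
```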
{"title":"Particle filter framework for 6D seam tracking under large external forces using 2D laser sensors","authors":"Fredrik Bagge Carlson, M. Karlsson, A. Robertsson, Rolf Johansson","doi":"10.1109/IROS.2016.7759549","DOIUrl":"https://doi.org/10.1109/IROS.2016.7759549","url":null,"abstract":"We provide a framework for 6 DOF pose estimation in seam-tracking applications using particle filtering. The particle filter algorithm developed incorporates measurements from both a 2 DOF laser seam tracker and the robot forward kinematics under an assumed external force. Special attention is paid to modeling of disturbances in the respective measurements, and methods are developed to assist the selection of sensor configurations for optimal estimation performance. The developed estimation algorithm and simulation environment are provided as an open-source, extendable package, written with an intended balance between readability and performance.","PeriodicalId":296337,"journal":{"name":"2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)","volume":"33 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121945440","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Touch-based admittance control of a robotic arm using neural learning of an artificial skin
Pub Date: 2016-10-09 | DOI: 10.1109/IROS.2016.7759519
Ganna Pugach, A. Melnyk, O. Tolochko, Alexandre Pitti, P. Gaussier
Touch perception is an important sense to model in humanoid robots so that they can interact physically and socially with humans. We present a neural controller that adapts the compliance of a robot arm in four directions, using as input the tactile information from an artificial skin and as output the estimated torque that serves as the admittance control-loop reference. This adaptation is done in a self-organized fashion with a neural system that first learns the topology of the tactile map when it is touched and then associates a torque vector that moves the arm in the corresponding direction. The artificial skin is based on a large-area, ungridded piezoresistive tactile device whose electrical properties change in the presence of contact. Our results show the self-calibration of a robotic arm with 2 degrees of freedom, controlled in the four directions and in derived combinations of them, through soft touches over the entire tactile surface, even when the torque is not detectable (force applied near the joint). The neural system associates each tactile receptive field with one direction and the correct force. We show that tactile-motor learning yields better interaction than admittance control of the robotic arm alone. Our method can be used in the future for adaptive humanoid interaction with a human partner.
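The association between tactile input and torque output can be caricatured by a simple online delta-rule regressor mapping a taxel activation vector to a torque reference; the self-organizing map learning and the admittance loop of the paper are not reproduced here, and the "ground truth" association below is purely synthetic:
```python
import numpy as np

rng = np.random.default_rng(1)

# Toy associative map: tactile activation vector -> 2-joint torque reference.
n_taxels, n_joints, lr = 64, 2, 0.05
W = np.zeros((n_joints, n_taxels))

# Hypothetical association used only to generate training touch/torque pairs.
true_W = rng.normal(0.0, 1.0, (n_joints, n_taxels)) / n_taxels

for _ in range(2000):
    touch = rng.random(n_taxels)                      # pressure pattern on the skin
    target = true_W @ touch                           # torque reference to learn
    prediction = W @ touch
    W += lr * np.outer(target - prediction, touch)    # delta-rule update

test = rng.random(n_taxels)
print("learned torque:", W @ test, "target torque:", true_W @ test)
```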
{"title":"Touch-based admittance control of a robotic arm using neural learning of an artificial skin","authors":"Ganna Pugach, A. Melnyk, O. Tolochko, Alexandre Pitti, P. Gaussier","doi":"10.1109/IROS.2016.7759519","DOIUrl":"https://doi.org/10.1109/IROS.2016.7759519","url":null,"abstract":"Touch perception is an important sense to model in humanoid robots to interact physically and socially with humans. We present a neural controller that can adapt the compliance of the robot arm in four directions using as input the tactile information from an artificial skin and as output the estimated torque for admittance control-loop reference. This adaption is done in a self-organized fashion with a neural system that learns first the topology of the tactile map when we touch it and associates a torque vector to move the arm in the corresponding direction. The artificial skin is based on a large area piezoresistive tactile device (ungridded) that changes its electrical properties in the presence of the contact. Our results show the self-calibration of a robotic arm (2 degrees of freedom) controlled in the four directions and derived combination vectors, by the soft touch on all the tactile surface, even when the torque is not detectable (force applied near the joint). The neural system associates each tactile receptive field with one direction and the correct force. We show that the tactile-motor learning gives better interactive experiments than the admittance control of the robotic arm only. Our method can be used in the future for humanoid adaptive interaction with a human partner.","PeriodicalId":296337,"journal":{"name":"2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)","volume":"117 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126961953","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Low complex sensor-based shared control for power wheelchair navigation
Pub Date: 2016-10-09 | DOI: 10.1109/IROS.2016.7759799
Louise Devigne, Vishnu K. Narayanan, François Pasteau, Marie Babel
Motor or visual impairments may prevent a user from steering a wheelchair effectively in indoor environments. In such cases, joystick jerks arising from uncontrolled motions may lead to collisions with obstacles. Here we propose a perceptive shared control system that progressively corrects the trajectory as the user manually drives the wheelchair, by means of a sensor-based shared control law capable of smoothly avoiding obstacles. This control law is based on a low-complexity optimization framework validated through simulations and extensive clinical trials. Since the model relies on distance information, and for cost reasons, we use ultrasonic sensors to measure the distances around the wheelchair. The solution therefore provides an efficient assistive tool that does not alter the quality of experience perceived by the user, while ensuring the user's safety in hazardous situations.
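A toy stand-in for such a shared control correction (the paper formulates it as a low-complexity optimization problem, which is not reproduced here): scale the user's forward command by the clearance reported by the front-facing ultrasonic sensors, with purely illustrative thresholds:
```python
import numpy as np

def corrected_command(v_user, omega_user, ranges, bearings,
                      d_stop=0.35, d_slow=1.0):
    """Scale the user's forward command by the clearance in the travel direction.

    ranges/bearings: ultrasonic distances (m) and their angles (rad) around the
    wheelchair, 0 rad pointing forward. Thresholds are illustrative values.
    """
    ahead = np.abs(bearings) < np.pi / 3          # sensors roughly in front
    clearance = ranges[ahead].min() if np.any(ahead) else np.inf
    # 0 below d_stop, 1 above d_slow, linear in between.
    scale = np.clip((clearance - d_stop) / (d_slow - d_stop), 0.0, 1.0)
    v = v_user * scale if v_user > 0 else v_user  # only damp forward motion here
    return v, omega_user

ranges = np.array([2.1, 0.6, 1.8, 3.0])           # one reading per sensor
bearings = np.deg2rad([-60.0, 0.0, 60.0, 180.0])
print(corrected_command(0.8, 0.1, ranges, bearings))
```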
{"title":"Low complex sensor-based shared control for power wheelchair navigation","authors":"Louise Devigne, Vishnu K. Narayanan, François Pasteau, Marie Babel","doi":"10.1109/IROS.2016.7759799","DOIUrl":"https://doi.org/10.1109/IROS.2016.7759799","url":null,"abstract":"Motor or visual impairments may prevent a user from steering a wheelchair effectively in indoor environments. In such cases, joystick jerks arising from uncontrolled motions may lead to collisions with obstacles. We here propose a perceptive shared control system that progressively corrects the trajectory as a user manually drives the wheelchair, by means of a sensor-based shared control law capable of smoothly avoiding obstacles. This control law is based on a low complex optimization framework validated through simulations and extensive clinical trials. The provided model uses distance information. Therefore, for low-cost considerations, we use ultrasonic sensors to measure the distances around the wheelchair. The solution therefore provides an efficient assistive tool that does not alter the quality of experience perceived by the user, while ensuring his security in hazardous situations.","PeriodicalId":296337,"journal":{"name":"2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)","volume":"400 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126674601","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A Poisson-spectral model for modelling temporal patterns in human data observed by a robot
Pub Date: 2016-10-09 | DOI: 10.1109/IROS.2016.7759591
Ferdian Jovan, J. Wyatt, Nick Hawes, T. Krajník
The efficiency of autonomous robots depends on how well they understand their operating environment. While most traditional environment models focus on spatial representation, long-term mobile robot operation in human-populated environments requires that the robots have a basic model of human behaviour.
{"title":"A Poisson-spectral model for modelling temporal patterns in human data observed by a robot","authors":"Ferdian Jovan, J. Wyatt, Nick Hawes, T. Krajník","doi":"10.1109/IROS.2016.7759591","DOIUrl":"https://doi.org/10.1109/IROS.2016.7759591","url":null,"abstract":"The efficiency of autonomous robots depends on how well they understand their operating environment. While most of the traditional environment models focus on the spatial representation, long-term mobile robot operation in human populated environments requires that the robots have a basic model of human behaviour.","PeriodicalId":296337,"journal":{"name":"2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133834659","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Persistent localization and life-long mapping in changing environments using the Frequency Map Enhancement
Pub Date: 2016-10-09 | DOI: 10.1109/IROS.2016.7759671
T. Krajník, J. P. Fentanes, Marc Hanheide, T. Duckett
We present a lifelong mapping and localisation system for long-term autonomous operation of mobile robots in changing environments. The core of the system is a spatio-temporal occupancy grid that explicitly represents the persistence and periodicity of the individual cells and can predict the probability of their occupancy in the future. During navigation, our robot builds temporally local maps and integrates them into the global spatio-temporal grid. Through re-observation of the same locations, the spatio-temporal grid learns the long-term environment dynamics and gains the ability to predict future environment states. This predictive ability allows the generation of time-specific 2D maps used by the robot's localisation and planning modules. By analysing data from a long-term deployment of the robot in a human-populated environment, we show that the proposed representation improves localisation accuracy and the efficiency of path planning. We also show how to integrate the method into the ROS navigation stack for use by other roboticists.
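The core idea, modelling each cell's occupancy as a static probability plus dominant periodic components identified from its observation history, can be sketched for a single synthetic cell as follows (function names and the toy signal are illustrative; this is not the released implementation):
```python
import numpy as np

def fremen_fit(occupancy, dt, n_components=1):
    """Fit a static mean plus the strongest periodic components of a cell signal."""
    mean = occupancy.mean()
    spectrum = np.fft.rfft(occupancy - mean)
    freqs = np.fft.rfftfreq(len(occupancy), d=dt)
    order = np.argsort(np.abs(spectrum))[::-1][:n_components]
    return mean, [(freqs[k], spectrum[k] / len(occupancy)) for k in order]

def fremen_predict(model, t):
    """Predict occupancy probability at (future) times t, clipped to [0, 1]."""
    mean, components = model
    p = np.full_like(t, mean, dtype=float)
    for f, c in components:
        p += 2.0 * np.abs(c) * np.cos(2 * np.pi * f * t + np.angle(c))
    return np.clip(p, 0.0, 1.0)

# Synthetic cell: occupied during "working hours" of a 24 h cycle, observed hourly.
dt = 3600.0
t_obs = np.arange(0, 14 * 24) * dt                       # two weeks of observations
occupancy = ((t_obs / 3600.0) % 24 >= 8) & ((t_obs / 3600.0) % 24 < 18)
model = fremen_fit(occupancy.astype(float), dt)

t_future = np.array([15 * 24 * 3600.0 + 12 * 3600.0])    # next day, at noon
print("predicted occupancy at noon:", fremen_predict(model, t_future))
```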
{"title":"Persistent localization and life-long mapping in changing environments using the Frequency Map Enhancement","authors":"T. Krajník, J. P. Fentanes, Marc Hanheide, T. Duckett","doi":"10.1109/IROS.2016.7759671","DOIUrl":"https://doi.org/10.1109/IROS.2016.7759671","url":null,"abstract":"We present a lifelong mapping and localisation system for long-term autonomous operation of mobile robots in changing environments. The core of the system is a spatio-temporal occupancy grid that explicitly represents the persistence and periodicity of the individual cells and can predict the probability of their occupancy in the future. During navigation, our robot builds temporally local maps and integrates then into the global spatio-temporal grid. Through re-observation of the same locations, the spatio-temporal grid learns the long-term environment dynamics and gains the ability to predict the future environment states. This predictive ability allows to generate time-specific 2d maps used by the robot's localisation and planning modules. By analysing data from a long-term deployment of the robot in a human-populated environment, we show that the proposed representation improves localisation accuracy and the efficiency of path planning. We also show how to integrate the method into the ROS navigation stack for use by other roboticists.","PeriodicalId":296337,"journal":{"name":"2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)","volume":"33 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131759693","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Inverse real-time Finite Element simulation for robotic control of flexible needle insertion in deformable tissues
Pub Date: 2016-10-09 | DOI: 10.1109/IROS.2016.7759422
Yinoussa Adagolodjo, L. Goffin, M. Mathelin, H. Courtecuisse
This paper introduces a new method for automatic robotic needle steering in deformable tissues. The main contribution relies on the use of an inverse Finite Element (FE) simulation to control an articulated robot interacting with deformable structures. In this work we consider a flexible needle, mounted on the end-effector of a 6-axis Mitsubishi RV1A robot arm, and its insertion into a silicone phantom. Given a trajectory defined on the rest configuration of the silicone phantom, our method provides in real time the displacements of the articulated robot that keep the needle within the predefined path, taking into account any ongoing deformation of both the needle and the trajectory itself. A forward simulation combines i) a kinematic model of the robot, ii) FE models of the needle and the phantom gel, and iii) an interaction model allowing the simulation of friction and puncture forces. A Newton-type method is then used to compute the displacement of the robot that minimizes the distance between the needle's tip and the desired trajectory. We validate our approach with a simulation in which a virtual robot successfully performs the insertion while both the needle and the trajectory undergo significant deformations.
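The Newton-type control step can be sketched as a damped least-squares update computed from a numerically differentiated forward simulation; in the sketch below the coupled FE simulation is replaced by an arbitrary smooth stand-in function so that the example is self-contained:
```python
import numpy as np

def tip_position(q):
    """Stand-in for the forward simulation: joint command -> needle tip (3,).

    In the paper this would be the coupled robot/needle/phantom FE model; here
    it is an arbitrary smooth function so the sketch can run on its own.
    """
    return np.array([np.cos(q[0]) + 0.5 * np.cos(q[0] + q[1]),
                     np.sin(q[0]) + 0.5 * np.sin(q[0] + q[1]),
                     0.1 * q[2]])

def control_step(q, target, eps=1e-5, damping=1e-2):
    """One damped least-squares (Newton-type) update toward the trajectory point."""
    error = target - tip_position(q)
    # Numerical Jacobian of the tip position w.r.t. the commanded joints.
    J = np.column_stack([(tip_position(q + eps * e) - tip_position(q)) / eps
                         for e in np.eye(len(q))])
    dq = np.linalg.solve(J.T @ J + damping * np.eye(len(q)), J.T @ error)
    return q + dq, np.linalg.norm(error)

q = np.array([0.2, 0.3, 0.0])
target = np.array([1.0, 0.8, 0.05])
for _ in range(20):
    q, err = control_step(q, target)
print("final tip error:", err)
```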
{"title":"Inverse real-time Finite Element simulation for robotic control of flexible needle insertion in deformable tissues","authors":"Yinoussa Adagolodjo, L. Goffin, M. Mathelin, H. Courtecuisse","doi":"10.1109/IROS.2016.7759422","DOIUrl":"https://doi.org/10.1109/IROS.2016.7759422","url":null,"abstract":"This paper introduces a new method for automatic robotic needle steering in deformable tissues. The main contribution relies on the use of an inverse Finite Element (FE) simulation to control an articulated robot interacting with deformable structures. In this work we consider a flexible needle, embedded in the end effector of a 6 arm Mitsubishi RV1A robot, and its insertion into a silicone phantom. Given a trajectory on the rest configuration of the silicone phantom, our method provides in real-time the displacements of the articulated robot which guarantee the permanence of the needle within the predefined path, taking into account any undergoing deformation on both the needle and the trajectory itself. A forward simulation combines i) a kinematic model of the robot, ii) FE models of the needle and phantom gel iii) an interaction model allowing the simulation of friction and puncture force. A Newton-type method is then used to provide the displacement of the robot to minimize the distance between the needle's tip and the desired trajectory. We validate our approach with a simulation in which a virtual robot can successfully perform the insertion while both the needle and the trajectory undergo significant deformations.","PeriodicalId":296337,"journal":{"name":"2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)","volume":"79 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125094554","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A vision-guided dual arm sewing system for stent graft manufacturing
Pub Date: 2016-10-09 | DOI: 10.1109/IROS.2016.7759136
Bidan Huang, Alessandro Vandini, Yang Hu, Su-Lin Lee, Guang-Zhong Yang
This paper presents an intelligent sewing system for personalized stent graft manufacturing, a challenging sewing task that is currently performed manually. Inspired by medical suturing robots, we adopt a single-sided sewing technique using a curved needle to sew stents onto fabric. A motorized surgical needle driver was attached to a 7-DoF robot arm to manipulate the needle, with a second robot controlling the position of the mandrel. A learning-from-demonstration approach was used to program the robot to sew stents onto fabric. The demonstrated sewing skill was segmented into several phases, each of which was encoded with a Gaussian Mixture Model. Generalized sewing movements were then generated from these models and used for task execution. During execution, a stereo vision system guided the robots and adjusted the learnt movements according to the needle pose. Two experiments with this system are presented, and the results show that it can robustly perform the sewing task and adapt to various needle poses. The accuracy of the sewing system was within 2 mm.
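One common way to encode a demonstrated phase with a Gaussian Mixture Model and reproduce a generalized movement from it is Gaussian Mixture Regression over joint (time, position) samples; the sketch below illustrates this for a single synthetic coordinate and is not the authors' implementation:
```python
import numpy as np
from scipy.stats import norm
from sklearn.mixture import GaussianMixture

# Synthetic "demonstrations" of one phase: time t and one end-effector coordinate x.
rng = np.random.default_rng(0)
t = np.tile(np.linspace(0.0, 1.0, 100), 5)
x = np.sin(2.0 * np.pi * t) * 0.02 + 0.05 * t + rng.normal(0.0, 0.002, t.shape)
gmm = GaussianMixture(n_components=5, covariance_type="full", random_state=0)
gmm.fit(np.column_stack([t, x]))

def gmr(gmm, t_query):
    """Gaussian Mixture Regression: E[x | t] under the fitted joint GMM."""
    out = np.zeros_like(t_query)
    for i, tq in enumerate(t_query):
        # Responsibility of each component for this time instant.
        h = np.array([w * norm.pdf(tq, m[0], np.sqrt(c[0, 0]))
                      for w, m, c in zip(gmm.weights_, gmm.means_, gmm.covariances_)])
        h /= h.sum()
        # Conditional mean of x given t for each component.
        cond = [m[1] + c[1, 0] / c[0, 0] * (tq - m[0])
                for m, c in zip(gmm.means_, gmm.covariances_)]
        out[i] = np.dot(h, cond)
    return out

t_query = np.linspace(0.0, 1.0, 5)
print("generalized movement samples:", gmr(gmm, t_query))
```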
{"title":"A vision-guided dual arm sewing system for stent graft manufacturing","authors":"Bidan Huang, Alessandro Vandini, Yang Hu, Su-Lin Lee, Guang-Zhong Yang","doi":"10.1109/IROS.2016.7759136","DOIUrl":"https://doi.org/10.1109/IROS.2016.7759136","url":null,"abstract":"This paper presents an intelligent sewing system for personalized stent graft manufacturing, a challenging sewing task that is currently performed manually. Inspired by medical suturing robots, we have adopted a single-sided sewing technique using a curved needle to perform the task of sewing stents onto fabric. A motorized surgical needle driver was attached to a 7 d.o.f robot arm to manipulate the needle with a second robot controlling the position of the mandrel. A learning-from-demonstration approach was used to program the robot to sew stents onto fabric. The demonstrated sewing skill was segmented to several phases, each of which was encoded with a Gaussian Mixture Model. Generalized sewing movements were then generated from these models and were used for task execution. During execution, a stereo vision system was adopted to guide the robots and adjust the learnt movements according to the needle pose. Two experiments are presented here with this system and the results show that our system can robustly perform the sewing task as well as adapt to various needle poses. The accuracy of the sewing system was within 2mm.","PeriodicalId":296337,"journal":{"name":"2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126320301","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}