Pub Date : 2018-08-01DOI: 10.1109/COASE.2018.8560699
Ishara Paranawithana, U-Xuan Tan, Liangjing Yang, Zhong Chen, K. Youcef-Toumi
This work proposes a fusion mechanism that overcomes the traditional limitations of vision-guided micromanipulation in plant cells. Despite recent advances in vision-guided micromanipulation, only a handful of studies have addressed the intrinsic issues of micromanipulation in plant cells. Unlike single-cell manipulation, the structural complexity of plant cells makes visual tracking extremely challenging. There is therefore a need to complement the visual tracking approach with trajectory data from the manipulator. The two sources of data are fused by projecting the manipulator's trajectory data into the image domain and combining it with template tracking data through score-based weighted averaging. A similarity score reflecting the confidence of each localization result serves as the basis of the weighted average. As the projected trajectory data of the manipulator is unaffected by visual disturbances such as regional occlusion, fusing the estimates from the two sources improves tracking performance. Experimental results suggest that the fusion-based tracking mechanism maintains a mean error of 2.15 pixels, whereas template tracking and projected trajectory data have mean errors of 2.49 and 2.61 pixels, respectively. Path B of the square trajectory demonstrated a significant improvement, with a mean error of 1.11 pixels while 50% of the tracking ROI was occluded by the plant specimen. Under these conditions, template tracking and projected trajectory data show similar performance, with mean errors of 2.59 and 2.58 pixels, respectively. By addressing the limitations and unmet needs in plant cell bio-manipulation, we hope to bridge the gap in the development of automatic vision-guided micromanipulation in plant cells.
{"title":"Scene-Adaptive Fusion of Visual and Motion Tracking for Vision-Guided Micromanipulation in Plant Cells","authors":"Ishara Paranawithana, U-Xuan Tan, Liangjing Yang, Zhong Chen, K. Youcef-Toumi","doi":"10.1109/COASE.2018.8560699","DOIUrl":"https://doi.org/10.1109/COASE.2018.8560699","url":null,"abstract":"This work proposes a fusion mechanism that overcomes the traditional limitations in vision-guided micromanipulation in plant cells. Despite the recent advancement in vision-guided micromanipulation, only a handful of research addressed the intrinsic issues related to micromanipulation in plant cells. Unlike single cell manipulation, the structural complexity of plant cells makes visual tracking extremely challenging. There is therefore a need to complement the visual tracking approach with trajectory data from the manipulator. Fusion of the two sources of data is done by combining the projected trajectory data to the image domain and template tracking data using a score-based weighted averaging approach. Similarity score reflecting the confidence of a particular localization result is used as the basis of the weighted average. As the projected trajectory data of the manipulator is not at all affected by the visual disturbances such as regional occlusion, fusing estimations from two sources leads to improved tracking performance. Experimental results suggest that fusion-based tracking mechanism maintains a mean error of 2.15 pixels whereas template tracking and projected trajectory data has a mean error of 2.49 and 2.61 pixels, respectively. Path B of the square trajectory demonstrated a significant improvement with a mean error of 1.11 pixels with 50% of the tracking ROI occluded by plant specimen. Under these conditions, both template tracking and projected trajectory data show similar performances with a mean error of 2.59 and 2.58 pixels, respectively. 
By addressing the limitations and unmet needs in the application of plant cell bio-manipulation, we hope to bridge the gap in the development of automatic vision-guided micromanipulation in plant cells.","PeriodicalId":6518,"journal":{"name":"2018 IEEE 14th International Conference on Automation Science and Engineering (CASE)","volume":"51 1","pages":"1434-1440"},"PeriodicalIF":0.0,"publicationDate":"2018-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"79208392","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
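The score-based weighted averaging described in the abstract can be illustrated with a minimal sketch. The exact weighting formula is not given here, so normalising the two similarity scores into convex weights is an assumption:

```python
def fuse_estimates(template_xy, trajectory_xy, score_template, score_trajectory):
    """Fuse two 2-D position estimates (pixels) by a score-weighted average.

    The weights are the normalised similarity scores, so the estimate
    with the higher confidence dominates the fused result.
    """
    total = score_template + score_trajectory
    if total <= 0:
        raise ValueError("at least one similarity score must be positive")
    w_t = score_template / total
    w_p = score_trajectory / total
    return tuple(w_t * t + w_p * p for t, p in zip(template_xy, trajectory_xy))
```

Under heavy occlusion the template score would drop toward zero, letting the projected trajectory estimate take over, which is consistent with the improvement reported for Path B.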
Pub Date : 2018-08-01DOI: 10.1109/COASE.2018.8560398
R. Berenstein, A. Wallach, Pelagie Elimbi Moudio, Peter Cuellar, Ken Goldberg
Mobile manipulator robots can benefit from utilizing a range of tools such as screwdrivers, paintbrushes, hammers, and drills, and sensors such as thermal cameras. Proprietary tool-changing systems exist for stationary robots, but we are not aware of one for mobile manipulators. We designed and implemented a modular tool changer with three components: robot attachment, tool attachment, and tool housing, designed under the following constraints: low-cost, backlash-free, compact, lightweight, passive, and modular. The tool changer is compatible with many robots and was evaluated with the Fetch robot over 100 repetitions of connecting and releasing the tool, of which 92 were successful. All 8 failures were due to inaccurate positioning of the robot arm. Changing a tool, from pickup to return, took an average of 16 seconds. This work is part of an ongoing research project on precision irrigation. The design is open-source and freely available at: https://goo.gl/zetwct.
{"title":"An Open-Access Passive Modular Tool Changing System for Mobile Manipulation Robots","authors":"R. Berenstein, A. Wallach, Pelagie Elimbi Moudio, Peter Cuellar, Ken Goldberg","doi":"10.1109/COASE.2018.8560398","DOIUrl":"https://doi.org/10.1109/COASE.2018.8560398","url":null,"abstract":"Mobile manipulator robots can benefit from utilizing a range of tools such as: screwdrivers, paintbrushes, hammers, drills, and sensors such thermal cameras. Proprietary tool changing systems exist for stationary robots but we are not aware of one for mobile manipulators. We design and implemented a modular tool changer with three components: robot attachment, tool attachment, and tool housing, designed with the following constrains: low-cost, backlash-free, compact, lightweight, passive, and modular. The tool changer is compatible with many robots and was evaluated with the Fetch robot for 100 repetitions of connecting and releasing the tool, of which 92 were successful. All 8 failures were due to inaccurate position of the robot arm. Changing a tool, from pickup to return, took on average of 16 seconds. This work is part of an ongoing research project on precision irrigation. The design is open-source and freely available at: https://goo.gl/zetwct.","PeriodicalId":6518,"journal":{"name":"2018 IEEE 14th International Conference on Automation Science and Engineering (CASE)","volume":"353 1","pages":"592-598"},"PeriodicalIF":0.0,"publicationDate":"2018-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"78093290","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2018-08-01DOI: 10.1109/COASE.2018.8560485
L. Grech, G. Valentino, M. D. Castro, C. V. Almagro
We present a working prototype of a low-cost collision avoidance system for the retractable Radio-Protection (RP) arm, to be mounted on the Train Inspection Monorail (TIM) located in the European Organization for Nuclear Research (CERN)'s Large Hadron Collider (LHC) tunnel. Such a system is needed to permit safe movement of the TIM while personnel are present in the tunnel and the RP arm is extended to allow on-board sensors to take radiation measurements. The prototype uses a series of eight TeraRanger One (TR1) Infrared (IR) Time-of-Flight (ToF) sensors to take distance measurements of the tunnel floor and overlying obstacles. A real-time system was then designed and deployed on the microcontroller mounted on the TeraRanger Hub (TRH). Sensor characterisation tests were also performed on the TR1 sensors to determine their performance and to calibrate them, improving measurement accuracy for different materials.
{"title":"Collision Avoidance System for the RP Survey and Visual Inspection Train in the CERN Large Hadron Collider","authors":"L. Grech, G. Valentino, M. D. Castro, C. V. Almagro","doi":"10.1109/COASE.2018.8560485","DOIUrl":"https://doi.org/10.1109/COASE.2018.8560485","url":null,"abstract":"We present a working prototype of a low-cost collision avoidance system for the retractable Radio-Protection (RP) arm, to be mounted on the Train Inspection Monorail (TIM) which is located in the European Organization for Nuclear Research (CERN)'s Large Hadron Collider (LHC) tunnel. Such a system is needed to permit the safe movement of the TIM with personnel present in the tunnel while the RP arm is extended to allow on-board sensors to take radiation measurements. The prototype used a series of eight TeraRanger One (TR1) Infrared (IR) Time-of-Flight (ToF) sensors to take distance measurements of the tunnel floor and overlying obstacles. A real-time system was then designed and deployed on the microcontroller mounted on the TeraRanger Hub (TRH). Sensor characterisation tests were also performed on the TR1 sensors to determine their performance and also allowed for calibration to be performed to improve the measurement accuracy for different materials.","PeriodicalId":6518,"journal":{"name":"2018 IEEE 14th International Conference on Automation Science and Engineering (CASE)","volume":"42 1","pages":"817-822"},"PeriodicalIF":0.0,"publicationDate":"2018-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"73886997","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
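A minimal sketch of how such ToF readings might gate movement: compare each of the eight distance measurements against the expected floor distance and flag an obstacle if any deviates beyond a safety margin. The function and its threshold values are hypothetical, not taken from the paper:

```python
def obstacle_detected(distances_mm, floor_mm=1500, margin_mm=300):
    """Return True if any ToF reading deviates from the expected floor
    distance by more than the safety margin (hypothetical thresholds).

    A reading much shorter than the floor distance suggests an overlying
    obstacle; a much longer one suggests a drop or sensor fault.
    """
    return any(abs(d - floor_mm) > margin_mm for d in distances_mm)
```

In a real deployment the expected floor distance would come from the sensor characterisation described above, per material and mounting angle.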
Pub Date : 2018-08-01DOI: 10.1109/COASE.2018.8560424
Imtiaz Ahmed, A. Dagnino, Alessandro Bongiovi, Yu Ding
A hydropower generation plant is a complex system composed of numerous physical components. To monitor the health of its components, it is necessary to detect anomalous behavior in time. Establishing a performance guideline, along with identifying the critical variables causing anomalous behavior, can help maintenance personnel detect any potential process shift in a timely manner. To establish any guideline for future control, a mechanism is first needed to differentiate anomalous observations from normal ones. In this work, we employ three different approaches to detect anomalous observations and compare their performance on a historical data set from a hydropower plant. The detected outliers were verified by domain experts. Using a decision tree and a feature selection process, we identified critical variables that are potentially linked to the presence of the outliers. We further developed a one-class classifier using the outlier-cleaned dataset, which defines the normal working condition; violations of the normal conditions can therefore identify anomalous observations in future operations.
{"title":"Outlier Detection for Hydropower Generation Plant","authors":"Imtiaz Ahmed, A. Dagnino, Alessandro Bongiovi, Yu Ding","doi":"10.1109/COASE.2018.8560424","DOIUrl":"https://doi.org/10.1109/COASE.2018.8560424","url":null,"abstract":"A hydropower generation plant is a complex system and composed of numerous physical components. To monitor the health of different components it is necessary to detect anomalous behavior in time. Establishing a performance guideline along with identification of the critical variables causing anomalous behavior can help the maintenance personnel to detect any potential shift in the process timely. To establish any guideline for future control, at first a mechanism is needed to differentiate anomalous observations from the normal ones. In our work we have employed three different approaches to detect the anomalous observations and compared their performances using a historical data set received from a hydropower plant. The outliers detected are verified by the domain experts. Making use of a decision tree and feature selection process, we have identified some critical variables which are potentially linked to the presence of the outliers. 
We further developed a one-class classifier using the outlier cleaned dataset, which defines the normal working condition, and therefore, violation of the normal conditions could identify anomalous observations in future operations.","PeriodicalId":6518,"journal":{"name":"2018 IEEE 14th International Conference on Automation Science and Engineering (CASE)","volume":"23 1","pages":"193-198"},"PeriodicalIF":0.0,"publicationDate":"2018-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"90412317","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
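The abstract does not name the three detection approaches; the z-score rule below is purely an illustrative stand-in, showing the basic idea of separating anomalous observations from normal ones before training a one-class model:

```python
from statistics import mean, stdev


def flag_outliers(values, z_thresh=3.0):
    """Flag observations whose z-score exceeds the threshold.

    Returns a list of booleans, one per observation, where True marks
    a candidate outlier for review by domain experts.
    """
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return [False] * len(values)
    return [abs((v - mu) / sigma) > z_thresh for v in values]
```

Observations flagged here would be removed (after expert verification) to produce the outlier-cleaned dataset on which the one-class classifier is trained.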
Pub Date : 2018-08-01DOI: 10.1109/COASE.2018.8560431
David Tseng, David Wang, Carolyn L. Chen, Lauren Miller, W. Song, J. Viers, S. Vougioukas, Stefano Carpin, J. A. Ojea, Ken Goldberg
Recent advances in unmanned aerial vehicles suggest that collecting aerial agricultural images can be cost-efficient, which can subsequently support automated precision irrigation. To study the potential for machine learning to learn local soil moisture conditions directly from such images, we developed a very fast, linear discrete-time simulation of plant growth based on the Richards equation. We use the simulator to generate large datasets of synthetic aerial images of a vineyard with known moisture conditions and then compare seven methods for inferring moisture conditions from images, in which the “uncorrelated plant” methods look at individual plants and the “correlated field” methods look at the entire vineyard: 1) constant prediction baseline, 2) linear Support Vector Machines (SVM), 3) Random Forests Uncorrelated Plant (RFUP), 4) Random Forests Correlated Field (RFCF), 5) two-layer Neural Networks (NN), 6) Deep Convolutional Neural Networks Uncorrelated Plant (CNNUP), and 7) Deep Convolutional Neural Networks Correlated Field (CNNCF). Experiments on held-out test images show that a globally-connected CNN performs best with normalized mean absolute error of 3.4%. Sensitivity experiments suggest that learned global CNNs are robust to injected noise in both the simulator and generated images as well as in the size of the training sets. In simulation, we compare the agricultural standard of flood irrigation to a proportional precision irrigation controller using the output of the global CNN and find that the latter can reduce water consumption by up to 52% and is also robust to errors in irrigation level, location, and timing. The first-order plant simulator and datasets are available at https://github.com/BerkeleyAutomation/RAPID.
{"title":"Towards Automating Precision Irrigation: Deep Learning to Infer Local Soil Moisture Conditions from Synthetic Aerial Agricultural Images","authors":"David Tseng, David Wang, Carolyn L. Chen, Lauren Miller, W. Song, J. Viers, S. Vougioukas, Stefano Carpin, J. A. Ojea, Ken Goldberg","doi":"10.1109/COASE.2018.8560431","DOIUrl":"https://doi.org/10.1109/COASE.2018.8560431","url":null,"abstract":"Recent advances in unmanned aerial vehicles suggest that collecting aerial agricultural images can be cost-efficient, which can subsequently support automated precision irrigation. To study the potential for machine learning to learn local soil moisture conditions directly from such images, we developed a very fast, linear discrete-time simulation of plant growth based on the Richards equation. We use the simulator to generate large datasets of synthetic aerial images of a vineyard with known moisture conditions and then compare seven methods for inferring moisture conditions from images, in which the “uncorrelated plant” methods look at individual plants and the “correlated field” methods look at the entire vineyard: 1) constant prediction baseline, 2) linear Support Vector Machines (SVM), 3) Random Forests Uncorrelated Plant (RFUP), 4) Random Forests Correlated Field (RFCF), 5) two-layer Neural Networks (NN), 6) Deep Convolutional Neural Networks Uncorrelated Plant (CNNUP), and 7) Deep Convolutional Neural Networks Correlated Field (CNNCF). Experiments on held-out test images show that a globally-connected CNN performs best with normalized mean absolute error of 3.4%. Sensitivity experiments suggest that learned global CNNs are robust to injected noise in both the simulator and generated images as well as in the size of the training sets. 
In simulation, we compare the agricultural standard of flood irrigation to a proportional precision irrigation controller using the output of the global CNN and find that the latter can reduce water consumption by up to 52% and is also robust to errors in irrigation level, location, and timing. The first-order plant simulator and datasets are available at https://github.com/BerkeleyAutomation/RAPID.","PeriodicalId":6518,"journal":{"name":"2018 IEEE 14th International Conference on Automation Science and Engineering (CASE)","volume":"58 1","pages":"284-291"},"PeriodicalIF":0.0,"publicationDate":"2018-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"90714005","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
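The proportional precision irrigation controller is described only at a high level; a minimal sketch of such a controller, with the gain and emitter limits as assumed values, might look like:

```python
def irrigation_rate(predicted_moisture, target_moisture, gain=1.0, max_rate=1.0):
    """Proportional controller: irrigate in proportion to the moisture
    deficit inferred from imagery, clamped to the emitter's physical
    range. Units and parameter values are assumptions.
    """
    deficit = target_moisture - predicted_moisture
    return min(max_rate, max(0.0, gain * deficit))
```

Fed per-plant moisture predictions from the CNN, a rule of this shape waters only where a deficit is predicted, which is how a proportional scheme can undercut uniform flood irrigation.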
Pub Date : 2018-08-01DOI: 10.1109/coase.2018.8560392
{"title":"Technical Program Contents List","authors":"","doi":"10.1109/coase.2018.8560392","DOIUrl":"https://doi.org/10.1109/coase.2018.8560392","url":null,"abstract":"","PeriodicalId":6518,"journal":{"name":"2018 IEEE 14th International Conference on Automation Science and Engineering (CASE)","volume":"394 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2018-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"76680362","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2018-08-01DOI: 10.1109/COASE.2018.8560418
D. Tomzik, X. Xu
Conventional control systems for machine tools and manufacturing systems are often limited in their computational power, connectivity, and interoperability. Cloud-based control systems are a solution that addresses these issues. Advantages of the cloud are the elasticity of computational power (Infrastructure as a Service) and a plethora of development tools (Platform as a Service). Existing solutions are based on a local control system with an additional connection to the cloud; communication and control of the field level run centralised through this control system. To gain more flexibility, we propose an approach in which individual components at the field level are directly connected to the cloud. They are equipped with computational resources, connect directly to a TCP/IP network, communicate with each other, and perform control tasks. This has been made possible by ever-shrinking integrated circuits at ever-lower prices. In this paper, a possible use scenario, hardware candidates, and firmware aspects are presented. For an initial examination, the findings were compared against the requirements for cloud-based control in the application area of soft-tissue interaction. The proposed architecture will be the basis for a future prototype.
{"title":"Architecture of a Cloud-Based Control System Decentralised at Field Level","authors":"D. Tomzik, X. Xu","doi":"10.1109/COASE.2018.8560418","DOIUrl":"https://doi.org/10.1109/COASE.2018.8560418","url":null,"abstract":"Conventional control systems for machine tools and manufacturing systems are often limited in their computational power, connectivity and interoperability. Cloud-based control systems are a solution that addresses these issues. Advantages of the cloud are the elasticity of computational power (Infrastructure as a Service) and a plethora of development tools (Platform as a Service). The developed solutions are based on a local control system with an additional connection to the cloud. Communication and control of the field level run centralised through this control system. To try for more flexibility, we propose an approach where individual components at the field level are directly connected to the cloud. They are equipped with computational resources, connected directly to a TCP/IP network and communicate with each other and perform control tasks. This had been made possible by ever-shrinking integrated circuits at lower prices. In this paper, a possible use scenario, hardware candidates, and firmware aspects are presented. For an initial examination, the findings were compared against requirements for cloud-based control in the application area of soft-tissue interaction. 
This proposed architecture will be the basis for a prototype in the future.","PeriodicalId":6518,"journal":{"name":"2018 IEEE 14th International Conference on Automation Science and Engineering (CASE)","volume":"28 1","pages":"353-358"},"PeriodicalIF":0.0,"publicationDate":"2018-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"85784557","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2018-08-01DOI: 10.1109/COASE.2018.8560702
Hsieh-Yu Li, Ishara Paranawithana, Liangjing Yang, U-Xuan Tan
There is an increasing number of applications of physical human-robot interaction (pHRI) in which the end-effector of the robot is compliant in response to the force exerted by the human. A force sensor is normally mounted, together with an instrument, on the end-effector to measure the human operational force. However, when the robot is in contact with the human and an environment simultaneously, the force sensor reading includes both the human and the environmental forces, resulting in ineffective contact interaction among the three bodies (robot, human, and environment). In addition, if the environment is moving, it is more challenging for the operator to track the target with the robot. Therefore, in this paper, we address the issue of pHRI coupled with a moving environment. More specifically, we use a collaborative robot with an ultrasound probe as an illustration because of its demanding conditions: the operator must contact the environment with sufficient force to obtain clearer images while tracking the moving target. The proposed control scheme uses only one force sensor to guarantee stable physical interaction among the three bodies and to provide compliant, intuitive operation for the human. Experiments with a collaborative robot are conducted to evaluate the effectiveness of the proposed controller.
{"title":"Physical Human-Robot Interaction Coupled with a Moving Environment or Target: Contact and Track","authors":"Hsieh-Yu Li, Ishara Paranawithana, Liangjing Yang, U-Xuan Tan","doi":"10.1109/COASE.2018.8560702","DOIUrl":"https://doi.org/10.1109/COASE.2018.8560702","url":null,"abstract":"There is an increasing number of applications in physical human-robot interaction (pHRI) where the end-effector of the robot is compliant in response to the force exerted by the human. The force sensor is normally mounted with an instrument on the end-effector to measure the human operational force. However, when the robot is in contact with the human and an environment simultaneously, the force sensor reading includes both the human and the environmental force resulting in ineffective contacting interaction within these three objects (robot, human and environment). In addition, if the environment is moving, it is more challenging for the operator to track the target with the robot. Therefore, in this paper, we address the issue of pHRI coupled with a moving environment. More specifically, we use a collaborative robot with an ultrasound probe as an illustration due to its sophisticated condition: the operator needs to contact the environment using a sufficient force to get clearer images and track the moving target. The proposed control scheme is employed using only one force sensor to guarantee a stable physical interaction within three objects and provide the compliant and intuitive operation for human. 
Experiments with a collaborative robot are conducted to evaluate the effectiveness of the proposed controller.","PeriodicalId":6518,"journal":{"name":"2018 IEEE 14th International Conference on Automation Science and Engineering (CASE)","volume":"17 1","pages":"43-49"},"PeriodicalIF":0.0,"publicationDate":"2018-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"86889790","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
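The abstract does not specify the control law; a generic discrete admittance relation (m·v̇ + d·v = f), often used for compliant pHRI, is sketched below with hypothetical parameters as a stand-in for the paper's scheme:

```python
def admittance_step(v_prev, f_human, mass=2.0, damping=10.0, dt=0.01):
    """One explicit-Euler step of a discrete admittance law m*dv/dt + d*v = f.

    Maps the measured human force to a commanded end-effector velocity;
    mass, damping, and time step are hypothetical values.
    """
    dv = (f_human - damping * v_prev) / mass * dt
    return v_prev + dv
```

Under a constant applied force the commanded velocity settles at f/d, so the damping term sets how "heavy" the robot feels to the operator.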
Pub Date : 2018-08-01DOI: 10.1109/COASE.2018.8560382
Jiajun Xu, Linsen Xu, Jinfu Liu, Xiaohu Li, Xuan Wu
In this paper, a multi-mode biomimetic wall-climbing robot is presented. It employs spiny wheels, adhesive treads, spiny treads, and a suction cup, and can switch between modes, self-adapting to different terrains. The robot uses its spiny wheels and spiny treads on rough surfaces and its adhesive treads on smooth surfaces, while the suction cup provides assistive adhesion at all times. The adhesion properties of the adhesive materials are analyzed, and their high reliability is verified. Finally, a prototype of the robot is manufactured, and experiments are conducted.
{"title":"A Multi-Mode Biomimetic Wall-Climbing Robot","authors":"Jiajun Xu, Linsen Xu, Jinfu Liu, Xiaohu Li, Xuan Wu","doi":"10.1109/COASE.2018.8560382","DOIUrl":"https://doi.org/10.1109/COASE.2018.8560382","url":null,"abstract":"In this paper, a multi-mode biomimetic wall-climbing robot is represented, which employs spiny wheels, adhesive treads, spiny treads and a suction cup, and it can switch different modes with self-adapting to different terrains. The robot employs spiny wheels and spiny treads while meeting with rough surfaces and employs adhesive treads while encountering smooth surfaces. And a suction cup is applied all the time for assistive adhesive function. The adhesion property of the adhesive materials is analyzed, and their high reliability is proved. Moreover, the prototype of the robot is manufactured, and some experiments are completed.","PeriodicalId":6518,"journal":{"name":"2018 IEEE 14th International Conference on Automation Science and Engineering (CASE)","volume":"39 1","pages":"514-519"},"PeriodicalIF":0.0,"publicationDate":"2018-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"87231274","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2018-08-01DOI: 10.1109/COASE.2018.8560587
Fadi Assad, E. Rushforth, Mus'ab H. Ahmad, B. Ahmad, R. Harrison
In today's manufacturing industry, higher productivity and sustainability should go hand in hand, a practice motivated by governmental regulations as well as customer awareness. At present, one inexpensive solution is motion planning for improved energy consumption. This paper introduces a general approach for testing and optimising the energy consumption of an input motion profile. The Particle Swarm Optimisation (PSO) method is used because of its mathematical simplicity and quick convergence. The commonly used s-curve motion profile is reconstructed and optimised for better energy consumption. The results show a potential energy reduction and better positioning for the system configured according to the optimised s-curve.
{"title":"An Approach of Optimising S-curve Trajectory for a Better Energy Consumption","authors":"Fadi Assad, E. Rushforth, Mus'ab H. Ahmad, B. Ahmad, R. Harrison","doi":"10.1109/COASE.2018.8560587","DOIUrl":"https://doi.org/10.1109/COASE.2018.8560587","url":null,"abstract":"In today's manufacturing industry, higher productivity and sustainability should go hand-in-hand. This practice is motivated by governmental regulations as well as customers' awareness. For the current time, one of the inexpensive solutions is motion planning for an improved energy consumption. This paper introduces a general approach that is valid for testing and optimising energy consumption of the input motion profile. The Particle Swarm Optimisation method (PSO) is used because of its mathematical simplicity and quick convergence. Being commonly used, s-curve motion profile is reconstructed and optimised for a better energy consumption. The results show potential energy reduction and better positioning for the system configured according to the optimised s-curve.","PeriodicalId":6518,"journal":{"name":"2018 IEEE 14th International Conference on Automation Science and Engineering (CASE)","volume":"33 1","pages":"98-103"},"PeriodicalIF":0.0,"publicationDate":"2018-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"91300965","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
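The paper's PSO formulation of the s-curve parameters is not detailed in the abstract; the sketch below shows a standard inertia-weight PSO minimising a toy quadratic cost, which stands in for an (assumed) energy cost of a motion profile:

```python
import random


def pso(cost, dim, n_particles=20, iters=100, bounds=(-5.0, 5.0),
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimiser (standard inertia-weight form).

    Each particle's velocity blends its momentum, attraction to its own
    best position, and attraction to the swarm's best position.
    """
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_cost = [cost(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_cost[i])
    gbest, gbest_cost = pbest[g][:], pbest_cost[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            c = cost(pos[i])
            if c < pbest_cost[i]:
                pbest[i], pbest_cost[i] = pos[i][:], c
                if c < gbest_cost:
                    gbest, gbest_cost = pos[i][:], c
    return gbest, gbest_cost
```

In the paper's setting, `cost` would evaluate the energy consumed by an s-curve profile parameterised by the particle's position; here a quadratic bowl is used only to demonstrate convergence.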