Title: Reinforcement Learning Strategy-Based Adaptive Tracking Control for Underactuated Dual Ship-Mounted Cranes: Theoretical Design and Hardware Experiments
Authors: Shujie Wu; Haibo Zhang; Yuzhe Qian
Journal: IEEE Transactions on Industrial Electronics, vol. 72, no. 5, pp. 5408-5417
DOI: 10.1109/TIE.2024.3481885
Publication Date: 2024-10-29 (Journal Article)
Impact Factor: 7.2 (JCR Q1, Automation & Control Systems)
URL: https://ieeexplore.ieee.org/document/10738215/
Citations: 0
Abstract
As flexible transportation equipment, dual ship-mounted crane (DSMC) systems are widely used to transport cargo in complex marine and harbor environments. However, automatic control of such complex systems still faces significant challenges due to their underactuated characteristics, unexpected sea-wave disturbances, and uncertain system parameters. Most existing control methods rely on accurate dynamic models or linearized models, which can hardly suppress unknown disturbances and may suffer severely degraded control performance in the presence of system uncertainties. To solve these problems, a reinforcement-learning-based adaptive tracking control method is proposed in this article, which achieves satisfactory control performance without accurate system parameters. Specifically, an actor neural network and a critic neural network are constructed to execute the reinforcement learning (RL) algorithm: the actor network generates the control input, while the critic network evaluates the control performance and feeds a reinforcement signal back to the actor network. In addition, a robust integral of the sign of the error (RISE) feedback signal is introduced to improve the robustness of the system. Based on Lyapunov stability theory, it is proved that the tracking error converges to zero asymptotically under the proposed controller. Finally, hardware experimental results demonstrate the effectiveness and robustness of the proposed controller.
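The actor-critic structure described in the abstract can be sketched in miniature: an actor produces the control input, and a critic scores it via a temporal-difference (TD) error that serves as the reinforcement signal fed back to the actor. The toy sketch below is an assumption-laden illustration only — it uses a scalar first-order plant, a linear actor, a quadratic critic, and hand-picked learning rates, and it omits the paper's RISE feedback term and crane dynamics entirely.

```python
import numpy as np

# Toy actor-critic tracking sketch (NOT the paper's controller):
# the actor outputs the control input; the critic evaluates performance
# and returns a TD error as the reinforcement signal to the actor.
rng = np.random.default_rng(0)

dt, gamma, sigma = 0.1, 0.95, 0.5   # step size, discount, exploration noise
alpha_a, alpha_c = 0.02, 0.05       # actor / critic learning rates
w_a, w_c = 0.0, 0.0                 # linear actor gain, quadratic critic weight
r_ref = 1.0                         # constant reference to track

for episode in range(300):
    x = 0.0
    for k in range(40):
        e = x - r_ref                                  # tracking error
        mu = w_a * e                                   # actor mean control
        u = float(np.clip(mu + sigma * rng.standard_normal(), -5.0, 5.0))
        x_next = float(np.clip(x + dt * u, -3.0, 4.0)) # toy first-order plant
        e_next = x_next - r_ref
        reward = -e ** 2                               # negative tracking cost
        # Critic: semi-gradient TD(0) on V(e) = w_c * e^2 (clipped for stability)
        delta = float(np.clip(reward + gamma * w_c * e_next ** 2
                              - w_c * e ** 2, -2.0, 2.0))
        w_c += alpha_c * delta * e ** 2
        # Actor: policy-gradient step with the TD error as reinforcement signal
        w_a += alpha_a * delta * ((u - mu) / sigma ** 2) * e
        w_a = float(np.clip(w_a, -15.0, 5.0))
        x = x_next

# Evaluate the learned (noise-free) policy on the same tracking task
x = 0.0
for k in range(50):
    x += dt * float(np.clip(w_a * (x - r_ref), -5.0, 5.0))
print(f"learned gain w_a = {w_a:.2f}, final |error| = {abs(x - r_ref):.3f}")
```

After training, the actor gain should turn negative (i.e., drive the error toward zero), so the noise-free rollout ends near the reference. The clipping of the control, state, and TD error is a pragmatic stabilizer for this toy, not something taken from the paper.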
Journal Introduction:
Journal Name: IEEE Transactions on Industrial Electronics
Publication Frequency: Monthly
Scope: IEEE Transactions on Industrial Electronics encompasses the following areas:
Applications of electronics, controls, and communications in industrial and manufacturing systems and processes.
Power electronics and drive control techniques.
System control and signal processing.
Fault detection and diagnosis.
Power systems.
Instrumentation, measurement, and testing.
Modeling and simulation.
Motion control.
Robotics.
Sensors and actuators.
Implementation of neural networks, fuzzy logic, and artificial intelligence in industrial systems.
Factory automation.
Communication and computer networks.