{"title":"基于人机交互的机器人拆卸信息提取","authors":"Joao Paulo Jacomini Prioli, Jeremy L. Rickli","doi":"10.1080/0951192x.2023.2257667","DOIUrl":null,"url":null,"abstract":"ABSTRACTDisassembly of end-of-use products is critical to the economic feasibility of circular feedstock, reusing, recycling and remanufacturing loops. High-volume disassembly operations are constrained by disassembly complexity and product variability. The capability to buffer against timing, quantity and quality uncertainties of end-of-use products impacts the efficiency and profitability of demanufacturing systems. To achieve a competitive operation in the manufacturing life-cycle, disassembly systems need automated lines, however, the unpredictability of core supply challenges automation adaptability. Disassembly robot trajectories that are programmed manually or controlled by vision systems can be time intensive and subject to variability in lighting conditions and image recognition models. Alternatively, this paper presents a novel human-robot disassembly framework to systematically extract and generate robot trajectories derived from human-collaborative robot (cobot) disassembly. The collaborative training station proposed classifies trajectory segments and then adjusts trajectories to station-specific robots in a high-volume disassembly line. Virtual and physical collaborative disassembly case studies are presented and discussed. Results demonstrate the effectiveness of the disassembly data extraction method but indicate a disparity between the expected and ideal disassembly trajectories due to variability from human handling, which is further discussed in this paper.KEYWORDS: Disassemblycollaborative robotsremanufacturingsystem framework Disclosure statementNo potential conflict of interest was reported by the authors.Additional informationFundingThis work was supported by the Critical Materials Institute in collaboration with Oak Ridge National Laboratory [FA-3.3.11].","PeriodicalId":13907,"journal":{"name":"International Journal of Computer Integrated Manufacturing","volume":"139 1","pages":"0"},"PeriodicalIF":3.7000,"publicationDate":"2023-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Human-robot interaction for extraction of robotic disassembly information\",\"authors\":\"Joao Paulo Jacomini Prioli, Jeremy L. Rickli\",\"doi\":\"10.1080/0951192x.2023.2257667\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"ABSTRACTDisassembly of end-of-use products is critical to the economic feasibility of circular feedstock, reusing, recycling and remanufacturing loops. High-volume disassembly operations are constrained by disassembly complexity and product variability. The capability to buffer against timing, quantity and quality uncertainties of end-of-use products impacts the efficiency and profitability of demanufacturing systems. To achieve a competitive operation in the manufacturing life-cycle, disassembly systems need automated lines, however, the unpredictability of core supply challenges automation adaptability. Disassembly robot trajectories that are programmed manually or controlled by vision systems can be time intensive and subject to variability in lighting conditions and image recognition models. Alternatively, this paper presents a novel human-robot disassembly framework to systematically extract and generate robot trajectories derived from human-collaborative robot (cobot) disassembly. 
The collaborative training station proposed classifies trajectory segments and then adjusts trajectories to station-specific robots in a high-volume disassembly line. Virtual and physical collaborative disassembly case studies are presented and discussed. Results demonstrate the effectiveness of the disassembly data extraction method but indicate a disparity between the expected and ideal disassembly trajectories due to variability from human handling, which is further discussed in this paper.KEYWORDS: Disassemblycollaborative robotsremanufacturingsystem framework Disclosure statementNo potential conflict of interest was reported by the authors.Additional informationFundingThis work was supported by the Critical Materials Institute in collaboration with Oak Ridge National Laboratory [FA-3.3.11].\",\"PeriodicalId\":13907,\"journal\":{\"name\":\"International Journal of Computer Integrated Manufacturing\",\"volume\":\"139 1\",\"pages\":\"0\"},\"PeriodicalIF\":3.7000,\"publicationDate\":\"2023-09-21\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Journal of Computer Integrated Manufacturing\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1080/0951192x.2023.2257667\",\"RegionNum\":3,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Computer Integrated Manufacturing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1080/0951192x.2023.2257667","RegionNum":3,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS","Score":null,"Total":0}
Human-robot interaction for extraction of robotic disassembly information
ABSTRACT
Disassembly of end-of-use products is critical to the economic feasibility of circular feedstock, reusing, recycling and remanufacturing loops. High-volume disassembly operations are constrained by disassembly complexity and product variability. The capability to buffer against timing, quantity and quality uncertainties of end-of-use products impacts the efficiency and profitability of demanufacturing systems. To remain competitive across the manufacturing life-cycle, disassembly systems need automated lines; however, the unpredictability of core supply challenges automation adaptability. Disassembly robot trajectories that are programmed manually or controlled by vision systems can be time-intensive and subject to variability in lighting conditions and image recognition models. Alternatively, this paper presents a novel human-robot disassembly framework to systematically extract and generate robot trajectories derived from human-collaborative robot (cobot) disassembly. The proposed collaborative training station classifies trajectory segments and then adjusts trajectories to station-specific robots in a high-volume disassembly line. Virtual and physical collaborative disassembly case studies are presented and discussed. Results demonstrate the effectiveness of the disassembly data extraction method but indicate a disparity between the expected and ideal disassembly trajectories due to variability from human handling, which is further discussed in this paper.
KEYWORDS: Disassembly; collaborative robots; remanufacturing; system framework
Disclosure statement: No potential conflict of interest was reported by the authors.
Funding: This work was supported by the Critical Materials Institute in collaboration with Oak Ridge National Laboratory [FA-3.3.11].
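The abstract's central step, extracting and segmenting robot trajectories from human-guided cobot demonstrations, is not detailed in this record, so the sketch below is only a minimal illustration and not the authors' method. It assumes a hypothetical `Waypoint` record of timestamped TCP positions and a hypothetical `segment_trajectory` helper that splits a demonstration into candidate segments wherever the operator pauses, a simple stand-in for the trajectory-segment classification described above.

```python
# Hypothetical sketch only: split a hand-guided cobot demonstration into
# trajectory segments at long pauses. Names and thresholds are illustrative
# assumptions, not the method from the paper.
from dataclasses import dataclass
from typing import List
import math


@dataclass
class Waypoint:
    """One recorded tool-centre-point (TCP) sample from a hand-guided demonstration."""
    t: float  # timestamp in seconds
    x: float  # TCP position in metres
    y: float
    z: float


def speed(a: Waypoint, b: Waypoint) -> float:
    """Average Cartesian speed between two consecutive waypoints."""
    dt = b.t - a.t
    if dt <= 0:
        return 0.0
    return math.dist((a.x, a.y, a.z), (b.x, b.y, b.z)) / dt


def segment_trajectory(wps: List[Waypoint],
                       dwell_speed: float = 0.005,
                       min_dwell_s: float = 0.5) -> List[List[Waypoint]]:
    """Close the current segment whenever motion resumes after the TCP has
    stayed below `dwell_speed` (m/s) for at least `min_dwell_s` seconds.
    The pause is a crude proxy for the operator finishing one disassembly
    action and starting the next."""
    if len(wps) < 2:
        return [wps] if wps else []
    segments: List[List[Waypoint]] = []
    current: List[Waypoint] = [wps[0]]
    dwell_start = None  # time at which the current pause began, if any
    for prev, cur in zip(wps, wps[1:]):
        if speed(prev, cur) < dwell_speed:
            if dwell_start is None:
                dwell_start = prev.t
        else:
            # Motion resumed: if the preceding pause was long enough, split here.
            if (dwell_start is not None
                    and prev.t - dwell_start >= min_dwell_s
                    and len(current) > 1):
                segments.append(current)
                current = []
            dwell_start = None
        current.append(cur)
    segments.append(current)
    return segments


if __name__ == "__main__":
    # Synthetic demonstration: an approach motion, a one-second pause, then a second motion.
    demo = (
        [Waypoint(t=0.1 * i, x=0.01 * i, y=0.0, z=0.3) for i in range(20)]
        + [Waypoint(t=2.0 + 0.1 * i, x=0.2, y=0.0, z=0.3) for i in range(10)]
        + [Waypoint(t=3.0 + 0.1 * i, x=0.2, y=0.01 * i, z=0.3) for i in range(20)]
    )
    for k, seg in enumerate(segment_trajectory(demo)):
        print(f"segment {k}: {len(seg)} waypoints, t = {seg[0].t:.1f}-{seg[-1].t:.1f} s")
```

In a fuller pipeline along the lines the abstract describes, each extracted segment would then be classified (e.g. approach, unfasten, retract are plausible labels) and adjusted to the kinematics of the station-specific robot in the high-volume disassembly line.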
Journal description:
International Journal of Computer Integrated Manufacturing (IJCIM) reports new research in the theory and applications of computer integrated manufacturing. Its scope spans mechanical and manufacturing engineering, software and computer engineering, and automation and control engineering, with a particular focus on today's data-driven manufacturing. Terms such as Industry 4.0, intelligent manufacturing, digital manufacturing and cyber-physical manufacturing systems are now used to identify the area of knowledge that IJCIM has supported and shaped in its history of more than 30 years.
IJCIM continues to grow and has become a key forum for academics and industrial researchers to exchange information and ideas. In response to this interest, IJCIM is now published monthly, enabling the editors to target topical special issues on subjects as diverse as digital twins, transdisciplinary engineering, cloud manufacturing, deep learning for manufacturing, service-oriented architectures, dematerialized manufacturing systems, wireless manufacturing and digital enterprise technologies, to name a few.