A Collaborative Visual Sensing System for Precise Quality Inspection at Manufacturing Lines

Authors: Jiale Chen, Duc Van Le, Rui Tan, Daren Ho
DOI: 10.1145/3643136
Journal: ACM Transactions on Cyber-Physical Systems (Journal Article)
Publication date: 2024-01-26
Visual sensing has been widely adopted for quality inspection in production processes. This paper presents the design and implementation of a smart collaborative camera system, called BubCam, for automated quality inspection of manufactured ink bags in Hewlett-Packard (HP) Inc.'s factories. Specifically, BubCam estimates the volume of air bubbles in an ink bag, which may affect printing quality. The design of BubCam faces challenges from dynamic ambient light reflection, motion blur, and the difficulty of data labeling. As a starting point, we design a single-camera system that leverages various deep learning (DL)-based image segmentation and depth fusion techniques. New data labeling and training approaches are proposed that exploit prior knowledge of the production system to train the segmentation model with a small dataset. Then, we design a multi-camera system that additionally deploys multiple wireless cameras to achieve better accuracy through multi-view sensing. To save the power of the wireless cameras, we formulate a configuration adaptation problem and develop single-agent and multi-agent deep reinforcement learning (DRL)-based solutions that adjust each wireless camera's operation mode and frame rate in response to changes in the presence of air bubbles and light reflection. The multi-agent DRL approach reduces retraining costs during production line reconfiguration by retraining only the DRL agents for newly added cameras and for existing cameras whose positions have changed. Extensive evaluation on a lab testbed and a real factory trial shows that BubCam outperforms six baseline solutions, including the current manual inspection and existing bubble detection and camera configuration adaptation approaches. In particular, BubCam achieves a 1.3x accuracy improvement and a 300x latency reduction compared with the manual inspection approach.
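The configuration-adaptation idea in the abstract (each wireless camera chooses an operation mode and frame rate in response to bubble presence and light reflection, trading inspection accuracy against power) can be illustrated with a toy rule-based policy standing in for the DRL agent. This is a minimal sketch: the mode names, frame-rate options, power model, and `choose_config` heuristic are all illustrative assumptions, not the paper's actual formulation or learned policy.

```python
from dataclasses import dataclass

# Assumed per-camera configuration space: two operation modes and a
# small set of frame rates. These values are hypothetical.
MODES = ("sleep", "active")
FRAME_RATES = (1, 5, 15, 30)  # frames per second

@dataclass
class CameraConfig:
    mode: str
    fps: int

    def power_cost(self) -> float:
        # Assumed power model: sleeping is nearly free; active cost
        # grows with frame rate.
        return 0.1 if self.mode == "sleep" else 1.0 + 0.05 * self.fps

def choose_config(bubbles_present: bool, reflection_present: bool) -> CameraConfig:
    """Greedy stand-in for the DRL policy: spend power only when the
    scene is hard (bubbles and/or reflection), otherwise sleep."""
    if not bubbles_present:
        return CameraConfig("sleep", FRAME_RATES[0])
    if reflection_present:
        # Reflection makes individual frames unreliable; sample faster
        # so depth fusion has more views to work with.
        return CameraConfig("active", FRAME_RATES[-1])
    return CameraConfig("active", FRAME_RATES[1])

if __name__ == "__main__":
    for bubbles, reflection in [(False, False), (True, False), (True, True)]:
        cfg = choose_config(bubbles, reflection)
        print(bubbles, reflection, cfg.mode, cfg.fps, round(cfg.power_cost(), 2))
```

In the paper's setting this decision is made by a trained DRL agent per camera rather than fixed rules, and the multi-agent variant lets a reconfigured line retrain only the agents whose cameras were added or moved.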