{"title":"基于联合目标检测的分布式视觉传感器网络标定","authors":"Jennifer Simonjan, B. Rinner","doi":"10.1109/DCOSS.2017.17","DOIUrl":null,"url":null,"abstract":"In this paper we present a distributed, autonomous network calibration algorithm, which enables visual sensor networks to gather knowledge about the network topology. A calibrated sensor network provides the basis for more robust applications, since nodes are aware of their spatial neighbors. In our approach, sensor nodes estimate relative positions and orientations of nodes with overlapping fields of view based on jointly detected objects and geometric relations. Distance and angle measurements are the only information required to be exchanged between nodes. The process works iteratively, first calibrating camera neighbors in a pairwise manner and then spreading the calibration information through the network. Further, each node operates within its local coordinate system avoiding the need for any global coordinates. While existing methods mostly exploit computer vision algorithms to relate nodes to each other based on their images, we solely rely on geometric constraints.","PeriodicalId":399222,"journal":{"name":"2017 13th International Conference on Distributed Computing in Sensor Systems (DCOSS)","volume":"75 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":"{\"title\":\"Distributed Visual Sensor Network Calibration Based on Joint Object Detections\",\"authors\":\"Jennifer Simonjan, B. Rinner\",\"doi\":\"10.1109/DCOSS.2017.17\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In this paper we present a distributed, autonomous network calibration algorithm, which enables visual sensor networks to gather knowledge about the network topology. A calibrated sensor network provides the basis for more robust applications, since nodes are aware of their spatial neighbors. 
In our approach, sensor nodes estimate relative positions and orientations of nodes with overlapping fields of view based on jointly detected objects and geometric relations. Distance and angle measurements are the only information required to be exchanged between nodes. The process works iteratively, first calibrating camera neighbors in a pairwise manner and then spreading the calibration information through the network. Further, each node operates within its local coordinate system avoiding the need for any global coordinates. While existing methods mostly exploit computer vision algorithms to relate nodes to each other based on their images, we solely rely on geometric constraints.\",\"PeriodicalId\":399222,\"journal\":{\"name\":\"2017 13th International Conference on Distributed Computing in Sensor Systems (DCOSS)\",\"volume\":\"75 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-06-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"4\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2017 13th International Conference on Distributed Computing in Sensor Systems (DCOSS)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/DCOSS.2017.17\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 13th International Conference on Distributed Computing in Sensor Systems (DCOSS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/DCOSS.2017.17","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Distributed Visual Sensor Network Calibration Based on Joint Object Detections
In this paper, we present a distributed, autonomous network calibration algorithm that enables visual sensor networks to gather knowledge about their topology. A calibrated sensor network provides the basis for more robust applications, since nodes are aware of their spatial neighbors. In our approach, sensor nodes estimate the relative positions and orientations of nodes with overlapping fields of view based on jointly detected objects and geometric relations. Distance and angle measurements are the only information that nodes need to exchange. The process works iteratively: camera neighbors are first calibrated in a pairwise manner, and the calibration information is then spread through the network. Further, each node operates within its local coordinate system, avoiding the need for any global coordinates. While existing methods mostly exploit computer vision algorithms to relate nodes to each other based on their images, we rely solely on geometric constraints.
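The abstract only sketches the underlying geometry. As a minimal illustration (not the authors' implementation; function names and the two-object, 2-D closed-form solution are assumptions), one camera could recover a neighbor's relative pose from distance and angle measurements to two jointly detected objects like this:

```python
import math

def polar_to_xy(d, a):
    """Convert a (distance, angle) measurement to local Cartesian coordinates."""
    return (d * math.cos(a), d * math.sin(a))

def relative_pose(obs_a, obs_b):
    """Estimate camera B's pose (rotation theta, translation t) in camera A's
    local frame from two jointly detected objects.

    obs_a, obs_b: lists of two (distance, angle) measurements of the same
    two objects, taken in each camera's own local coordinate system.
    """
    (ax1, ay1), (ax2, ay2) = (polar_to_xy(*o) for o in obs_a)
    (bx1, by1), (bx2, by2) = (polar_to_xy(*o) for o in obs_b)
    # Rotation: difference of the object-to-object vector headings in each frame.
    theta = math.atan2(ay2 - ay1, ax2 - ax1) - math.atan2(by2 - by1, bx2 - bx1)
    c, s = math.cos(theta), math.sin(theta)
    # Translation: where B's origin lands in A's frame, so that rotating B's
    # measurement of object 1 and translating reproduces A's measurement.
    tx = ax1 - (c * bx1 - s * by1)
    ty = ay1 - (s * bx1 + c * by1)
    return theta, (tx, ty)
```

With two objects the 2-D pose is determined up to measurement noise; in practice more joint detections would be averaged to reduce error, which matches the paper's iterative, pairwise-then-network-wide procedure.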