Ball 3D Trajectory Reconstruction without Preliminary Temporal and Geometrical Camera Calibration
S. Miyata, H. Saito, Kosuke Takahashi, Dan Mikami, Mariko Isogawa, H. Kimata
2017 IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), pp. 164-169, published 2017-07-21
DOI: 10.1109/CVPRW.2017.26
Citations: 10
Abstract
This paper proposes a method for reconstructing 3D ball trajectories using multiple temporally and geometrically uncalibrated cameras. To measure the trajectory of a fast-moving object with cameras, such as a ball thrown by a pitcher, the cameras must be temporally synchronized and their positions and orientations must be calibrated. In some cases, these conditions cannot be met; for example, cameras cannot be geometrically calibrated when one cannot enter the baseball field. The basic idea of the proposed method is to use the ball captured by multiple cameras as a corresponding point. The method first detects the ball. Then, it estimates the temporal offset between the cameras. After that, the ball positions are used as corresponding points for geometrically calibrating the cameras. Experiments using actual pitching videos verify the effectiveness of our method.
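To make the core idea concrete, the sketch below illustrates (in Python with OpenCV) how ball detections shared across two views can serve as point correspondences for relative geometric calibration and triangulation. This is not the paper's implementation: it assumes the temporal offset has already been compensated, that both cameras share known intrinsics `K`, and that the per-frame ball centers `pts_a` and `pts_b` are given; all function and variable names are hypothetical.

```python
import numpy as np
import cv2


def reconstruct_trajectory(pts_a, pts_b, K):
    """Triangulate a 3D ball trajectory from two temporally aligned views.

    pts_a, pts_b: N x 2 arrays of ball centers (pixels) in cameras A and B,
                  one row per synchronized frame pair.
    K:            3 x 3 shared camera intrinsic matrix (assumed known here).
    Returns an N x 3 trajectory, defined only up to a global scale.
    """
    pts_a = np.asarray(pts_a, dtype=np.float64)
    pts_b = np.asarray(pts_b, dtype=np.float64)

    # Estimate the essential matrix from the ball correspondences (RANSAC
    # rejects misdetections acting as outliers).
    E, _ = cv2.findEssentialMat(pts_a, pts_b, K,
                                method=cv2.RANSAC, threshold=1.0)

    # Recover the relative pose of camera B w.r.t. camera A (t has unit norm,
    # hence the overall scale ambiguity).
    _, R, t, _ = cv2.recoverPose(E, pts_a, pts_b, K)

    # Projection matrices: camera A at the origin, camera B at [R | t].
    P_a = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P_b = K @ np.hstack([R, t])

    # Triangulate each ball detection; OpenCV returns homogeneous 4 x N points.
    X_h = cv2.triangulatePoints(P_a, P_b, pts_a.T, pts_b.T)
    return (X_h[:3] / X_h[3]).T
```

Because only image correspondences constrain the geometry, the recovered trajectory is metric only up to scale; a known distance in the scene (e.g., mound-to-plate distance) would be needed to fix absolute units.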