Teerasak Chotikawanid, W. Suwansantisuk, P. Kumhom
{"title":"A study of transformation of object location from an image plane to a 3D space based on vanishing point and reference points on the 3D axes","authors":"Teerasak Chotikawanid, W. Suwansantisuk, P. Kumhom","doi":"10.1109/ISPACS.2016.7824747","DOIUrl":null,"url":null,"abstract":"Object localization provides positional information that is useful for object tracking, asset monitoring, and fall detection, among the many applications. Although object localization based on a single-camera viewpoint is simple to implement, it has an important limitation: the method requires a calibration. This paper proposes an experimental-based study of such limitations on a transformation of a line from the image plane into the real-world 3D space. The transformation identifies a vanishing point and reference points on the axes of the real-world 3D space, and is particularly useful for fall detection. Experiments are conducted on a scaled-down box so that many locations can be tested easily. In the first experiment, this paper identifies the effect of a shift in the vanishing point on the distance error, which is the distance between the actual position and the estimated position. Furthermore, the paper varies an actual position of the object and measures experimentally the absolute difference between two key quantities: the distance error when the vanishing point is shifted by 10% and the distance error when the vanishing point stays put. The results show that the distance error increases to 20mm when the vanishing point is shifted by 10%. Finally, in order to glean localization information that is available from the proposed transformation, the pattern of errors is considered at various polar coordinates, which are constructed according to the camera's locations. The results show that for the distance errors of less than 20mm, 30mm, and 65mm, the proposed transformation can cover approximately 16%, 45%, and 90%, respectively, of the camera's viewing area. 
The research results quantify a role of the vanishing point on localization accuracy and guides an installation of multiple cameras to aid localization.","PeriodicalId":131543,"journal":{"name":"2016 International Symposium on Intelligent Signal Processing and Communication Systems (ISPACS)","volume":"15 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2016 International Symposium on Intelligent Signal Processing and Communication Systems (ISPACS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISPACS.2016.7824747","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2
Abstract
Object localization provides positional information that is useful for object tracking, asset monitoring, and fall detection, among many other applications. Although object localization based on a single-camera viewpoint is simple to implement, it has an important limitation: the method requires calibration. This paper presents an experimental study of this limitation for a transformation of a line from the image plane into real-world 3D space. The transformation identifies a vanishing point and reference points on the axes of the real-world 3D space, and is particularly useful for fall detection. Experiments are conducted on a scaled-down box so that many locations can be tested easily. The first experiment identifies the effect of a shift in the vanishing point on the distance error, defined as the distance between the actual position and the estimated position. The paper then varies the actual position of the object and experimentally measures the absolute difference between two key quantities: the distance error when the vanishing point is shifted by 10% and the distance error when the vanishing point is not shifted. The results show that the distance error increases to 20 mm when the vanishing point is shifted by 10%. Finally, to glean the localization information available from the proposed transformation, the pattern of errors is examined at various polar coordinates constructed according to the camera's location. The results show that for distance errors of less than 20 mm, 30 mm, and 65 mm, the proposed transformation covers approximately 16%, 45%, and 90%, respectively, of the camera's viewing area. These results quantify the role of the vanishing point in localization accuracy and guide the installation of multiple cameras to aid localization.
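The two evaluation quantities used in the abstract can be made concrete. The distance error is the Euclidean distance between the actual and estimated 3D positions, and the coverage figures (16%, 45%, 90%) are the fraction of tested locations whose error falls below a threshold. A minimal sketch of these two metrics, assuming positions are given as 3D coordinates in millimetres (the function names and inputs are illustrative, not from the paper):

```python
import numpy as np

def distance_error(actual, estimated):
    """Euclidean distance (mm) between the actual and the estimated 3D position."""
    return float(np.linalg.norm(np.asarray(actual, float) - np.asarray(estimated, float)))

def coverage(errors_mm, threshold_mm):
    """Fraction of tested locations whose distance error is below the threshold."""
    return float(np.mean(np.asarray(errors_mm, float) < threshold_mm))

# Hypothetical usage: one estimated point and a set of per-location errors.
err = distance_error([0.0, 0.0, 0.0], [3.0, 4.0, 0.0])   # 5.0 mm
frac = coverage([12.0, 28.0, 48.0, 70.0], 30.0)           # 0.5
```

Reporting coverage at several thresholds, as the paper does, amounts to evaluating `coverage` over a grid of test positions for each threshold of interest.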