{"title":"一种实时场景感知便携式增强现实标定算法","authors":"Siyu Yu, L. Qi, Y. Tie","doi":"10.1109/ISNE.2019.8896496","DOIUrl":null,"url":null,"abstract":"In order to solve the problem of accurate holographic projection of the point cloud collected by an external high-precision depth camera in HoloLens, we propose an augmented reality calibration algorithm for real-time scene perception. Firstly, we build a portable high-precision real-time sensing system, using external RealSense to collect point cloud data, and the portable host processes and returns the data to HoloLens via a local area network. Secondly, it calibrates the internal parameters of HoloLens' webcam and RealSense depth cameras, then fixed the two cameras for dual purpose calibration, so as to obtain the internal rotation and translation matrix. Finally, the calculated posture computed by the matrix transformation transforms of the virtual object from the RealSense coordinate system displayed in OSG (Open Scene Graph) to HoloLens unified. The Direct X coordinate system is then transformed into the HoloLens Webcam coordinate system, and then the HoloLens API is used to acquire the fixed coordinate system established during the acquisition. At the same time, the virtual object of the holographic projection is accurately merged with the real object, and the spatial anchor is fixed in the real scene, so that the system realizes an accurate and real-time aware augmented reality capability.","PeriodicalId":405565,"journal":{"name":"2019 8th International Symposium on Next Generation Electronics (ISNE)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A Calibration Algorithm for Real-time Scene-aware Portable Augmented Reality\",\"authors\":\"Siyu Yu, L. Qi, Y. Tie\",\"doi\":\"10.1109/ISNE.2019.8896496\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In order to solve the problem of accurate holographic projection of the point cloud collected by an external high-precision depth camera in HoloLens, we propose an augmented reality calibration algorithm for real-time scene perception. Firstly, we build a portable high-precision real-time sensing system, using external RealSense to collect point cloud data, and the portable host processes and returns the data to HoloLens via a local area network. Secondly, it calibrates the internal parameters of HoloLens' webcam and RealSense depth cameras, then fixed the two cameras for dual purpose calibration, so as to obtain the internal rotation and translation matrix. Finally, the calculated posture computed by the matrix transformation transforms of the virtual object from the RealSense coordinate system displayed in OSG (Open Scene Graph) to HoloLens unified. The Direct X coordinate system is then transformed into the HoloLens Webcam coordinate system, and then the HoloLens API is used to acquire the fixed coordinate system established during the acquisition. 
At the same time, the virtual object of the holographic projection is accurately merged with the real object, and the spatial anchor is fixed in the real scene, so that the system realizes an accurate and real-time aware augmented reality capability.\",\"PeriodicalId\":405565,\"journal\":{\"name\":\"2019 8th International Symposium on Next Generation Electronics (ISNE)\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-10-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2019 8th International Symposium on Next Generation Electronics (ISNE)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ISNE.2019.8896496\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 8th International Symposium on Next Generation Electronics (ISNE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISNE.2019.8896496","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
To accurately project, as a hologram in HoloLens, the point cloud collected by an external high-precision depth camera, we propose an augmented reality calibration algorithm for real-time scene perception. First, we build a portable high-precision real-time sensing system: an external RealSense camera collects point cloud data, and a portable host processes the data and returns it to the HoloLens over a local area network. Second, the algorithm calibrates the intrinsic parameters of the HoloLens webcam and the RealSense depth camera; the two cameras are then rigidly fixed together and calibrated as a pair to obtain the rotation and translation matrix between them. Finally, the pose computed from these matrix transformations maps the virtual object from the RealSense coordinate system, in which it is displayed with OSG (OpenSceneGraph), into the unified HoloLens DirectX coordinate system; the DirectX coordinate system is then transformed into the HoloLens webcam coordinate system, and the HoloLens API is used to obtain the fixed (stationary) coordinate system established during acquisition. In this way the holographically projected virtual object is accurately merged with the real object and a spatial anchor is fixed in the real scene, so the system achieves accurate, real-time, scene-aware augmented reality.
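
The abstract gives no implementation details for the acquisition side. As an illustration only, the sketch below uses the public pyrealsense2 API to grab a depth frame, deproject it into a point cloud, and stream the raw vertices to a listening client over TCP; the host name, port, and message framing are invented for the example and are not taken from the paper.

```python
import socket
import struct

import numpy as np
import pyrealsense2 as rs

# Start a depth stream on the external RealSense camera.
pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)

pc = rs.pointcloud()
try:
    frames = pipeline.wait_for_frames()
    depth = frames.get_depth_frame()

    # Deproject the depth frame into an Nx3 array of points (metres,
    # expressed in the RealSense depth-camera coordinate system).
    points = pc.calculate(depth)
    verts = np.asanyarray(points.get_vertices()).view(np.float32).reshape(-1, 3)

    # Push the point cloud to the HoloLens over the LAN: a 4-byte point count
    # followed by the float32 XYZ payload (host, port, and framing are
    # hypothetical choices for this sketch).
    with socket.create_connection(("hololens.local", 9000)) as sock:
        sock.sendall(struct.pack("<I", verts.shape[0]))
        sock.sendall(verts.tobytes())
finally:
    pipeline.stop()
```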
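
The calibration toolchain itself is not described in the abstract. One common way to realize this step is OpenCV's chessboard calibration, sketched below under the assumptions that matching corner detections from synchronized HoloLens-webcam and RealSense image pairs are already available and that, for simplicity, both streams share the same resolution; the function and argument names are illustrative, not from the paper.

```python
import cv2
import numpy as np

def calibrate_pair(objpoints, imgpoints_rs, imgpoints_holo, image_size):
    """Intrinsics for each camera, then the RealSense -> HoloLens-webcam extrinsic.

    objpoints      -- list of (N, 3) float32 chessboard corners in board coordinates
    imgpoints_rs   -- matching (N, 1, 2) corner detections in the RealSense images
    imgpoints_holo -- matching (N, 1, 2) corner detections in the HoloLens webcam images
    image_size     -- (width, height) of the calibration images
    """
    # Intrinsic (internal-parameter) calibration of each camera separately.
    _, K_rs, dist_rs, _, _ = cv2.calibrateCamera(
        objpoints, imgpoints_rs, image_size, None, None)
    _, K_holo, dist_holo, _, _ = cv2.calibrateCamera(
        objpoints, imgpoints_holo, image_size, None, None)

    # Joint calibration of the rigidly fixed pair: R, T map points expressed in
    # the RealSense camera frame into the HoloLens webcam frame.
    _, _, _, _, _, R, T, _, _ = cv2.stereoCalibrate(
        objpoints, imgpoints_rs, imgpoints_holo,
        K_rs, dist_rs, K_holo, dist_holo, image_size,
        flags=cv2.CALIB_FIX_INTRINSIC)

    # Pack the extrinsic as a 4x4 homogeneous matrix for the transform chain.
    T_holo_from_rs = np.eye(4)
    T_holo_from_rs[:3, :3] = R
    T_holo_from_rs[:3, 3] = T.ravel()
    return K_rs, dist_rs, K_holo, dist_holo, T_holo_from_rs
```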
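
The final coordinate-system chain can be written as a composition of 4x4 rigid transforms. The sketch below is a minimal illustration only: the webcam-to-world pose is assumed to be supplied externally (on HoloLens it would come from the locatable-camera and spatial-coordinate APIs), and the single Z-axis flip used to bridge the right-handed OSG/OpenCV convention and the left-handed DirectX convention, as well as its position in the chain, is an assumption about the concrete conventions involved.

```python
import numpy as np

# Flip the Z axis to convert between a right-handed (OSG/OpenCV) frame and the
# left-handed DirectX convention used by HoloLens (assumed convention; where
# this flip sits in the chain depends on how the webcam pose is defined).
FLIP_Z = np.diag([1.0, 1.0, -1.0, 1.0])

def realsense_to_world(points_rs, T_holo_from_rs, T_world_from_webcam):
    """Map Nx3 RealSense points into the HoloLens stationary world frame.

    T_holo_from_rs      -- 4x4 extrinsic from the joint calibration step
    T_world_from_webcam -- 4x4 webcam-to-world pose queried from the HoloLens
                           for the frame being rendered (supplied externally)
    """
    n = points_rs.shape[0]
    homogeneous = np.hstack([points_rs, np.ones((n, 1))])     # Nx4
    chain = T_world_from_webcam @ FLIP_Z @ T_holo_from_rs     # compose once
    return (chain @ homogeneous.T).T[:, :3]

# Purely illustrative numbers: one point a metre in front of the RealSense
# camera, identity extrinsic and identity webcam pose.
if __name__ == "__main__":
    p = np.array([[0.0, 0.0, 1.0]])
    print(realsense_to_world(p, np.eye(4), np.eye(4)))
```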