Multi-frame motion compensation using extrapolated frame by optical flow for lossless Video Coding
Yusuke Kameda, Hiroyuki Kishi, Tomokazu Ishikawa, I. Matsuda, S. Itoh
2016 IEEE International Symposium on Signal Processing and Information Technology (ISSPIT), December 2016
DOI: 10.1109/ISSPIT.2016.7886053
Citations: 5
Abstract
We propose an efficient motion compensation method based on a temporally extrapolated frame generated by pel-wise motion (optical flow) estimation. In traditional motion compensation methods, motion vectors are generally detected on a block-by-block basis and sent to the decoder as side information. However, such block-wise motion vectors are not always suitable for motions such as local scaling, rotation, and deformation. On the other hand, pel-wise motion can be estimated on both the encoder and decoder sides from two previously encoded successive frames, without any side information. The estimated pel-wise motion enables an extrapolated frame to be generated under the assumption of linear uniform motion within a short time period. This frame approximates the frame to be encoded. The proposed bi-prediction method uses the extrapolated frame as one of the reference frames. The experimental results indicate that the prediction performance of the proposed method is higher than that of the traditional method.
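The sketch below is only an illustration of the general idea described in the abstract, not the authors' implementation: it estimates dense (pel-wise) optical flow between two previously decoded frames with OpenCV's Farnebäck method (an assumed choice of estimator; the paper does not prescribe one) and extrapolates the next frame by continuing each pixel's motion one more step, under the linear uniform motion assumption. The backward-mapping step samples the flow at the target position, a common simplifying approximation.

```python
import numpy as np
import cv2  # OpenCV; Farneback flow is an illustrative choice, not the paper's estimator


def extrapolate_frame(prev2, prev1):
    """Extrapolate the frame following prev1 from two decoded frames.

    prev2, prev1: grayscale uint8 frames at times t-2 and t-1.
    Assumes each pixel keeps its velocity (linear uniform motion)
    over the short inter-frame interval.
    """
    # Dense (pel-wise) optical flow from frame t-2 to frame t-1.
    flow = cv2.calcOpticalFlowFarneback(prev2, prev1, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)

    h, w = prev1.shape
    xs, ys = np.meshgrid(np.arange(w, dtype=np.float32),
                         np.arange(h, dtype=np.float32))
    # Backward mapping: each pixel of the extrapolated frame t is fetched
    # from frame t-1 one motion step "upstream" along the flow field
    # (flow sampled at the target position as an approximation).
    map_x = xs - flow[..., 0]
    map_y = ys - flow[..., 1]
    extrapolated = cv2.remap(prev1, map_x, map_y, cv2.INTER_LINEAR,
                             borderMode=cv2.BORDER_REPLICATE)
    return extrapolated
```

In a bi-prediction setting as described above, such an extrapolated frame would serve as one of the two reference frames; because both encoder and decoder can derive it from already-decoded frames, no motion vectors need to be transmitted for it.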