Authors: Sana Alamgeer, Mylène C. Q. Farias
DOI: 10.1109/ICMEW56448.2022.9859419
Venue: 2022 IEEE International Conference on Multimedia and Expo Workshops (ICMEW)
Published: 2022-07-18
No-Reference Light Field Image Quality Assessment Method Based on a Long-Short Term Memory Neural Network
Light Field (LF) cameras capture both angular and spatial information and, consequently, require a large amount of memory and bandwidth resources. To reduce these requirements, LF contents generally need to undergo compression and transmission protocols. Since these techniques may introduce distortions, the design of Light-Field Image Quality Assessment (LFI-IQA) methods is important for monitoring the quality of the LFI content at the user side. In this work, we present a No-Reference (NR) LFI-IQA method based on a Long Short-Term Memory Deep Neural Network (LSTM-DNN). The method is composed of two streams. The first stream extracts long-term-dependent, distortion-related features from horizontal epipolar plane images, while the second stream processes bottleneck features of micro-lens images. The outputs of both streams are fused and supplied to a regression operation that generates a scalar value as the predicted quality score. Results show that the proposed method is robust and accurate, outperforming several state-of-the-art LF-IQA methods.
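The two-stream design described in the abstract can be sketched roughly as follows. This is a minimal illustrative PyTorch skeleton, not the authors' implementation: all layer sizes, feature dimensions, and the choice of a dense embedding for the micro-lens bottleneck features are assumptions, and the feature-extraction front ends (EPI and micro-lens processing) are abstracted away as pre-computed inputs.

```python
import torch
import torch.nn as nn

class TwoStreamLFIQA(nn.Module):
    """Hypothetical sketch of the two-stream idea: an LSTM captures
    long-term dependencies over features from horizontal epipolar plane
    images (stream 1), a dense layer embeds bottleneck features of
    micro-lens images (stream 2), and the fused result is regressed
    to a scalar quality score. All dimensions are assumptions."""

    def __init__(self, epi_feat_dim=64, mli_feat_dim=128, hidden=32):
        super().__init__()
        # Stream 1: long-term dependencies along the EPI feature sequence
        self.lstm = nn.LSTM(epi_feat_dim, hidden, batch_first=True)
        # Stream 2: dense embedding of micro-lens bottleneck features
        self.mli_fc = nn.Sequential(nn.Linear(mli_feat_dim, hidden), nn.ReLU())
        # Fusion + regression head producing one quality score per sample
        self.head = nn.Sequential(
            nn.Linear(2 * hidden, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, epi_seq, mli_feat):
        # epi_seq: (batch, seq_len, epi_feat_dim); keep the last hidden state
        _, (h_n, _) = self.lstm(epi_seq)
        fused = torch.cat([h_n[-1], self.mli_fc(mli_feat)], dim=1)
        return self.head(fused).squeeze(1)  # (batch,) predicted scores

model = TwoStreamLFIQA()
scores = model(torch.randn(2, 10, 64), torch.randn(2, 128))
print(scores.shape)  # torch.Size([2])
```

Using the final LSTM hidden state as the sequence summary is one common choice; the paper itself may pool the stream outputs differently before fusion.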