{"title":"基于参考的图像超分辨率残差通道注意连接网络","authors":"Ruirong Lin, Nangfeng Xiao","doi":"10.1109/ICCSS53909.2021.9722011","DOIUrl":null,"url":null,"abstract":"Compared with single image super-resolution (SISR), reference-based image super-resolution (RefSR) utilizes additional references (Ref) to recover more realistic texture details, achieving better reconstruction performance. Most recent works focus on transferring relevant texture features from Ref to low-resolution (LR) images. However, those works ignore the high-frequency information existing in the LR space, leading to performance degradation when irrelevant Ref images are given. To address this issue, we propose a residual channel attention connection network for reference-based image super-resolution (RCACSR), which fuses valuable high-frequency information in LR space with high-resolution (HR) texture details of Ref. Specifically, the proposed residual channel attention connection network (RCACN) can extract more complex features from the LR space. Moreover, an enhanced texture transformer is presented, which can search and transfer texture features more accurately from Ref. Extensive experiments have demonstrated that the proposed RCACSR is superior to the state-of-the-art approaches in the aspects of both quantitative and qualitative measurements.","PeriodicalId":435816,"journal":{"name":"2021 8th International Conference on Information, Cybernetics, and Computational Social Systems (ICCSS)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-12-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Residual Channel Attention Connection Network for Reference-based Image Super-resolution\",\"authors\":\"Ruirong Lin, Nangfeng Xiao\",\"doi\":\"10.1109/ICCSS53909.2021.9722011\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Compared with single image super-resolution (SISR), reference-based image super-resolution (RefSR) utilizes additional references (Ref) to recover more realistic texture details, achieving better reconstruction performance. Most recent works focus on transferring relevant texture features from Ref to low-resolution (LR) images. However, those works ignore the high-frequency information existing in the LR space, leading to performance degradation when irrelevant Ref images are given. To address this issue, we propose a residual channel attention connection network for reference-based image super-resolution (RCACSR), which fuses valuable high-frequency information in LR space with high-resolution (HR) texture details of Ref. Specifically, the proposed residual channel attention connection network (RCACN) can extract more complex features from the LR space. Moreover, an enhanced texture transformer is presented, which can search and transfer texture features more accurately from Ref. 
Extensive experiments have demonstrated that the proposed RCACSR is superior to the state-of-the-art approaches in the aspects of both quantitative and qualitative measurements.\",\"PeriodicalId\":435816,\"journal\":{\"name\":\"2021 8th International Conference on Information, Cybernetics, and Computational Social Systems (ICCSS)\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-12-10\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 8th International Conference on Information, Cybernetics, and Computational Social Systems (ICCSS)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICCSS53909.2021.9722011\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 8th International Conference on Information, Cybernetics, and Computational Social Systems (ICCSS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICCSS53909.2021.9722011","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract: Compared with single image super-resolution (SISR), reference-based image super-resolution (RefSR) exploits an additional reference (Ref) image to recover more realistic texture details, achieving better reconstruction performance. Most recent works focus on transferring relevant texture features from the Ref image to the low-resolution (LR) image. However, these works ignore the high-frequency information already present in the LR space, which degrades performance when irrelevant Ref images are given. To address this issue, we propose a residual channel attention connection network for reference-based image super-resolution (RCACSR), which fuses the valuable high-frequency information in the LR space with the high-resolution (HR) texture details of the Ref image. Specifically, the proposed residual channel attention connection network (RCACN) extracts more complex features from the LR space. Moreover, an enhanced texture transformer is presented, which searches and transfers texture features from the Ref image more accurately. Extensive experiments demonstrate that the proposed RCACSR is superior to state-of-the-art approaches in terms of both quantitative and qualitative measurements.
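The abstract does not detail the RCACN architecture. As a rough illustration only, the sketch below assumes the residual channel attention block follows the widely used RCAN-style design (convolution, ReLU, convolution, squeeze-and-excitation channel attention, residual skip); all class names and hyperparameters are hypothetical placeholders, not the paper's implementation.

```python
# Minimal sketch of a residual channel attention block (hypothetical; the
# paper's exact RCACN design is not specified in the abstract).
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        # Squeeze: global average pooling; excite: two 1x1 convs with a bottleneck.
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Per-channel weights in [0, 1] rescale the feature maps.
        return x * self.fc(self.pool(x))

class ResidualChannelAttentionBlock(nn.Module):
    def __init__(self, channels: int = 64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            ChannelAttention(channels),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # The residual skip keeps high-frequency LR information flowing
        # forward while the attention path emphasizes informative channels.
        return x + self.body(x)

if __name__ == "__main__":
    feat = torch.randn(1, 64, 40, 40)         # LR feature map
    block = ResidualChannelAttentionBlock(64)
    print(block(feat).shape)                  # torch.Size([1, 64, 40, 40])
```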
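Likewise, the enhanced texture transformer is only named in the abstract. The following sketch shows the generic search-and-transfer step used by earlier texture transformers such as TTSR (patch-wise cosine similarity between LR and Ref features, hard selection of the best-matching Ref patch, and a confidence map for later fusion); the function and variable names are illustrative assumptions, not the paper's enhanced variant.

```python
# Hypothetical sketch of the "search and transfer" step of a texture transformer.
import torch
import torch.nn.functional as F

def search_and_transfer(lr_feat: torch.Tensor, ref_feat: torch.Tensor, patch: int = 3):
    """For each LR patch, find the most similar Ref patch and transfer it.

    lr_feat, ref_feat: (B, C, H, W) feature maps at the same scale.
    Returns the transferred texture feature and a relevance (confidence) map.
    """
    b, c, h, w = lr_feat.shape
    pad = patch // 2

    # Unfold into patch vectors: (B, C*patch*patch, H*W).
    lr_patches = F.unfold(lr_feat, kernel_size=patch, padding=pad)
    ref_patches = F.unfold(ref_feat, kernel_size=patch, padding=pad)

    # Normalize so the inner product becomes a cosine similarity.
    lr_norm = F.normalize(lr_patches, dim=1)
    ref_norm = F.normalize(ref_patches, dim=1)

    # Relevance between every LR position and every Ref position: (B, H*W, H*W).
    relevance = torch.bmm(lr_norm.transpose(1, 2), ref_norm)
    score, index = relevance.max(dim=2)        # best Ref patch per LR position

    # Gather the matched Ref patches and fold them back into a feature map.
    gathered = torch.gather(
        ref_patches, 2, index.unsqueeze(1).expand(-1, ref_patches.size(1), -1)
    )
    transferred = F.fold(gathered, output_size=(h, w), kernel_size=patch, padding=pad)
    overlap = F.fold(
        torch.ones_like(gathered), output_size=(h, w), kernel_size=patch, padding=pad
    )
    transferred = transferred / overlap        # average overlapping patches

    confidence = score.view(b, 1, h, w)        # can weight the fused texture downstream
    return transferred, confidence
```

In this generic formulation the confidence map is what limits damage from an irrelevant Ref image: low similarity scores down-weight the transferred texture, so the network falls back on the LR features themselves.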