{"title":"Multi-scale implicit transformer with re-parameterization for arbitrary-scale super-resolution","authors":"Jinchen Zhu, Mingjian Zhang, Ling Zheng, Shizhuang Weng","doi":"10.1016/j.patcog.2024.111327","DOIUrl":null,"url":null,"abstract":"<div><div>Methods based on implicit neural representations have recently exhibited excellent capabilities for arbitrary-scale super-resolution (ASSR). Although these methods represent the features of an image by generating latent codes, these latent codes are difficult to adapt to the different magnification factors of super-resolution (SR) imaging, seriously affecting their performance. To address this issue, we design a multi-scale implicit transformer (MSIT) that consists of a multi-scale neural operator (MSNO) and multi-scale self-attention (MSSA). MSNO obtains multi-scale latent codes through feature enhancement, multi-scale characteristic extraction, and multi-scale characteristic merging. MSSA further enhances the multi-scale characteristics of latent codes, resulting in improved performance. Furthermore, we propose the re-interaction module combined with a cumulative training strategy to improve the diversity of learned information for the network during training. We have systematically introduced multi-scale characteristics for the first time into ASSR. Extensive experiments are performed to validate the effectiveness of MSIT, and our method achieves state-of-the-art performance in ASSR tasks.</div></div>","PeriodicalId":49713,"journal":{"name":"Pattern Recognition","volume":"162 ","pages":"Article 111327"},"PeriodicalIF":7.5000,"publicationDate":"2025-01-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Pattern Recognition","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0031320324010781","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Abstract
Methods based on implicit neural representations have recently exhibited excellent capabilities for arbitrary-scale super-resolution (ASSR). Although these methods represent image features by generating latent codes, the latent codes adapt poorly to the different magnification factors of super-resolution (SR) imaging, which seriously degrades performance. To address this issue, we design a multi-scale implicit transformer (MSIT) consisting of a multi-scale neural operator (MSNO) and multi-scale self-attention (MSSA). MSNO obtains multi-scale latent codes through feature enhancement, multi-scale characteristic extraction, and multi-scale characteristic merging. MSSA further enhances the multi-scale characteristics of the latent codes, yielding improved performance. Furthermore, we propose a re-interaction module, combined with a cumulative training strategy, to increase the diversity of the information the network learns during training. To the best of our knowledge, this is the first work to systematically introduce multi-scale characteristics into ASSR. Extensive experiments validate the effectiveness of MSIT, and our method achieves state-of-the-art performance in ASSR tasks.
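To make the multi-scale latent-code idea concrete, the sketch below gives a minimal, hypothetical PyTorch rendering: parallel dilated convolutions extract characteristics at several receptive-field sizes, a 1x1 convolution merges them into a single latent-code map, and `grid_sample` queries the codes at arbitrary continuous coordinates, which is what permits non-integer magnification factors. All names (`MultiScaleLatentEncoder`, `query_latent`) and design choices here are illustrative assumptions, not the authors' MSNO/MSSA implementation.

```python
import torch
import torch.nn as nn

class MultiScaleLatentEncoder(nn.Module):
    """Hypothetical sketch of multi-scale latent-code extraction:
    parallel convolutions at several dilation rates, merged by a
    1x1 convolution. Not the paper's actual MSNO module."""
    def __init__(self, channels=64, dilations=(1, 2, 3)):
        super().__init__()
        # One branch per scale; dilation widens the receptive field
        # while padding=d keeps the spatial size unchanged.
        self.branches = nn.ModuleList(
            nn.Conv2d(channels, channels, kernel_size=3, padding=d, dilation=d)
            for d in dilations
        )
        # Merge the multi-scale characteristics into one latent-code map.
        self.merge = nn.Conv2d(channels * len(dilations), channels, kernel_size=1)

    def forward(self, feat):
        multi = torch.cat([b(feat) for b in self.branches], dim=1)
        return self.merge(multi)

def query_latent(codes, coords):
    """Sample latent codes at arbitrary (x, y) coordinates in [-1, 1];
    this implicit-representation step is what enables arbitrary scales."""
    # coords: (N, H_out, W_out, 2); grid_sample interpolates the code map.
    return torch.nn.functional.grid_sample(
        codes, coords, mode='bilinear', align_corners=False)

if __name__ == "__main__":
    enc = MultiScaleLatentEncoder()
    feat = torch.randn(1, 64, 48, 48)   # backbone features of an LR image
    codes = enc(feat)                   # multi-scale latent codes
    # Query at a 3.5x output resolution (an arbitrary, non-integer scale).
    h, w = int(48 * 3.5), int(48 * 3.5)
    ys = torch.linspace(-1, 1, h)
    xs = torch.linspace(-1, 1, w)
    grid = torch.stack(torch.meshgrid(ys, xs, indexing='ij'), dim=-1)
    grid = grid.flip(-1).unsqueeze(0)   # grid_sample expects (x, y) order
    sampled = query_latent(codes, grid)
    print(sampled.shape)                # torch.Size([1, 64, 168, 168])
```

Because only the query grid fixes the output resolution, the same trained encoder serves any magnification factor; a decoder (in the paper, the transformer with MSSA) would then map each sampled code to RGB values.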
Journal Introduction
The field of pattern recognition is both mature and rapidly evolving, playing a crucial role in related areas such as computer vision, image processing, text analysis, and neural networks. It intersects closely with machine learning and is being applied in emerging areas such as biometrics, bioinformatics, multimedia data analysis, and data science. The journal Pattern Recognition, established half a century ago in the early days of computer science, has since grown significantly in scope and influence.