Yun Wang, Wanru Chang, Chongfei Huang, Dexing Kong
Multiscale unsupervised network for deformable image registration.
Background: Deformable image registration (DIR) plays an important part in many clinical tasks, and deep learning has made significant progress in DIR over the past few years.
Objective: To propose a fast multiscale unsupervised deformable image registration method (referred to as FMIRNet) for monomodal image registration.
Methods: We designed a multiscale fusion module that estimates large displacement fields by combining and refining the deformation fields from three scales. A spatial attention mechanism in the fusion module weights the displacement field pixel by pixel. In addition to mean squared error (MSE), we added a structural similarity (SSIM) term to the training loss to enhance the structural consistency between the deformed images and the fixed images.
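The fusion step described above can be sketched as a per-pixel softmax-weighted combination of the three displacement fields after upsampling to a common resolution. The array shapes, the softmax form of the attention, and the function names below are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def softmax(x, axis=0):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def fuse_fields(fields, attention_logits):
    # fields: (3, 2, H, W) candidate displacement fields from three scales,
    # already upsampled to full resolution.
    # attention_logits: (3, H, W) per-pixel scores from an attention branch.
    w = softmax(attention_logits, axis=0)      # pixel-wise weights, sum to 1
    return (fields * w[:, None]).sum(axis=0)   # (2, H, W) fused field
```

With uniform attention logits this reduces to a plain average of the three fields; learned logits let the network favour a different scale at each pixel.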
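The training objective can be sketched as a weighted sum of MSE and (1 − SSIM). For brevity this sketch computes SSIM globally over the whole image rather than averaged over local windows, and the weighting factor `alpha` is a hypothetical choice, not a value from the paper.

```python
import numpy as np

def mse(a, b):
    # Mean squared intensity error between warped and fixed images.
    return float(np.mean((a - b) ** 2))

def global_ssim(a, b, c1=0.01**2, c2=0.03**2):
    # Simplified image-global SSIM for intensities in [0, 1]; a windowed
    # SSIM would average this statistic over local patches instead.
    mu_a, mu_b = a.mean(), b.mean()
    var_a, var_b = a.var(), b.var()
    cov = ((a - mu_a) * (b - mu_b)).mean()
    return float(((2 * mu_a * mu_b + c1) * (2 * cov + c2)) /
                 ((mu_a**2 + mu_b**2 + c1) * (var_a + var_b + c2)))

def registration_loss(warped, fixed, alpha=0.5):
    # Minimize MSE while maximizing SSIM (so we add 1 - SSIM).
    return mse(warped, fixed) + alpha * (1.0 - global_ssim(warped, fixed))
```

The loss is zero when the warped and fixed images coincide, and the SSIM term penalizes structural (luminance/contrast/covariance) mismatches that plain MSE can miss.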
Results: Our registration method was evaluated on EchoNet, CHAOS and SLIVER, and achieved consistent improvements in SSIM, NCC and NMI scores. Furthermore, we integrated FMIRNet into segmentation networks (FCN, UNet) to boost segmentation on a dataset with few manual annotations in our joint learning frameworks. The experimental results indicated that the joint segmentation methods improved Dice, HD and ASSD scores.
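The label-generation idea behind the joint framework, i.e. propagating a manual annotation to unlabeled images through the predicted deformation, can be illustrated by warping a label mask with a dense displacement field. The nearest-neighbour sampling and array shapes below are assumptions for illustration, not the paper's exact transform layer.

```python
import numpy as np

def warp_mask(mask, flow):
    # Warp an integer label mask with a dense displacement field using
    # nearest-neighbour sampling, so warped labels stay discrete.
    # mask: (H, W) integer labels; flow: (2, H, W) per-pixel (dy, dx) offsets.
    h, w = mask.shape
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    src_y = np.clip(np.rint(ys + flow[0]).astype(int), 0, h - 1)
    src_x = np.clip(np.rint(xs + flow[1]).astype(int), 0, w - 1)
    return mask[src_y, src_x]
```

Warped masks produced this way can serve as pseudo-labels for training a segmentation network when manual annotations are scarce.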
Conclusions: Our proposed FMIRNet is effective for large deformation estimation, and its registration capability generalizes robustly within joint registration-and-segmentation frameworks, generating reliable labels for training segmentation tasks.