Shuting Liu, Guoliang Wei, Yi Fan, Lei Chen, Zhaodong Zhang
{"title":"多尺度特征交叉的多模态注册网络","authors":"Shuting Liu, Guoliang Wei, Yi Fan, Lei Chen, Zhaodong Zhang","doi":"10.1007/s11548-024-03258-0","DOIUrl":null,"url":null,"abstract":"<h3 data-test=\"abstract-sub-heading\">Purpose</h3><p>A critical piece of information for prostate intervention and cancer treatment is provided by the complementary medical imaging modalities of ultrasound (US) and magnetic resonance imaging (MRI). Therefore, MRI–US image fusion is often required during prostate examination to provide contrast-enhanced TRUS, in which image registration is a key step in multimodal image fusion.</p><h3 data-test=\"abstract-sub-heading\">Methods</h3><p>We propose a novel multi-scale feature-crossing network for the prostate MRI–US image registration task. We designed a feature-crossing module to enhance information flow in the hidden layer by integrating intermediate features between adjacent scales. Additionally, an attention block utilizing three-dimensional convolution interacts information between channels, improving the correlation between different modal features. We used 100 cases randomly selected from The Cancer Imaging Archive (TCIA) for our experiments. A fivefold cross-validation method was applied, dividing the dataset into five subsets. Four subsets were used for training, and one for testing, repeating this process five times to ensure each subset served as the test set once.</p><h3 data-test=\"abstract-sub-heading\">Results</h3><p>We test and evaluate our technique using fivefold cross-validation. The cross-validation trials result in a median target registration error of 2.20 mm on landmark centroids and a median Dice of 0.87 on prostate glands, both of which were better than the baseline model. In addition, the standard deviation of the dice similarity coefficient is 0.06, which suggests that the model is stable.</p><h3 data-test=\"abstract-sub-heading\">Conclusion</h3><p>We propose a novel multi-scale feature-crossing network for the prostate MRI–US image registration task. A random selection of 100 cases from The Cancer Imaging Archive (TCIA) was used to test and evaluate our approach using fivefold cross-validation. The experimental results showed that our method improves the registration accuracy. After registration, MRI and TURS images were more similar in structure and morphology, and the location and morphology of cancer were more accurately reflected in the images.</p>","PeriodicalId":51251,"journal":{"name":"International Journal of Computer Assisted Radiology and Surgery","volume":"24 1","pages":""},"PeriodicalIF":2.3000,"publicationDate":"2024-09-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Multimodal registration network with multi-scale feature-crossing\",\"authors\":\"Shuting Liu, Guoliang Wei, Yi Fan, Lei Chen, Zhaodong Zhang\",\"doi\":\"10.1007/s11548-024-03258-0\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<h3 data-test=\\\"abstract-sub-heading\\\">Purpose</h3><p>A critical piece of information for prostate intervention and cancer treatment is provided by the complementary medical imaging modalities of ultrasound (US) and magnetic resonance imaging (MRI). 
Therefore, MRI–US image fusion is often required during prostate examination to provide contrast-enhanced TRUS, in which image registration is a key step in multimodal image fusion.</p><h3 data-test=\\\"abstract-sub-heading\\\">Methods</h3><p>We propose a novel multi-scale feature-crossing network for the prostate MRI–US image registration task. We designed a feature-crossing module to enhance information flow in the hidden layer by integrating intermediate features between adjacent scales. Additionally, an attention block utilizing three-dimensional convolution interacts information between channels, improving the correlation between different modal features. We used 100 cases randomly selected from The Cancer Imaging Archive (TCIA) for our experiments. A fivefold cross-validation method was applied, dividing the dataset into five subsets. Four subsets were used for training, and one for testing, repeating this process five times to ensure each subset served as the test set once.</p><h3 data-test=\\\"abstract-sub-heading\\\">Results</h3><p>We test and evaluate our technique using fivefold cross-validation. The cross-validation trials result in a median target registration error of 2.20 mm on landmark centroids and a median Dice of 0.87 on prostate glands, both of which were better than the baseline model. In addition, the standard deviation of the dice similarity coefficient is 0.06, which suggests that the model is stable.</p><h3 data-test=\\\"abstract-sub-heading\\\">Conclusion</h3><p>We propose a novel multi-scale feature-crossing network for the prostate MRI–US image registration task. A random selection of 100 cases from The Cancer Imaging Archive (TCIA) was used to test and evaluate our approach using fivefold cross-validation. The experimental results showed that our method improves the registration accuracy. After registration, MRI and TURS images were more similar in structure and morphology, and the location and morphology of cancer were more accurately reflected in the images.</p>\",\"PeriodicalId\":51251,\"journal\":{\"name\":\"International Journal of Computer Assisted Radiology and Surgery\",\"volume\":\"24 1\",\"pages\":\"\"},\"PeriodicalIF\":2.3000,\"publicationDate\":\"2024-09-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Journal of Computer Assisted Radiology and Surgery\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://doi.org/10.1007/s11548-024-03258-0\",\"RegionNum\":3,\"RegionCategory\":\"医学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"ENGINEERING, BIOMEDICAL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Computer Assisted Radiology and Surgery","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1007/s11548-024-03258-0","RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"ENGINEERING, BIOMEDICAL","Score":null,"Total":0}
Multimodal registration network with multi-scale feature-crossing
Purpose
The complementary medical imaging modalities of ultrasound (US) and magnetic resonance imaging (MRI) provide critical information for prostate intervention and cancer treatment. Therefore, MRI–US image fusion is often required during prostate examination to provide contrast-enhanced transrectal ultrasound (TRUS), and image registration is a key step in this multimodal fusion.
Methods
We propose a novel multi-scale feature-crossing network for the prostate MRI–US image registration task. We designed a feature-crossing module that enhances information flow in the hidden layers by integrating intermediate features between adjacent scales. Additionally, an attention block based on three-dimensional convolution exchanges information between channels, improving the correlation between features of the different modalities. We used 100 cases randomly selected from The Cancer Imaging Archive (TCIA) for our experiments and applied fivefold cross-validation: the dataset was divided into five subsets, four used for training and one for testing, and the process was repeated five times so that each subset served as the test set once.
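The feature-crossing and channel-attention ideas described above can be sketched roughly as follows. This is a minimal, hypothetical PyTorch illustration, not the authors' published implementation; the module names, tensor shapes, and the use of trilinear resampling to merge adjacent scales are our own assumptions.

# Hypothetical sketch of a feature-crossing step between adjacent scales and a
# 3D-convolution channel-attention block, as suggested by the abstract.
# Shapes and design details are assumptions, not the published architecture.
import torch
import torch.nn as nn
import torch.nn.functional as F


class FeatureCrossing(nn.Module):
    """Merge intermediate features from two adjacent encoder scales."""

    def __init__(self, fine_ch, coarse_ch, out_ch):
        super().__init__()
        self.fuse = nn.Conv3d(fine_ch + coarse_ch, out_ch, kernel_size=3, padding=1)

    def forward(self, fine, coarse):
        # Upsample the coarser-scale feature map to the finer resolution,
        # then fuse the two scales with a 3D convolution.
        coarse_up = F.interpolate(coarse, size=fine.shape[2:], mode="trilinear",
                                  align_corners=False)
        return self.fuse(torch.cat([fine, coarse_up], dim=1))


class ChannelAttention3D(nn.Module):
    """Re-weight channels with a lightweight 3D-convolution attention gate."""

    def __init__(self, channels, reduction=4):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool3d(1),                        # global context per channel
            nn.Conv3d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv3d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.gate(x)                             # channel-wise scaling


# Toy usage: fuse MRI/US encoder features from two adjacent scales.
fine = torch.randn(1, 16, 32, 32, 32)     # finer-scale features
coarse = torch.randn(1, 32, 16, 16, 16)   # coarser-scale features
fused = FeatureCrossing(16, 32, 16)(fine, coarse)
attended = ChannelAttention3D(16)(fused)
print(attended.shape)                      # torch.Size([1, 16, 32, 32, 32])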
Results
We tested and evaluated our technique using fivefold cross-validation. The cross-validation trials yielded a median target registration error of 2.20 mm on landmark centroids and a median Dice similarity coefficient of 0.87 on prostate glands, both better than the baseline model. In addition, the standard deviation of the Dice similarity coefficient was 0.06, suggesting that the model is stable.
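For reference, the two reported metrics are standard and can be computed as in the short NumPy sketch below; the array shapes, voxel spacing, and example values are illustrative assumptions, not data from the study.

# Illustrative computation of the two reported metrics: Dice similarity
# coefficient on binary prostate masks and target registration error (TRE)
# between corresponding landmark centroids. All inputs are synthetic examples.
import numpy as np


def dice_coefficient(mask_a, mask_b):
    """Dice similarity coefficient between two binary 3D masks."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    intersection = np.logical_and(a, b).sum()
    return 2.0 * intersection / (a.sum() + b.sum())


def target_registration_error(points_fixed, points_warped, spacing_mm):
    """Euclidean distances (mm) between corresponding landmark centroids."""
    diff = (points_fixed - points_warped) * np.asarray(spacing_mm)
    return np.linalg.norm(diff, axis=1)


# Synthetic example (voxel coordinates, assumed 1 mm isotropic spacing).
mask_mri = np.zeros((64, 64, 64), bool); mask_mri[20:40, 20:40, 20:40] = True
mask_us = np.zeros((64, 64, 64), bool); mask_us[22:42, 20:40, 20:40] = True
landmarks_mri = np.array([[30.0, 30.0, 30.0]])
landmarks_us = np.array([[31.5, 30.5, 30.0]])

print(dice_coefficient(mask_mri, mask_us))                                   # ~0.90
print(np.median(target_registration_error(landmarks_mri, landmarks_us, (1, 1, 1))))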
Conclusion
We propose a novel multi-scale feature-crossing network for the prostate MRI–US image registration task. Our approach was tested and evaluated with fivefold cross-validation on 100 cases randomly selected from The Cancer Imaging Archive (TCIA). The experimental results showed that our method improves registration accuracy. After registration, the MRI and TRUS images were more similar in structure and morphology, and the location and morphology of the cancer were reflected more accurately in the images.
Journal introduction:
The International Journal for Computer Assisted Radiology and Surgery (IJCARS) is a peer-reviewed journal that provides a platform for closing the gap between medical and technical disciplines, and encourages interdisciplinary research and development activities in an international environment.