Md Nur-A Alam , Khandaker Mohammad Mohi Uddin , Md Mahbubur Rahman , M.M.R. Manu , Mostofa Kamal Nasir
{"title":"利用超分辨率深度融合特征从超声波图像中检测乳腺癌的新型自动系统","authors":"Md Nur-A Alam , Khandaker Mohammad Mohi Uddin , Md Mahbubur Rahman , M.M.R. Manu , Mostofa Kamal Nasir","doi":"10.1016/j.ibmed.2024.100149","DOIUrl":null,"url":null,"abstract":"<div><p>Cancer patients can benefit from early detection and diagnosis. This study proposes a machine vision strategy for detecting breast cancer in ultrasound images and correcting several ultrasound difficulties such artifacts, speckle noise, and blurring. In quantitative evolution, edge preservation criteria were discovered to be superior to standard ones for hybrid anisotropic diffusion. A learnable super-resolution (SR) is inserted in the deep CNN to dig for extra possible information. The feature is fused with a pre-trained deep CNN model utilizing Gabor Wavelet Transform (GWT) and Local Binary Pattern (LBP). Machine learning (ML) techniques that are used to create these recommendation systems require well-balanced data in terms of class distribution, however most datasets in the real world are imbalanced. Imbalanced data forces a classifier to concentrate on the majority class while ignoring other classes of interest, lowering the predicted performance of any classification model. We propose a generative adversarial networks (GAN) strategy to overcome the data imbalance problem and improve the performance of recommendation systems in this research. Standard data is used to train this model, which assures a high level of resolution. In the testing phase, generalized data of varied resolution is used to evaluate the model's performance. It is discovered through cross-validation that a 5-fold method can successfully eliminate the overfitting problem. With an accuracy of 99.48 %, this suggested feature fusion technique indicates satisfactory performance when compared to current related works. Finally finding cancer region, researcher used U-Net architecture and extract cancer region from BC ultrasound images.</p></div>","PeriodicalId":73399,"journal":{"name":"Intelligence-based medicine","volume":"10 ","pages":"Article 100149"},"PeriodicalIF":0.0000,"publicationDate":"2024-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S2666521224000164/pdfft?md5=b686b67a89246f188c3f0ac0748f2cab&pid=1-s2.0-S2666521224000164-main.pdf","citationCount":"0","resultStr":"{\"title\":\"A novel automated system to detect breast cancer from ultrasound images using deep fused features with super resolution\",\"authors\":\"Md Nur-A Alam , Khandaker Mohammad Mohi Uddin , Md Mahbubur Rahman , M.M.R. Manu , Mostofa Kamal Nasir\",\"doi\":\"10.1016/j.ibmed.2024.100149\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>Cancer patients can benefit from early detection and diagnosis. This study proposes a machine vision strategy for detecting breast cancer in ultrasound images and correcting several ultrasound difficulties such artifacts, speckle noise, and blurring. In quantitative evolution, edge preservation criteria were discovered to be superior to standard ones for hybrid anisotropic diffusion. A learnable super-resolution (SR) is inserted in the deep CNN to dig for extra possible information. The feature is fused with a pre-trained deep CNN model utilizing Gabor Wavelet Transform (GWT) and Local Binary Pattern (LBP). 
Machine learning (ML) techniques that are used to create these recommendation systems require well-balanced data in terms of class distribution, however most datasets in the real world are imbalanced. Imbalanced data forces a classifier to concentrate on the majority class while ignoring other classes of interest, lowering the predicted performance of any classification model. We propose a generative adversarial networks (GAN) strategy to overcome the data imbalance problem and improve the performance of recommendation systems in this research. Standard data is used to train this model, which assures a high level of resolution. In the testing phase, generalized data of varied resolution is used to evaluate the model's performance. It is discovered through cross-validation that a 5-fold method can successfully eliminate the overfitting problem. With an accuracy of 99.48 %, this suggested feature fusion technique indicates satisfactory performance when compared to current related works. Finally finding cancer region, researcher used U-Net architecture and extract cancer region from BC ultrasound images.</p></div>\",\"PeriodicalId\":73399,\"journal\":{\"name\":\"Intelligence-based medicine\",\"volume\":\"10 \",\"pages\":\"Article 100149\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.sciencedirect.com/science/article/pii/S2666521224000164/pdfft?md5=b686b67a89246f188c3f0ac0748f2cab&pid=1-s2.0-S2666521224000164-main.pdf\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Intelligence-based medicine\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S2666521224000164\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Intelligence-based medicine","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2666521224000164","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
A novel automated system to detect breast cancer from ultrasound images using deep fused features with super resolution
Cancer patients can benefit from early detection and diagnosis. This study proposes a machine vision strategy for detecting breast cancer in ultrasound images while correcting several common ultrasound problems such as artifacts, speckle noise, and blurring. In quantitative evaluation, the edge-preservation criteria of hybrid anisotropic diffusion were found to be superior to standard criteria. A learnable super-resolution (SR) module is inserted into the deep CNN to extract additional latent information. These features are fused with a pre-trained deep CNN model using the Gabor Wavelet Transform (GWT) and Local Binary Pattern (LBP). Machine learning (ML) techniques used to build such classification systems require data with a well-balanced class distribution, yet most real-world datasets are imbalanced. Imbalanced data forces a classifier to concentrate on the majority class while ignoring other classes of interest, lowering the predictive performance of any classification model. We propose a generative adversarial network (GAN) strategy to overcome the data imbalance problem and improve classification performance. The model is trained on standard data, which assures a high level of resolution, and is evaluated in the testing phase on generalized data of varying resolution. Five-fold cross-validation is found to successfully mitigate the overfitting problem. With an accuracy of 99.48 %, the proposed feature fusion technique shows satisfactory performance compared with current related work. Finally, a U-Net architecture is used to locate and extract the cancerous region from breast cancer (BC) ultrasound images.
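To make the fusion step concrete, below is a minimal sketch of combining handcrafted LBP and Gabor texture features with deep CNN features, as the abstract describes. It assumes a grayscale ultrasound image given as a NumPy array; the function `deep_cnn_features` is a hypothetical placeholder for a pre-trained CNN backbone, not the authors' exact model, and the feature dimensions and Gabor frequencies are illustrative choices.

```python
# Sketch of handcrafted + deep feature fusion for ultrasound images.
# Assumptions: grayscale image as a 2-D NumPy array; `deep_cnn_features`
# is a hypothetical stand-in for a pre-trained deep CNN backbone.
import numpy as np
from skimage.feature import local_binary_pattern
from skimage.filters import gabor


def lbp_histogram(image, points=8, radius=1):
    """Uniform LBP codes summarized as a normalized histogram."""
    codes = local_binary_pattern(image, points, radius, method="uniform")
    bins = points + 2  # uniform LBP yields P + 2 distinct codes
    hist, _ = np.histogram(codes, bins=bins, range=(0, bins), density=True)
    return hist


def gabor_features(image, frequencies=(0.1, 0.3, 0.5)):
    """Mean and std of Gabor response magnitudes at several frequencies."""
    feats = []
    for freq in frequencies:
        real, imag = gabor(image, frequency=freq)
        magnitude = np.hypot(real, imag)
        feats.extend([magnitude.mean(), magnitude.std()])
    return np.asarray(feats)


def deep_cnn_features(image):
    """Hypothetical placeholder for a pre-trained deep CNN embedding."""
    return np.zeros(128)  # illustrative 128-dimensional embedding


def fused_feature_vector(image):
    """Concatenate handcrafted (LBP + Gabor) and deep CNN features."""
    return np.concatenate(
        [lbp_histogram(image), gabor_features(image), deep_cnn_features(image)]
    )


if __name__ == "__main__":
    ultrasound = np.random.rand(128, 128)  # stand-in for a denoised ultrasound slice
    print(fused_feature_vector(ultrasound).shape)
```

The fused vector would then feed the downstream classifier; in the paper's pipeline this happens after denoising and GAN-based class balancing, which are not shown here.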