Classification of breast lesions using cross-modal deep learning
Omer Hadad, R. Bakalo, Rami Ben-Ari, Sharbell Y. Hashoul, Guy Amit
2017 IEEE 14th International Symposium on Biomedical Imaging (ISBI 2017), pp. 109-112, April 2017. DOI: 10.1109/ISBI.2017.7950480
Citations: 53
Abstract
Automatic detection and classification of lesions in medical images is a desirable goal, with numerous clinical applications. In breast imaging, multiple modalities such as X-ray, ultrasound and MRI are often used in the diagnostic workflow. Training robust classifiers for each modality is challenging due to the typically small size of the available datasets. We propose to use cross-modal transfer learning to improve the robustness of the classifiers. We demonstrate the potential of this approach on the problem of identifying masses in breast MRI images, using a network that was trained on mammography images. A comparison between cross-modal and cross-domain transfer learning showed that the former improved the classification performance, with an overall accuracy of 0.93 versus 0.90, while the accuracy of de novo training was 0.94. Using transfer learning within the medical imaging domain may help to produce standard pre-trained shared models, which can be utilized to solve a variety of specific clinical problems.
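To make the cross-modal transfer-learning idea concrete, the sketch below shows one way such a setup could look in PyTorch: reuse the convolutional features of a network trained on mammography patches, replace the classification head, and fine-tune on breast MRI patches. The abstract does not specify the paper's architecture, patch size, layer-freezing policy, or file names, so the SmallCNN class, the "mammography_pretrained.pt" path, and the 64x64 patches here are illustrative assumptions, not the authors' actual implementation.

# Minimal cross-modal transfer-learning sketch (assumptions noted above).
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    """Toy patch classifier standing in for the mammography-trained source network."""
    def __init__(self, num_classes: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)  # assumes 64x64 input patches

    def forward(self, x):
        x = self.features(x)
        return self.classifier(torch.flatten(x, 1))

# 1. Load weights learned on the source modality (mammography); the path is hypothetical.
source_net = SmallCNN(num_classes=2)
# source_net.load_state_dict(torch.load("mammography_pretrained.pt"))

# 2. Keep the convolutional features, swap in a new head for the target
#    modality (breast MRI, mass vs. non-mass), and fine-tune.
target_net = source_net
target_net.classifier = nn.Linear(32 * 16 * 16, 2)

# Optionally freeze the earliest layers so only the upper layers and the new head adapt.
for p in target_net.features[:3].parameters():
    p.requires_grad = False

optimizer = torch.optim.SGD(
    filter(lambda p: p.requires_grad, target_net.parameters()), lr=1e-3, momentum=0.9
)
criterion = nn.CrossEntropyLoss()

# One fine-tuning step on a batch of MRI patches (dummy tensors stand in for real data).
mri_patches = torch.randn(8, 1, 64, 64)   # placeholder MRI patch batch
labels = torch.randint(0, 2, (8,))        # placeholder mass / non-mass labels
optimizer.zero_grad()
loss = criterion(target_net(mri_patches), labels)
loss.backward()
optimizer.step()

The key design choice illustrated here is that only the final layer is re-initialized for the new modality, which is what lets a small MRI dataset benefit from features learned on the larger mammography dataset; the cross-domain baseline in the paper would instead start from features learned on natural images.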