{"title":"粘附细胞培养相衬图像的自动细胞分割","authors":"Guochang Ye, Mehmet Kaya","doi":"10.1109/BioSMART54244.2021.9677717","DOIUrl":null,"url":null,"abstract":"Cell segmentation is a critical step for performing image-based experimental analysis. This study proposes an efficient and accurate cell segmentation method. This image processing pipeline involving simple morphological operations automatically achieves cell segmentation for phase-contrast images. Manual/Visual cell segmentation serves as the control group to evaluate the proposed methodology's performance. Regarding the manual labeling data (156 images as ground truth), the proposed method achieves 90.07% as the average dice coefficient, 82.16% as the average intersection over union, and 6.52% as the average relative error on measuring cell growth area. Additionally, similar degrees of segmentation accuracy are observed on training a modified U-Net model (16848 images) individually with the ground truth and the generated data resulting from the proposed method. These results demonstrate good accuracy and high practicality of the proposed cell segmentation method capable of quantitating cell growth area and generating labeled data for deep learning cell segmentation techniques.","PeriodicalId":286026,"journal":{"name":"2021 4th International Conference on Bio-Engineering for Smart Technologies (BioSMART)","volume":"5 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Automated Cell Segmentation for Phase-Contrast Images of Adhesion Cell Culture\",\"authors\":\"Guochang Ye, Mehmet Kaya\",\"doi\":\"10.1109/BioSMART54244.2021.9677717\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Cell segmentation is a critical step for performing image-based experimental analysis. This study proposes an efficient and accurate cell segmentation method. This image processing pipeline involving simple morphological operations automatically achieves cell segmentation for phase-contrast images. Manual/Visual cell segmentation serves as the control group to evaluate the proposed methodology's performance. Regarding the manual labeling data (156 images as ground truth), the proposed method achieves 90.07% as the average dice coefficient, 82.16% as the average intersection over union, and 6.52% as the average relative error on measuring cell growth area. Additionally, similar degrees of segmentation accuracy are observed on training a modified U-Net model (16848 images) individually with the ground truth and the generated data resulting from the proposed method. 
These results demonstrate good accuracy and high practicality of the proposed cell segmentation method capable of quantitating cell growth area and generating labeled data for deep learning cell segmentation techniques.\",\"PeriodicalId\":286026,\"journal\":{\"name\":\"2021 4th International Conference on Bio-Engineering for Smart Technologies (BioSMART)\",\"volume\":\"5 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-12-08\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 4th International Conference on Bio-Engineering for Smart Technologies (BioSMART)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/BioSMART54244.2021.9677717\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 4th International Conference on Bio-Engineering for Smart Technologies (BioSMART)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/BioSMART54244.2021.9677717","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Cell segmentation is a critical step in image-based experimental analysis. This study proposes an efficient and accurate cell segmentation method: an image processing pipeline built from simple morphological operations that automatically segments cells in phase-contrast images. Manual (visual) cell segmentation serves as the control against which the proposed method is evaluated. Against manually labeled ground truth (156 images), the proposed method achieves an average Dice coefficient of 90.07%, an average intersection over union of 82.16%, and an average relative error of 6.52% in measuring cell growth area. Additionally, comparable segmentation accuracy is observed when a modified U-Net model is trained (16,848 images) separately on the ground-truth labels and on the labels generated by the proposed method. These results demonstrate the accuracy and practicality of the proposed cell segmentation method, which can quantify cell growth area and generate labeled data for deep-learning-based cell segmentation techniques.
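
The abstract does not spell out the individual pipeline steps, so the sketch below is illustrative only: the segmentation function assumes a local-variance texture threshold (Otsu) followed by simple morphological cleanup, which is one plausible reading of "simple morphological operations" on phase-contrast images and is not the authors' exact pipeline. The metric functions, by contrast, implement the standard definitions of the Dice coefficient, intersection over union, and relative error of the segmented growth area quoted above.

```python
# Hedged sketch: morphology-based cell/background segmentation for
# phase-contrast images, plus the evaluation metrics named in the abstract.
# The segmentation steps and parameters are assumptions for illustration.
import numpy as np
from scipy import ndimage as ndi
from skimage.filters import threshold_otsu
from skimage.morphology import disk, binary_closing, remove_small_objects


def segment_cells(gray: np.ndarray, window: int = 9, min_size: int = 100) -> np.ndarray:
    """Rough foreground mask via local variance + simple morphology (illustrative)."""
    img = gray.astype(float)
    # Local variance highlights textured cell regions against the flat background.
    mean = ndi.uniform_filter(img, size=window)
    mean_sq = ndi.uniform_filter(img ** 2, size=window)
    local_var = np.clip(mean_sq - mean ** 2, 0, None)
    mask = local_var > threshold_otsu(local_var)          # global Otsu threshold
    mask = binary_closing(mask, disk(3))                  # bridge small gaps
    mask = ndi.binary_fill_holes(mask)                    # fill enclosed holes
    return remove_small_objects(mask, min_size=min_size)  # drop speckle noise


def dice(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice coefficient: 2|A∩B| / (|A| + |B|) for boolean masks."""
    inter = np.logical_and(pred, truth).sum()
    return 2.0 * inter / (pred.sum() + truth.sum())


def iou(pred: np.ndarray, truth: np.ndarray) -> float:
    """Intersection over union: |A∩B| / |A∪B| for boolean masks."""
    inter = np.logical_and(pred, truth).sum()
    union = np.logical_or(pred, truth).sum()
    return inter / union


def relative_area_error(pred: np.ndarray, truth: np.ndarray) -> float:
    """Relative error of the measured cell growth area (foreground pixel count)."""
    return abs(float(pred.sum()) - float(truth.sum())) / float(truth.sum())
```

A predicted mask from `segment_cells` can be scored against a manually labeled mask with `dice`, `iou`, and `relative_area_error`, mirroring how the 90.07%, 82.16%, and 6.52% figures in the abstract are reported.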