3D evaluation model of facial aesthetics based on multi-input 3D convolution neural networks for orthognathic surgery
Qingchuan Ma, Etsuko Kobayashi, Siao Jin, Ken Masamune, Hideyuki Suenaga
International Journal of Medical Robotics and Computer Assisted Surgery, 20(3), published 2024-06-14. DOI: 10.1002/rcs.2651 (https://onlinelibrary.wiley.com/doi/10.1002/rcs.2651)
Abstract
Background
Quantitative evaluation of facial aesthetics is an important but time-consuming procedure in orthognathic surgery, whereas existing 2D beauty-scoring models are designed mainly for entertainment and have little clinical impact.
Methods
A deep-learning-based 3D evaluation model, DeepBeauty3D, was designed and trained on CT images from 133 patients. A customised image-preprocessing module extracted the skeleton, soft tissue, and personal physical information from the raw DICOM data, and the prediction network module employed a three-input, two-output convolutional neural network (CNN) that received these data and output aesthetic scores automatically.
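To illustrate the three-input, two-output design described above, the following is a minimal sketch in PyTorch. The layer sizes, input resolution, fusion strategy, and the choice of physical-information features are illustrative assumptions and do not reproduce the published DeepBeauty3D architecture.

```python
# Hypothetical sketch of a three-input, two-output 3D CNN (not the paper's exact model).
import torch
import torch.nn as nn


class DeepBeauty3DSketch(nn.Module):
    """Toy multi-input 3D CNN: two volumetric branches plus one vector branch."""

    def __init__(self, phys_dim: int = 4):
        super().__init__()

        # Small 3D convolutional branch shared in structure by both volume inputs.
        def branch():
            return nn.Sequential(
                nn.Conv3d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool3d(2),
                nn.Conv3d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool3d(1), nn.Flatten(),  # -> (N, 16)
            )

        self.skeleton_branch = branch()
        self.soft_tissue_branch = branch()
        # Branch for personal physical information (assumed features, e.g. age, sex).
        self.phys_branch = nn.Sequential(nn.Linear(phys_dim, 16), nn.ReLU())
        # Two regression heads: skeleton score and soft-tissue score.
        self.skeleton_head = nn.Linear(16 * 3, 1)
        self.soft_tissue_head = nn.Linear(16 * 3, 1)

    def forward(self, skeleton_vol, soft_tissue_vol, phys_info):
        # Each volume is (N, 1, D, H, W); phys_info is (N, phys_dim).
        fused = torch.cat(
            [
                self.skeleton_branch(skeleton_vol),
                self.soft_tissue_branch(soft_tissue_vol),
                self.phys_branch(phys_info),
            ],
            dim=1,
        )
        return self.skeleton_head(fused), self.soft_tissue_head(fused)


# Usage with random tensors standing in for preprocessed CT volumes.
model = DeepBeauty3DSketch()
skel = torch.randn(2, 1, 32, 32, 32)
soft = torch.randn(2, 1, 32, 32, 32)
phys = torch.randn(2, 4)
skeleton_score, soft_tissue_score = model(skel, soft, phys)
print(skeleton_score.shape, soft_tissue_score.shape)  # torch.Size([2, 1]) each
```

In this sketch the two volumetric branches and the physical-information branch are fused by simple feature concatenation before the two score heads; the actual fusion strategy used in DeepBeauty3D is not specified in the abstract.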
Results
Experimental results showed that the model predicted the skeleton and soft-tissue scores with accuracies of 0.231 ± 0.218 (4.62%) and 0.100 ± 0.344 (2.00%), respectively, in 11.203 ± 2.824 s from raw CT images.
Conclusion
This study provides an end-to-end, 3D-CNN-based solution trained on real clinical data to quantitatively evaluate facial aesthetics by considering three anatomical factors simultaneously, showing promising potential for reducing workload and bridging the gap between surgeons' and patients' aesthetic perspectives.
Journal introduction:
The International Journal of Medical Robotics and Computer Assisted Surgery provides a cross-disciplinary platform for presenting the latest developments in robotics and computer assisted technologies for medical applications. The journal publishes cutting-edge papers and expert reviews, complemented by commentaries, correspondence and conference highlights that stimulate discussion and exchange of ideas. Areas of interest include robotic surgery aids and systems, operative planning tools, medical imaging and visualisation, simulation and navigation, virtual reality, intuitive command and control systems, haptics and sensor technologies. In addition to research and surgical planning studies, the journal welcomes papers detailing clinical trials and applications of computer-assisted workflows and robotic systems in neurosurgery, urology, paediatric, orthopaedic, craniofacial, cardiovascular, thoraco-abdominal, musculoskeletal and visceral surgery. Articles providing critical analysis of clinical trials, assessment of the benefits and risks of the application of these technologies, commenting on ease of use, or addressing surgical education and training issues are also encouraged. The journal aims to foster a community that encompasses medical practitioners, researchers, and engineers and computer scientists developing robotic systems and computational tools in academic and commercial environments, with the intention of promoting and developing these exciting areas of medical technology.