1
Kim P, Seo B, De Silva H. Concordance of clinician, Chat-GPT4, and ORAD diagnoses against histopathology in odontogenic keratocysts and tumours: a 15-year New Zealand retrospective study. Oral Maxillofac Surg 2024; 28:1557-1569. [PMID: 39060850 DOI: 10.1007/s10006-024-01284-5]
Abstract
BACKGROUND This research aimed to investigate the concordance between clinical impressions and histopathologic diagnoses made by clinicians and artificial intelligence tools for odontogenic keratocysts (OKC) and odontogenic tumours (OT) in a New Zealand population from 2008 to 2023. METHODS Histopathological records from the Oral Pathology Centre, University of Otago (2008-2023) were examined to identify OKCs and OTs. Specimen referral details, histopathologic reports, and clinician differential diagnoses, as well as those provided by ORAD and Chat-GPT4, were documented. Data were analyzed using SPSS, and concordance between provisional and histopathologic diagnoses was ascertained. RESULTS Of the 34,225 biopsies, 302 and 321 samples were identified as OTs and OKCs, respectively. Concordance rates were 43.2% for clinicians, 45.6% for ORAD, and 41.4% for Chat-GPT4; the corresponding kappa values against the histological diagnosis were 0.23, 0.13, and 0.14. Surgeons achieved a higher concordance rate (47.7%) than non-surgeons (29.82%). Odds ratios of a concordant diagnosis using Chat-GPT4 and ORAD were between 1.4 and 2.8 (p < 0.05). ROC-AUC and PR-AUC were similar between the groups for ameloblastoma (clinician 0.62/0.42, ORAD 0.58/0.28, Chat-GPT4 0.63/0.37) and for OKC (clinician 0.64/0.78, ORAD 0.66/0.77, Chat-GPT4 0.60/0.71). CONCLUSION Clinicians with surgical training achieved higher concordance rates for OT and OKC. Chat-GPT4 and the Bayesian approach (ORAD) have shown potential to enhance diagnostic capabilities.
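The abstract above reports both raw concordance rates and Cohen's kappa, which corrects agreement for chance; the gap between the two (e.g., 43.2% concordance but kappa only 0.23) is exactly what kappa is designed to expose. A minimal sketch of the statistic on toy labels (the lesion labels and values here are illustrative, not the study's data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement corrected for the agreement
    expected by chance from each rater's label frequencies."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c]
                   for c in counts_a.keys() | counts_b.keys()) / (n * n)
    return (observed - expected) / (1 - expected)

# Illustrative provisional diagnoses vs. histopathology (toy data)
histology = ["OKC", "OKC", "AM", "OKC", "AM", "OKC"]
clinician = ["OKC", "AM", "AM", "OKC", "OKC", "OKC"]
print(cohens_kappa(clinician, histology))  # 4/6 raw agreement, but kappa 0.25
```

Because both raters favour "OKC", much of the 67% raw agreement is expected by chance, so kappa lands far lower, mirroring the pattern in the study.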
Affiliation(s)
- Paul Kim
- Oral and Maxillofacial Surgery Registrar, Dunedin Hospital, Dunedin, New Zealand
- Benedict Seo
- Department of Oral Diagnostic and Surgical Sciences, University of Otago, Dunedin, New Zealand
- Harsha De Silva
- Department of Oral Diagnostic and Surgical Sciences, University of Otago, Dunedin, New Zealand
2
Rampf S, Gehrig H, Möltner A, Fischer MR, Schwendicke F, Huth KC. Radiographical diagnostic competences of dental students using various feedback methods and integrating an artificial intelligence application: a randomized clinical trial. Eur J Dent Educ 2024; 28:925-937. [PMID: 39082447 DOI: 10.1111/eje.13028]
Abstract
INTRODUCTION Radiographic diagnostic competences are a primary focus of dental education. This study assessed two feedback methods to enhance learning outcomes and explored the feasibility of artificial intelligence (AI) to support education. MATERIALS AND METHODS Fourth-year dental students had access to 16 virtual radiological example cases for 8 weeks. They were randomly assigned to either elaborated feedback (eF) or knowledge-of-results feedback (KOR) based on expert consensus. Students' diagnostic competences were tested on bitewing/periapical radiographs for detection of caries and apical periodontitis, accuracy for all radiological findings, and image quality. We additionally assessed the accuracy of an AI system (dentalXrai Pro 3.0), where applicable. Data were analysed descriptively and using ROC analysis (accuracy, sensitivity, specificity, AUC). Groups were compared with Welch's t-test. RESULTS Among 55 students, the eF group by and large performed significantly better than the KOR group in detecting enamel caries (accuracy 0.840 ± 0.041, p = .196; sensitivity 0.638 ± 0.204, p = .037; specificity 0.859 ± 0.050, p = .410; ROC AUC 0.748 ± 0.094, p = .020), in detecting apical periodontitis (accuracy 0.813 ± 0.095, p = .011; sensitivity 0.476 ± 0.230, p = .003; specificity 0.914 ± 0.108, p = .292; ROC AUC 0.695 ± 0.123, p = .001), and in assessing the image quality of periapical images (p = .031). No significant differences were observed for the other outcomes. The AI showed almost perfect diagnostic performance (enamel caries: accuracy 0.964, sensitivity 0.857, specificity 0.074; dentin caries: accuracy 0.988, sensitivity 0.941, specificity 1.0; overall: accuracy 0.976, sensitivity 0.958, specificity 0.983). CONCLUSION Elaborated feedback can improve students' radiographic diagnostic competences, particularly in detecting enamel caries and apical periodontitis. Using an AI may constitute an alternative to expert labelling of radiographs.
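The study above compares groups on accuracy, sensitivity, specificity, and ROC AUC. As a reminder of how those numbers relate, here is a minimal dependency-free sketch on toy scores (not study data); the AUC is computed via the Mann-Whitney formulation, the probability that a randomly chosen positive case outranks a randomly chosen negative one:

```python
def sensitivity_specificity(y_true, y_pred):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    return tp / (tp + fn), tn / (tn + fp)

def roc_auc(y_true, scores):
    """AUC as the probability that a random positive outscores a random
    negative (ties count half) -- equal to the area under the ROC curve."""
    pos = [s for t, s in zip(y_true, scores) if t == 1]
    neg = [s for t, s in zip(y_true, scores) if t == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: 1 = caries present on the radiograph, 0 = absent
truth = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.6, 0.3, 0.2]
auc = roc_auc(truth, scores)  # 8/9, since one negative outranks one positive
```

Note that sensitivity and specificity depend on a chosen decision threshold, whereas AUC summarizes ranking quality across all thresholds, which is why the study reports both.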
Affiliation(s)
- Sarah Rampf
- Department of Conservative Dentistry, University Hospital Heidelberg, Heidelberg University, Heidelberg, Germany
- Holger Gehrig
- Department of Conservative Dentistry, University Hospital Heidelberg, Heidelberg University, Heidelberg, Germany
- Andreas Möltner
- Dean's Office of the Medical Faculty, Heidelberg University, Heidelberg, Germany
- Martin R Fischer
- Institute of Medical Education, LMU University Hospital, LMU Munich, Munich, Germany
- Falk Schwendicke
- Department of Conservative Dentistry and Periodontology, University Hospital, LMU Munich, Munich, Germany
- Karin C Huth
- Department of Conservative Dentistry and Periodontology, University Hospital, LMU Munich, Munich, Germany
3
Chen Y, Du P, Zhang Y, Guo X, Song Y, Wang J, Yang LL, He W. Image-based multi-omics analysis for oral science: recent progress and perspectives. J Dent 2024:105425. [PMID: 39427959 DOI: 10.1016/j.jdent.2024.105425]
Abstract
OBJECTIVES The diagnosis and treatment of oral and dental diseases rely heavily on various types of medical imaging. Deep learning-mediated multi-omics analysis can extract more representative features than those identified through traditional diagnostic methods. This review aims to discuss the applications and recent advances in image-based multi-omics analysis in oral science and to highlight its potential to enhance traditional diagnostic approaches for oral diseases. STUDY SELECTION, DATA, AND SOURCES A systematic search was conducted in the PubMed, Web of Science, and Google Scholar databases, covering all available records. This search thoroughly examined and summarized advances in image-based multi-omics analysis in oral and maxillofacial medicine. CONCLUSIONS This review comprehensively summarizes recent advancements in image-based multi-omics analysis for oral science, including radiomics, pathomics, and photographic-based omics analysis. It also discusses the ongoing challenges and future perspectives that could provide new insights into exploiting the potential of image-based omics analysis in the field of oral science. CLINICAL SIGNIFICANCE This review article presents the state of image-based multi-omics analysis in stomatology, aiming to help oral clinicians recognize the utility of combining omics analyses with imaging during diagnosis and treatment, which can improve diagnostic accuracy, shorten times to diagnosis, save medical resources, and reduce disparity in professional knowledge among clinicians.
Affiliation(s)
- Yizhuo Chen
- Department of Stomatology, The First Affiliated Hospital of Zhengzhou University, Zhengzhou 450052, China
- Pengxi Du
- Department of Stomatology, The First Affiliated Hospital of Zhengzhou University, Zhengzhou 450052, China
- Yinyin Zhang
- Department of Stomatology, The First Affiliated Hospital of Zhengzhou University, Zhengzhou 450052, China
- Xin Guo
- Department of Stomatology, The First Affiliated Hospital of Zhengzhou University, Zhengzhou 450052, China
- Yujing Song
- Department of Stomatology, The First Affiliated Hospital of Zhengzhou University, Zhengzhou 450052, China
- Jianhua Wang
- Department of Stomatology, The First Affiliated Hospital of Zhengzhou University, Zhengzhou 450052, China
- Lei-Lei Yang
- Department of Stomatology, The First Affiliated Hospital of Zhengzhou University, Zhengzhou 450052, China
- Wei He
- Department of Stomatology, The First Affiliated Hospital of Zhengzhou University, Zhengzhou 450052, China
4
Liu W, Li X, Liu C, Gao G, Xiong Y, Zhu T, Zeng W, Guo J, Tang W. Automatic classification and segmentation of multiclass jaw lesions in cone-beam CT using deep learning. Dentomaxillofac Radiol 2024; 53:439-446. [PMID: 38937280 DOI: 10.1093/dmfr/twae028]
Abstract
OBJECTIVES To develop and validate a modified deep learning (DL) model based on nnU-Net for classifying and segmenting five-class jaw lesions using cone-beam CT (CBCT). METHODS A total of 368 CBCT scans (37,168 slices) were used to train a multi-class segmentation model. The data underwent manual annotation by two oral and maxillofacial surgeons (OMSs) to serve as ground truth. Sensitivity, specificity, precision, F1-score, and accuracy were used to evaluate the classification ability of the model and of doctors with or without artificial intelligence assistance. The Dice similarity coefficient (DSC), average symmetric surface distance (ASSD), and segmentation time were used to evaluate the segmentation performance of the model. RESULTS The model achieved the dual task of classifying and segmenting jaw lesions in CBCT. For classification, the sensitivity, specificity, precision, and accuracy of the model were 0.871, 0.974, 0.874, and 0.891, respectively, surpassing oral and maxillofacial radiologists (OMFRs) and OMSs and approaching that of the specialist. With the model's assistance, the classification performance of OMFRs and OMSs improved, particularly for odontogenic keratocyst (OKC) and ameloblastoma (AM), with F1-score improvements ranging from 6.2% to 12.7%. For segmentation, the DSC was 87.2% and the ASSD was 1.359 mm. The model's average segmentation time was 40 ± 9.9 s, compared with 25 ± 7.2 min for OMSs. CONCLUSIONS The proposed DL model accurately and efficiently classified and segmented five classes of jaw lesions using CBCT. In addition, it could assist doctors in improving classification accuracy and segmentation efficiency, particularly in distinguishing easily confused lesions (e.g., AM and OKC).
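The segmentation result above is reported as a Dice similarity coefficient (DSC): twice the overlap between predicted and ground-truth masks divided by their combined size. A minimal sketch on toy voxel sets (illustrative only; ASSD additionally requires surface-distance computations and is omitted here):

```python
def dice(mask_a, mask_b):
    """Dice similarity coefficient for two binary masks given as voxel sets."""
    overlap = len(mask_a & mask_b)
    return 2 * overlap / (len(mask_a) + len(mask_b))

# Toy ground-truth vs. predicted lesion voxels
ground_truth = {(0, 0), (0, 1), (1, 0), (1, 1)}
predicted = {(0, 1), (1, 0), (1, 1), (2, 1)}
print(dice(ground_truth, predicted))  # 2*3 / (4+4) = 0.75
```

A DSC of 87.2%, as reported, means the predicted lesion volume overlaps the annotated one far more tightly than this toy case.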
Affiliation(s)
- Wei Liu
- State Key Laboratory of Oral Diseases & National Center for Stomatology & National Clinical Research Center for Oral Diseases & Department of Oral and Maxillofacial Surgery, West China Hospital of Stomatology, Sichuan University, Chengdu 610041, China
- Xiang Li
- Machine Intelligence Laboratory, College of Computer Science, Sichuan University, Chengdu 610065, China
- Chang Liu
- State Key Laboratory of Oral Diseases & National Center for Stomatology & National Clinical Research Center for Oral Diseases & Department of Oral and Maxillofacial Surgery, West China Hospital of Stomatology, Sichuan University, Chengdu 610041, China
- Ge Gao
- State Key Laboratory of Oral Diseases & National Center for Stomatology & National Clinical Research Center for Oral Diseases & Department of Oral and Maxillofacial Surgery, West China Hospital of Stomatology, Sichuan University, Chengdu 610041, China
- Yutao Xiong
- State Key Laboratory of Oral Diseases & National Center for Stomatology & National Clinical Research Center for Oral Diseases & Department of Oral and Maxillofacial Surgery, West China Hospital of Stomatology, Sichuan University, Chengdu 610041, China
- Tao Zhu
- State Key Laboratory of Oral Diseases & National Center for Stomatology & National Clinical Research Center for Oral Diseases & Department of Oral and Maxillofacial Surgery, West China Hospital of Stomatology, Sichuan University, Chengdu 610041, China
- Wei Zeng
- State Key Laboratory of Oral Diseases & National Center for Stomatology & National Clinical Research Center for Oral Diseases & Department of Oral and Maxillofacial Surgery, West China Hospital of Stomatology, Sichuan University, Chengdu 610041, China
- Jixiang Guo
- Machine Intelligence Laboratory, College of Computer Science, Sichuan University, Chengdu 610065, China
- Wei Tang
- State Key Laboratory of Oral Diseases & National Center for Stomatology & National Clinical Research Center for Oral Diseases & Department of Oral and Maxillofacial Surgery, West China Hospital of Stomatology, Sichuan University, Chengdu 610041, China
5
Fedato Tobias RS, Teodoro AB, Evangelista K, Leite AF, Valladares-Neto J, de Freitas Silva BS, Yamamoto-Silva FP, Almeida FT, Silva MAG. Diagnostic capability of artificial intelligence tools for detecting and classifying odontogenic cysts and tumors: a systematic review and meta-analysis. Oral Surg Oral Med Oral Pathol Oral Radiol 2024; 138:414-426. [PMID: 38845306 DOI: 10.1016/j.oooo.2024.03.004]
Abstract
OBJECTIVE To evaluate the diagnostic capability of artificial intelligence (AI) for detecting and classifying odontogenic cysts and tumors, with special emphasis on odontogenic keratocyst (OKC) and ameloblastoma. STUDY DESIGN Nine electronic databases and the gray literature were examined. Human-based studies using AI algorithms to detect or classify odontogenic cysts and tumors on panoramic radiographs or cone beam computed tomography (CBCT) were included. Diagnostic tests were evaluated, and a meta-analysis was performed for classifying OKCs and ameloblastomas. Heterogeneity, risk of bias, and certainty of evidence were evaluated. RESULTS Twelve studies concluded that AI is a promising tool for the detection and/or classification of lesions, producing high diagnostic test values. Three articles assessed the sensitivity of convolutional neural networks in classifying similar lesions, specifically OKC and ameloblastoma, on panoramic radiographs; the accuracy was 0.893 (95% CI 0.832-0.954). AI applied to CBCT produced superior accuracy, based on only 4 studies. The results revealed heterogeneity in the models used, variations in imaging examinations, and discrepancies in the presentation of metrics. CONCLUSION AI tools exhibited a relatively high level of accuracy in detecting and classifying OKC and ameloblastoma. Panoramic radiography appears to be an accurate method for AI-based classification of these lesions, albeit with a low level of certainty. The accuracy of CBCT model data appears to be high and promising, although with limited available data.
Affiliation(s)
- Ana Beatriz Teodoro
- Graduate Program, School of Dentistry, Federal University of Goias, Goiânia, Goiás, Brazil
- Karine Evangelista
- Department of Orthodontics, School of Dentistry, Federal University of Goias, Goiânia, Goiás, Brazil
- André Ferreira Leite
- Oral and Maxillofacial Radiology, Department of Dentistry, Faculty of Health Sciences, Brasília-DF, Brazil
- José Valladares-Neto
- Department of Orthodontics, School of Dentistry, Federal University of Goias, Goiânia, Goiás, Brazil
- Fabiana T Almeida
- Oral and Maxillofacial Radiology, Faculty of Medicine and Dentistry, University of Alberta, Canada
6
Erturk M, Öziç MÜ, Tassoker M. Deep convolutional neural network for automated staging of periodontal bone loss severity on bite-wing radiographs: an Eigen-CAM explainability mapping approach. J Imaging Inform Med 2024. [PMID: 39147888 DOI: 10.1007/s10278-024-01218-3]
Abstract
Periodontal disease is a significant global oral health problem. Radiographic staging is critical in determining periodontitis severity and treatment requirements. This study aims to automatically stage periodontal bone loss on bite-wing images using a deep learning approach. A total of 1752 bite-wing images were used for the study. Radiological examinations were classified into 4 groups: healthy (normal), no bone loss; stage I (mild destruction), bone loss in the coronal third (< 15%); stage II (moderate destruction), bone loss in the coronal third from 15% to 33%; stage III-IV (severe destruction), bone loss extending from the middle third to the apical third with furcation destruction (> 33%). All images were resized to 512 × 400 pixels using bilinear interpolation. The data were divided into 80% training/validation and 20% testing. The classification module of the YOLOv8 deep learning model was used for the artificial intelligence-based classification of the images. The four-class model was trained with fivefold cross-validation after transfer learning and fine-tuning. After training, the 20% held-out test data, which the system had never seen, were analyzed using the artificial intelligence weights obtained in each cross-validation fold. Training and test results were reported as average accuracy, precision, recall, and F1-score. Test images were analyzed with Eigen-CAM explainability heat maps. In the classification of bite-wing images as healthy, mild destruction, moderate destruction, and severe destruction, training performance was 86.100% accuracy, 84.790% precision, 82.350% recall, and 84.411% F1-score, and test performance was 83.446% accuracy, 81.742% precision, 80.883% recall, and 81.090% F1-score. The deep learning model gave successful results in staging periodontal bone loss in bite-wing images. Classification scores were relatively high for normal (no bone loss) and severe bone loss, as these are more clearly visible in bite-wing images than mild and moderate damage.
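The averaged precision, recall, and F1 figures reported above for a four-class problem are typically macro-averages: each stage's metric is computed one-vs-rest, then the per-class values are averaged. A minimal sketch (the stage labels below are toy data, not the study's):

```python
def macro_metrics(y_true, y_pred, labels):
    """One-vs-rest precision/recall/F1 per class, averaged over classes."""
    precs, recs, f1s = [], [], []
    for c in labels:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        precs.append(prec)
        recs.append(rec)
        f1s.append(f1)
    n = len(labels)
    return sum(precs) / n, sum(recs) / n, sum(f1s) / n

stages = ["healthy", "I", "II", "III-IV"]
truth = ["healthy", "I", "I", "II", "III-IV", "healthy"]
preds = ["healthy", "I", "II", "II", "III-IV", "healthy"]
p, r, f = macro_metrics(truth, preds, stages)  # one stage-I case mislabeled as II
```

Macro-averaging weights every stage equally, so a model that does well on the easy "healthy" and "severe" classes but poorly on the mild/moderate ones is penalized, which matches the paper's observation about which stages are hardest.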
Affiliation(s)
- Mediha Erturk
- Department of Oral and Maxillofacial Radiology, Faculty of Dentistry, Necmettin Erbakan University, Konya, Turkey
- Muhammet Üsame Öziç
- Department of Biomedical Engineering, Faculty of Technology, Pamukkale University, Denizli, Turkey
- Melek Tassoker
- Department of Oral and Maxillofacial Radiology, Faculty of Dentistry, Necmettin Erbakan University, Konya, Turkey
7
Zirek T, Öziç MÜ, Tassoker M. AI-driven localization of all impacted teeth and prediction of Winter angulation for third molars on panoramic radiographs: clinical user interface design. Comput Biol Med 2024; 178:108755. [PMID: 38897151 DOI: 10.1016/j.compbiomed.2024.108755]
Abstract
PURPOSE Impacted teeth are teeth that remain trapped beneath the gums or within the jawbone and fail to reach their normal position even though their eruption time has passed. This study aims to detect all impacted teeth and to classify impacted third molars according to the Winter method with an artificial intelligence model on panoramic radiographs. METHODS In this study, 1197 panoramic radiographs from the dentistry faculty database were collected for all impacted teeth, and 1000 panoramic radiographs were collected for Winter classification. Several pre-processing steps were performed, and the datasets were doubled with data augmentation. Both datasets were randomly divided into 80% training, 10% validation, and 10% testing. After transfer learning and fine-tuning, the two datasets were trained with the YOLOv8 deep learning algorithm, a high-performance artificial intelligence model, and detection of impacted teeth was carried out. The results were evaluated with precision, recall, mAP, and F1-score performance metrics. A graphical user interface was designed for clinical use with the artificial intelligence weights obtained from training. RESULTS For the detection of impacted third molars according to the Winter classification, the average precision, average recall, and average F1 score were 0.972, 0.967, and 0.969, respectively. For the detection of all impacted teeth, the average precision, average recall, and average F1 score were 0.991, 0.995, and 0.993, respectively. CONCLUSION According to the results, the artificial intelligence-based YOLOv8 deep learning model successfully detected all impacted teeth and impacted third molars according to the Winter classification system.
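The mAP metric reported above averages per-class average precision (AP) over the detection classes. In its simplest uninterpolated form, AP walks down the detections in confidence order and accumulates the running precision at each true positive; a sketch (the detection flags below are invented for illustration; YOLO-style evaluations normally also interpolate the precision-recall curve and sweep IoU thresholds):

```python
def average_precision(is_true_positive, n_ground_truth):
    """Uninterpolated AP: detections are ordered by descending confidence;
    each true positive contributes its running precision, weighted by the
    recall step 1/n_ground_truth."""
    tp = fp = 0
    ap = 0.0
    for hit in is_true_positive:
        if hit:
            tp += 1
            ap += (tp / (tp + fp)) / n_ground_truth
        else:
            fp += 1
    return ap

# Toy run: 4 ground-truth impacted molars, 4 detections, one false positive
print(average_precision([True, True, False, True], n_ground_truth=4))  # 0.6875
```

Because each false positive drags down the precision of every later true positive, AP rewards detectors that rank correct boxes above spurious ones, not just ones with good overall counts.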
Affiliation(s)
- Taha Zirek
- Necmettin Erbakan University, Faculty of Dentistry, Department of Oral and Maxillofacial Radiology, Konya, Turkey
- Muhammet Üsame Öziç
- Pamukkale University, Faculty of Technology, Department of Biomedical Engineering, Denizli, Turkey
- Melek Tassoker
- Necmettin Erbakan University, Faculty of Dentistry, Department of Oral and Maxillofacial Radiology, Konya, Turkey
8
Giraldo-Roldán D, Araújo ALD, Moraes MC, da Silva VM, Ribeiro ECC, Cerqueira M, Saldivia-Siracusa C, Sousa-Neto SS, Pérez-de-Oliveira ME, Lopes MA, Kowalski LP, de Carvalho ACPDLF, Santos-Silva AR, Vargas PA. Artificial intelligence and radiomics in the diagnosis of intraosseous lesions of the gnathic bones: a systematic review. J Oral Pathol Med 2024; 53:415-433. [PMID: 38807455 DOI: 10.1111/jop.13548]
Abstract
BACKGROUND The purpose of this systematic review (SR) is to gather evidence on the use of machine learning (ML) models in the diagnosis of intraosseous lesions in gnathic bones and to analyze the reliability, impact, and usefulness of such models. This SR was performed in accordance with the PRISMA 2022 guidelines and was registered in the PROSPERO database (CRD42022379298). METHODS The acronym PICOS was used to structure the inquiry-focused review question "Is Artificial Intelligence reliable for the diagnosis of intraosseous lesions in gnathic bones?" The literature search was conducted in various electronic databases, including PubMed, Embase, Scopus, Cochrane Library, Web of Science, Lilacs, IEEE Xplore, and Gray Literature (Google Scholar and ProQuest). Risk of bias assessment was performed using PROBAST, and the results were synthesized by considering the task and sampling strategy of the dataset. RESULTS Twenty-six studies were included (21 146 radiographic images). Ameloblastomas, odontogenic keratocysts, dentigerous cysts, and periapical cysts were the most frequently investigated lesions. According to TRIPOD, most studies were classified as type 2 (randomly divided). The F1 score was presented in only 13 studies, which provided the metrics for 20 trials, with a mean of 0.71 (±0.25). CONCLUSION There is no conclusive evidence to support the usefulness of ML-based models in the detection, segmentation, and classification of intraosseous lesions in gnathic bones for routine clinical application. The lack of detail about data sampling, the lack of a comprehensive set of metrics for training and validation, and the absence of external testing limit experiments and hinder proper evaluation of model performance.
Affiliation(s)
- Daniela Giraldo-Roldán
- Faculdade de Odontologia de Piracicaba, Universidade Estadual de Campinas (FOP-UNICAMP), Piracicaba, Brazil
- Matheus Cardoso Moraes
- Department of Science and Technology, Institute of Science and Technology, Federal University of São Paulo (ICT-Unifesp), São José dos Campos, Brazil
- Viviane Mariano da Silva
- Department of Science and Technology, Institute of Science and Technology, Federal University of São Paulo (ICT-Unifesp), São José dos Campos, Brazil
- Erin Crespo Cordeiro Ribeiro
- Department of Science and Technology, Institute of Science and Technology, Federal University of São Paulo (ICT-Unifesp), São José dos Campos, Brazil
- Matheus Cerqueira
- Department of Computer Science, Institute of Mathematics and Computer Science (ICMC - USP), University of São Paulo, São Carlos, Brazil
- Cristina Saldivia-Siracusa
- Faculdade de Odontologia de Piracicaba, Universidade Estadual de Campinas (FOP-UNICAMP), Piracicaba, Brazil
- Marcio Ajudarte Lopes
- Faculdade de Odontologia de Piracicaba, Universidade Estadual de Campinas (FOP-UNICAMP), Piracicaba, Brazil
- Luiz Paulo Kowalski
- Head and Neck Surgery Department, University of São Paulo Medical School (FMUSP), São Paulo, Brazil
- Department of Head and Neck Surgery and Otorhinolaryngology, A.C. Camargo Cancer Center, São Paulo, Brazil
- Alan Roger Santos-Silva
- Faculdade de Odontologia de Piracicaba, Universidade Estadual de Campinas (FOP-UNICAMP), Piracicaba, Brazil
- Pablo Agustin Vargas
- Faculdade de Odontologia de Piracicaba, Universidade Estadual de Campinas (FOP-UNICAMP), Piracicaba, Brazil
9
Çelebi A, Imak A, Üzen H, Budak Ü, Türkoğlu M, Hanbay D, Şengür A. Maxillary sinus detection on cone beam computed tomography images using ResNet and Swin Transformer-based UNet. Oral Surg Oral Med Oral Pathol Oral Radiol 2024; 138:149-161. [PMID: 37633787 DOI: 10.1016/j.oooo.2023.06.001]
Abstract
OBJECTIVES This study, which uses artificial intelligence-based methods, aimed to determine the boundaries of pathologic conditions and infections related to the maxillary sinus in cone beam computed tomography (CBCT) images to facilitate the work of dentists. METHODS A new UNet architecture based on a state-of-the-art Swin transformer, called Res-Swin-UNet, was developed to detect the sinus. The encoder part of the proposed network model consists of a pre-trained ResNet architecture, and the decoder part consists of Swin transformer blocks. Swin transformers achieve powerful global context properties with self-attention mechanisms. Because the output of the Swin transformer generates sectorized features, a patch-expanding layer was used in this section instead of the traditional upsampling layer. In the last layer of the decoder, sinus detection was conducted through classical convolution and a sigmoid function. In the experiments, we used a dataset of 298 CBCT images. RESULTS The Res-Swin-UNet model achieved a 91.72% F1-score, 99% accuracy, and 84.71% IoU, outperforming state-of-the-art models. CONCLUSIONS The deep learning-based model proposed in the present study can assist dentists in automatically detecting the boundaries of pathologic conditions and infections within the maxillary sinus on CBCT images.
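The IoU (84.71%) and F1 (91.72%) figures above are related overlap measures: for binary segmentation masks, F1 equals the Dice coefficient, and Dice = 2·IoU/(1+IoU). A toy sketch showing both on illustrative pixel sets:

```python
def iou(mask_a, mask_b):
    """Intersection over union for two binary masks given as pixel sets."""
    return len(mask_a & mask_b) / len(mask_a | mask_b)

# Illustrative ground-truth vs. predicted sinus pixels
ground_truth = {(0, 0), (0, 1), (1, 0), (1, 1)}
predicted = {(0, 1), (1, 0), (1, 1), (2, 1)}
j = iou(ground_truth, predicted)  # 3/5 = 0.6
dice_equiv = 2 * j / (1 + j)      # 0.75, the matching Dice/F1 value
```

The identity explains why the paper's F1 is always the larger of the two numbers: Dice weights the overlap twice, so it exceeds IoU whenever the masks disagree at all.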
Affiliation(s)
- Adalet Çelebi
- Oral and Maxillofacial Surgery Department, Faculty of Dentistry, Mersin University, Mersin, Turkey
- Andaç Imak
- Department of Electrical and Electronic Engineering, Faculty of Engineering, Munzur University, Tunceli, Turkey
- Hüseyin Üzen
- Department of Computer Engineering, Faculty of Engineering, Bingol University, Bingol, Turkey
- Ümit Budak
- Department of Electrical and Electronics Engineering, Faculty of Engineering, Bitlis Eren University, Bitlis, Turkey
- Muammer Türkoğlu
- Department of Software Engineering, Faculty of Engineering, Samsun University, Samsun, Turkey
- Davut Hanbay
- Department of Computer Engineering, Faculty of Engineering, Inonu University, Malatya, Turkey
- Abdulkadir Şengür
- Department of Electrical and Electronic Engineering, Faculty of Technology, Firat University, Elazig, Turkey
10
Shrivastava PK, Hasan S, Abid L, Injety R, Shrivastav AK, Sybil D. Accuracy of machine learning in the diagnosis of odontogenic cysts and tumors: a systematic review and meta-analysis. Oral Radiol 2024; 40:342-356. [PMID: 38530559 DOI: 10.1007/s11282-024-00745-7]
Abstract
BACKGROUND The recent impact of artificial intelligence on diagnostic services has been enormous. Machine learning tools offer an innovative alternative for the radiographic diagnosis of cysts and tumors, which poses certain challenges due to near-similar presentations, anatomical variations, and superimposition. It is crucial that the performance of these models be evaluated for clinical applicability in diagnosing cysts and tumors. METHODS A comprehensive literature search was carried out in eminent databases for studies published between January 2015 and December 2022. Studies utilizing machine learning models in the diagnosis of odontogenic cysts or tumors using orthopantomograms (OPG) or cone beam computed tomographic (CBCT) images were included. The QUADAS-2 tool was used to assess risk of bias and applicability concerns. Meta-analysis was performed for studies reporting sufficient performance metrics, separately for OPG and CBCT. RESULTS 16 studies were included for qualitative synthesis, covering a total of 10,872 odontogenic cysts and tumors. The sensitivity and specificity of machine learning in diagnosing cysts and tumors through OPG were 0.83 (95% CI 0.81-0.85) and 0.82 (95% CI 0.81-0.83), respectively. Studies utilizing CBCT noted a sensitivity of 0.88 (95% CI 0.87-0.88) and specificity of 0.88 (95% CI 0.87-0.89). The highest classification accuracy was 100%, noted for a Support Vector Machine classifier. CONCLUSION The results of the present review favour machine learning models as a clinical adjunct in the radiographic diagnosis of odontogenic cysts and tumors, provided they undergo robust training with a large dataset. However, the arduous process, investment, and ethical concerns associated with total dependence on technology must be taken into account. Standardized reporting of outcomes for diagnostic studies utilizing machine learning methods is recommended to ensure homogeneity in assessment criteria, facilitate comparison between studies, and promote transparency in research findings.
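The pooled sensitivity and specificity values with 95% CIs above come from meta-analytic pooling of per-study proportions. A minimal fixed-effect sketch on the logit scale (the per-study proportions and sample sizes below are invented for illustration; real diagnostic meta-analyses typically use bivariate or random-effects models):

```python
import math

def pool_proportions(props, sizes):
    """Fixed-effect inverse-variance pooling of proportions on the logit
    scale; returns the pooled estimate and an approximate 95% CI."""
    logits, weights = [], []
    for p, n in zip(props, sizes):
        events = p * n
        var = 1 / events + 1 / (n - events)  # variance of logit(p)
        logits.append(math.log(p / (1 - p)))
        weights.append(1 / var)
    total = sum(weights)
    mean = sum(w * l for w, l in zip(weights, logits)) / total
    half = 1.96 * math.sqrt(1 / total)
    expit = lambda x: 1 / (1 + math.exp(-x))
    return expit(mean), (expit(mean - half), expit(mean + half))

# Invented per-study sensitivities and sample sizes
est, (lo, hi) = pool_proportions([0.80, 0.85, 0.90], [100, 200, 150])
```

Pooling on the logit scale keeps the estimate and its interval inside (0, 1), and larger studies (smaller variance) receive proportionally more weight.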
Collapse
Affiliation(s)
- Shamimul Hasan
- Department of Oral Medicine and Radiology, Faculty of Dentistry, Jamia Millia Islamia, New Delhi, India
- Laraib Abid
- Faculty of Dentistry, Jamia Millia Islamia, New Delhi, India
- Ranjit Injety
- Department of Neurology, Christian Medical College & Hospital, Ludhiana, Punjab, India
- Ayush Kumar Shrivastav
- Computer Science and Engineering, Centre for Development of Advanced Computing, Noida, Uttar Pradesh, India
- Deborah Sybil
- Department of Oral and Maxillofacial Surgery, Faculty of Dentistry, Jamia Millia Islamia, New Delhi, India.
11
Yilmaz S, Tasyurek M, Amuk M, Celik M, Canger EM. Developing deep learning methods for classification of teeth in dental panoramic radiography. Oral Surg Oral Med Oral Pathol Oral Radiol 2024; 138:118-127. [PMID: 37316425 DOI: 10.1016/j.oooo.2023.02.021] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/23/2022] [Revised: 09/13/2022] [Accepted: 02/10/2023] [Indexed: 06/16/2023]
Abstract
OBJECTIVES We aimed to develop an artificial intelligence-based clinical dental decision-support system using deep-learning methods to reduce diagnostic interpretation error and time and to increase the effectiveness of dental treatment and classification. STUDY DESIGN We compared the performance of 2 deep-learning methods, You Only Look Once V4 (YOLO-V4) and Faster Region-based Convolutional Neural Network (Faster R-CNN), for tooth classification in dental panoramic radiography to determine which is more successful in terms of accuracy, time, and detection ability. Using a method based on deep-learning models trained on a semantic segmentation task, we analyzed 1200 panoramic radiographs selected retrospectively. In the classification process, our model identified 36 classes, including 32 teeth and 4 impacted teeth. RESULTS The YOLO-V4 method achieved a mean 99.90% precision, 99.18% recall, and 99.54% F1 score. The Faster R-CNN method achieved a mean 93.67% precision, 90.79% recall, and 92.21% F1 score. Experimental evaluations showed that the YOLO-V4 method outperformed the Faster R-CNN method in terms of the accuracy of predicted teeth in the tooth classification process, the speed of tooth classification, and the ability to detect impacted and erupted third molars. CONCLUSIONS The YOLO-V4 method outperforms the Faster R-CNN method in terms of accuracy of tooth prediction, speed of detection, and ability to detect impacted and erupted third molars. The proposed deep learning-based methods can assist dentists in clinical decision making, save time, and reduce the negative effects of stress and fatigue in daily practice.
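The precision, recall, and F1 figures reported above follow the usual detection definitions. A minimal sketch with hypothetical per-class counts (the study's own confusion counts are not given in the abstract):

```python
def prf(tp, fp, fn):
    """Precision, recall, and F1 from detection counts for one tooth class."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Hypothetical counts for three of the 36 tooth classes (FDI notation)
counts = {"11": (118, 0, 1), "36": (120, 1, 0), "48_impacted": (40, 0, 2)}
per_class = {k: prf(*v) for k, v in counts.items()}
# Macro-average F1 over the classes, as detection studies commonly report
macro_f1 = sum(f for _, _, f in per_class.values()) / len(per_class)
```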
Affiliation(s)
- Serkan Yilmaz
- Faculty of Dentistry, Department of Oral and Maxillofacial Radiology, Erciyes University, Kayseri, Turkey
- Murat Tasyurek
- Department of Computer Engineering, Kayseri University, Kayseri, Turkey
- Mehmet Amuk
- Faculty of Dentistry, Department of Oral and Maxillofacial Radiology, Erciyes University, Kayseri, Turkey
- Mete Celik
- Department of Computer Engineering, Erciyes University, Kayseri, Turkey
- Emin Murat Canger
- Faculty of Dentistry, Department of Oral and Maxillofacial Radiology, Erciyes University, Kayseri, Turkey.
12
Song Y, Ma S, Mao B, Xu K, Liu Y, Ma J, Jia J. Application of machine learning in the preoperative radiomic diagnosis of ameloblastoma and odontogenic keratocyst based on cone-beam CT. Dentomaxillofac Radiol 2024; 53:316-324. [PMID: 38627247 PMCID: PMC11211686 DOI: 10.1093/dmfr/twae016] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/20/2023] [Revised: 01/03/2024] [Accepted: 04/11/2024] [Indexed: 06/29/2024] Open
Abstract
OBJECTIVES Preoperative diagnosis of oral ameloblastoma (AME) and odontogenic keratocyst (OKC) has been a challenge in dentistry. This study uses radiomics approaches and machine learning (ML) algorithms to characterize cone-beam CT (CBCT) image features for the preoperative differential diagnosis of AME and OKC and compares ML algorithms to expert radiologists to validate performance. METHODS We retrospectively collected the data of 326 patients with AME and OKC, where all diagnoses were confirmed by histopathologic tests. A total of 348 features were selected to train six ML models for differential diagnosis by a 5-fold cross-validation. We then compared the performance of ML-based diagnoses to those of radiologists. RESULTS Among the six ML models, XGBoost was effective in distinguishing AME and OKC in CBCT images, with its classification performance outperforming the other models. The mean precision, recall, accuracy, F1-score, and area under the curve (AUC) were 0.900, 0.807, 0.843, 0.841, and 0.872, respectively. Compared to the diagnostics by radiologists, ML-based radiomic diagnostics performed better. CONCLUSIONS Radiomic-based ML algorithms allow CBCT images of AME and OKC to be distinguished accurately, facilitating the preoperative differential diagnosis of AME and OKC. ADVANCES IN KNOWLEDGE ML and radiomic approaches with high-resolution CBCT images provide new insights into the differential diagnosis of AME and OKC.
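The study's 5-fold cross-validation can be illustrated with a deliberately simple stand-in classifier (a nearest-centroid rule on one toy radiomic feature, not the authors' XGBoost pipeline):

```python
import random

def five_fold_indices(n, seed=0):
    """Shuffle sample indices and split them into 5 folds."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    return [idx[i::5] for i in range(5)]

def nearest_centroid_cv(features, labels):
    """5-fold CV accuracy of a nearest-centroid classifier on a single
    radiomic feature (a minimal stand-in for the XGBoost pipeline)."""
    accs = []
    for fold in five_fold_indices(len(features)):
        held_out = set(fold)
        train = [i for i in range(len(features)) if i not in held_out]
        cent = {}
        for c in (0, 1):  # 0 = AME, 1 = OKC (hypothetical label coding)
            vals = [features[i] for i in train if labels[i] == c]
            cent[c] = sum(vals) / len(vals)
        hits = sum(
            min(cent, key=lambda k: abs(features[i] - cent[k])) == labels[i]
            for i in fold)
        accs.append(hits / len(fold))
    return sum(accs) / len(accs)

# Toy, well-separated feature values for the two lesion types
feats = [0.1, 0.2, 0.15, 0.18, 0.12, 0.9, 0.8, 0.85, 0.95, 0.88]
labs = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]
acc = nearest_centroid_cv(feats, labs)
```

The key property shown is that every sample is held out exactly once, so the averaged accuracy is not inflated by training-set memorization.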
Affiliation(s)
- Yang Song
- School of Medicine and Health Management, Huazhong University of Science & Technology, Hangkong Road, Wuhan, 430030, China
- Sirui Ma
- State Key Laboratory of Oral & Maxillofacial Reconstruction and Regeneration, Key Laboratory of Oral Biomedicine Ministry of Education, Hubei Key Laboratory of Stomatology, School & Hospital of Stomatology, Wuhan University, Luoyu Road, Wuhan, 430072, China
- Department of Oral and Maxillofacial-Head and Neck Oncology, School and Hospital of Stomatology, Wuhan University, Luoyu Road, Wuhan, 430072, China
- Bing Mao
- Zhengzhou University People's Hospital (Henan Provincial People's Hospital), Weiwu Road, Zhengzhou, 450003, China
- Kun Xu
- School of Medicine and Health Management, Huazhong University of Science & Technology, Hangkong Road, Wuhan, 430030, China
- Yuan Liu
- School of Medicine and Health Management, Huazhong University of Science & Technology, Hangkong Road, Wuhan, 430030, China
- Jingdong Ma
- School of Medicine and Health Management, Huazhong University of Science & Technology, Hangkong Road, Wuhan, 430030, China
- Jun Jia
- State Key Laboratory of Oral & Maxillofacial Reconstruction and Regeneration, Key Laboratory of Oral Biomedicine Ministry of Education, Hubei Key Laboratory of Stomatology, School & Hospital of Stomatology, Wuhan University, Luoyu Road, Wuhan, 430072, China
- Department of Oral and Maxillofacial-Head and Neck Oncology, School and Hospital of Stomatology, Wuhan University, Luoyu Road, Wuhan, 430072, China
13
Shi YJ, Li JP, Wang Y, Ma RH, Wang YL, Guo Y, Li G. Deep learning in the diagnosis for cystic lesions of the jaws: a review of recent progress. Dentomaxillofac Radiol 2024; 53:271-280. [PMID: 38814810 PMCID: PMC11211683 DOI: 10.1093/dmfr/twae022] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/13/2023] [Revised: 05/06/2024] [Accepted: 05/09/2024] [Indexed: 06/01/2024] Open
Abstract
Cystic lesions of the gnathic bones present challenges in differential diagnosis. In recent years, artificial intelligence (AI), represented by deep learning (DL), has developed rapidly and emerged in the field of dental and maxillofacial radiology (DMFR). Dental radiography provides a rich resource for the study of diagnostic analysis methods for cystic lesions of the jaws and has attracted many researchers. The aim of the current study was to investigate the diagnostic performance of DL for cystic lesions of the jaws. Online searches were done on Google Scholar, PubMed, and IEEE Xplore databases, up to September 2023, with subsequent manual screening for confirmation. The initial search yielded 1862 titles, and 44 studies were ultimately included. All studies used DL methods or tools for the identification of a variable number of maxillofacial cysts. The performance of algorithms with different models varies. Although most of the reviewed studies demonstrated that DL methods have better discriminative performance than clinicians, further development is still needed before routine clinical implementation due to several challenges and limitations, such as the lack of model interpretability and of multicentre data validation. Considering the current limitations and challenges, future studies for the differential diagnosis of cystic lesions of the jaws should follow actual clinical diagnostic scenarios to coordinate study design and enhance the impact of AI in the diagnosis of oral and maxillofacial diseases.
Affiliation(s)
- Yu-Jie Shi
- School of Electronics and Information Engineering, Beijing Jiaotong University, Beijing, 100044, China
- Ju-Peng Li
- School of Electronics and Information Engineering, Beijing Jiaotong University, Beijing, 100044, China
- Yue Wang
- School of Electronics and Information Engineering, Beijing Jiaotong University, Beijing, 100044, China
- Ruo-Han Ma
- Department of Oral and Maxillofacial Radiology, Peking University School and Hospital of Stomatology, Beijing, 100081, China
- Yan-Lin Wang
- Department of Oral and Maxillofacial Radiology, Peking University School and Hospital of Stomatology, Beijing, 100081, China
- Yong Guo
- School of Electronics and Information Engineering, Beijing Jiaotong University, Beijing, 100044, China
- Gang Li
- Department of Oral and Maxillofacial Radiology, Peking University School and Hospital of Stomatology, Beijing, 100081, China
14
Karakuş R, Öziç MÜ, Tassoker M. AI-Assisted Detection of Interproximal, Occlusal, and Secondary Caries on Bite-Wing Radiographs: A Single-Shot Deep Learning Approach. J Imaging Inform Med 2024:10.1007/s10278-024-01113-x. [PMID: 38743125 DOI: 10.1007/s10278-024-01113-x] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 02/23/2024] [Revised: 03/28/2024] [Accepted: 04/01/2024] [Indexed: 05/16/2024]
Abstract
Tooth decay is a common oral disease worldwide, but diagnostic errors are often made in dental clinics, which can delay treatment. This study aims to use artificial intelligence (AI) for the automated detection and localization of secondary, occlusal, and interproximal (D1, D2, D3) caries on bite-wing radiographs. Eight hundred sixty bite-wing radiographs were collected from the School of Dentistry database. Pre-processing and data augmentation operations were performed. Interproximal (D1, D2, D3), secondary, and occlusal caries on bite-wing radiographs were annotated by two oral radiologists. The data were split into 80% for training, 10% for validation, and 10% for testing. The AI-based training process was conducted using the YOLOv8 algorithm. A clinical decision support system interface was designed using the Python PyQt5 library, allowing dental caries detection to be used without complex programming procedures. In the test images, the average precision, average sensitivity, and average F1 score for secondary, occlusal, and interproximal caries were 0.977, 0.932, and 0.954, respectively. The AI-based dental caries detection system yielded highly successful results in testing, receiving full approval from dentists for clinical use. YOLOv8 has the potential to increase sensitivity and reliability while reducing the burden on dentists and can prevent diagnostic errors in dental clinics.
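The 80/10/10 split described above is straightforward to reproduce; the sketch below assumes a simple random shuffle (the study's exact split procedure is not stated, and the seed is arbitrary):

```python
import random

def split_80_10_10(items, seed=42):
    """Shuffle a dataset and split it into train/val/test = 80/10/10."""
    items = list(items)
    random.Random(seed).shuffle(items)
    n = len(items)
    n_train, n_val = int(0.8 * n), int(0.1 * n)
    return (items[:n_train],
            items[n_train:n_train + n_val],
            items[n_train + n_val:])

# 860 radiograph IDs, as in the bite-wing study
train, val, test = split_80_10_10(range(860))
```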
Affiliation(s)
- Rabia Karakuş
- Faculty of Dentistry, Department of Oral and Maxillofacial Radiology, Necmettin Erbakan University, Konya, Turkey
- Muhammet Üsame Öziç
- Faculty of Technology, Department of Biomedical Engineering, Pamukkale University, Denizli, Turkey.
- Melek Tassoker
- Faculty of Dentistry, Department of Oral and Maxillofacial Radiology, Necmettin Erbakan University, Konya, Turkey
15
Pringle AJ, Kumaran V, Missier MS, Nadar ASP. Perceptiveness and Attitude on the use of Artificial Intelligence (AI) in Dentistry among Dentists and Non-Dentists - A Regional Survey. J Pharm Bioallied Sci 2024; 16:S1481-S1486. [PMID: 38882768 PMCID: PMC11174187 DOI: 10.4103/jpbs.jpbs_1019_23] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/10/2023] [Revised: 10/14/2023] [Accepted: 10/22/2023] [Indexed: 06/18/2024] Open
Abstract
Artificial intelligence (AI) is an emerging tool in modern medicine and the digital world. AI can help dentists diagnose oral diseases, design treatment plans, monitor patient progress, and automate administrative tasks. The aim of this study is to evaluate the perception of and attitude toward the use of artificial intelligence in dentistry for diagnosis and treatment planning among the dentist and non-dentist population of the south Tamil Nadu region in India. Materials and Methods A cross-sectional online survey was conducted using a 20-item close-ended questionnaire in Google Forms, circulated among dentists and non-dentists of the south Tamil Nadu region in India. The data collected from 264 participants (dentists - 158, non-dentists - 106) within a limited time frame were subjected to descriptive statistical analysis. Results 70.9% of dentists were aware of artificial intelligence in dentistry. 40.5% of participants were not aware of AI in caries detection but were aware of its use in the interpretation of radiographs (43.9%) and in the planning of orthognathic surgery (42.4%), which are statistically significant (P < 0.05). 44.7% considered the clinical experience of a human doctor better than AI diagnosis. 54.4% of dentists agreed to support AI use in dentistry. Conclusion The study concluded that knowledge of AI use in dentistry is greater among dentists, that the perception of AI in dentistry is more optimistic among dentists than non-dentists, and that the majority of participants support AI in dentistry as an adjunct tool for diagnosis and treatment planning.
Affiliation(s)
- A Jebilla Pringle
- Department of Orthodontics, Rajas Dental College and Hospitals, Kavalkinaru, Tamil Nadu, India
- V Kumaran
- Department of Orthodontics, J.K.K. Nataraja Dental College and Hospitals, Nammakal, Tamil Nadu, India
- Mary Sheloni Missier
- Department of Orthodontics, Rajas Dental College and Hospitals, Kavalkinaru, Tamil Nadu, India
16
Lee WF, Day MY, Fang CY, Nataraj V, Wen SC, Chang WJ, Teng NC. Establishing a novel deep learning model for detecting peri-implantitis. J Dent Sci 2024; 19:1165-1173. [PMID: 38618118 PMCID: PMC11010782 DOI: 10.1016/j.jds.2023.11.017] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/25/2023] [Revised: 11/21/2023] [Accepted: 11/23/2023] [Indexed: 04/16/2024] Open
Abstract
BACKGROUND/PURPOSE The diagnosis of peri-implantitis using periapical radiographs is crucial. Recently, artificial intelligence may apply in radiographic image analysis effectively. The aim of this study was to differentiate the degree of marginal bone loss of an implant, and also to classify the severity of peri-implantitis using a deep learning model. MATERIALS AND METHODS A dataset of 800 periapical radiographic images were divided into training (n = 600), validation (n = 100), and test (n = 100) datasets with implants used for deep learning. An object detection algorithm (YOLOv7) was used to identify peri-implantitis. The classification performance of this model was evaluated using metrics, including the specificity, precision, recall, and F1 score. RESULTS Considering the classification performance, the specificity was 100%, precision was 100%, recall was 94.44%, and F1 score was 97.10%. CONCLUSION Results of this study suggested that implants can be identified from periapical radiographic images using deep learning-based object detection. This identification system could help dentists and patients suffering from implant problems. However, more images of other implant systems are needed to increase the learning performance to apply this system in clinical practice.
Affiliation(s)
- Wei-Fang Lee
- School of Dentistry, Taipei Medical University, Taipei, Taiwan
- School of Dental Technology, Taipei Medical University, Taipei, Taiwan
- Min-Yuh Day
- Institute of Information Management, National Taipei University, New Taipei City, Taiwan
- Chih-Yuan Fang
- School of Dentistry, Taipei Medical University, Taipei, Taiwan
- Department of Oral and Maxillofacial Surgery, Wan Fang Hospital, Taipei Medical University, Taipei, Taiwan
- Vidhya Nataraj
- Institute of Information Management, National Taipei University, New Taipei City, Taiwan
- Shih-Cheng Wen
- School of Dentistry, Taipei Medical University, Taipei, Taiwan
- Private Practice, New Taipei City, Taiwan
- Wei-Jen Chang
- School of Dentistry, Taipei Medical University, Taipei, Taiwan
- Dental Department, Taipei Medical University, Shuang Ho Hospital, New Taipei City, Taiwan
- Nai-Chia Teng
- School of Dentistry, Taipei Medical University, Taipei, Taiwan
- Department of Dentistry, Taipei Medical University Hospital, Taipei, Taiwan
17
Delamare E, Fu X, Huang Z, Kim J. Panoramic imaging errors in machine learning model development: a systematic review. Dentomaxillofac Radiol 2024; 53:165-172. [PMID: 38273661 PMCID: PMC11003661 DOI: 10.1093/dmfr/twae002] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/07/2023] [Revised: 12/11/2023] [Accepted: 01/01/2024] [Indexed: 01/27/2024] Open
Abstract
OBJECTIVES To investigate the management of imaging errors from panoramic radiography (PAN) datasets used in the development of machine learning (ML) models. METHODS This systematic review followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses and used three databases. Keywords were selected from relevant literature. ELIGIBILITY CRITERIA PAN studies that used ML models and mentioned image quality concerns. RESULTS Out of 400 articles, 41 papers satisfied the inclusion criteria. All the studies used ML models, with 35 papers using deep learning (DL) models. PAN quality assessment was approached in three ways: acknowledgement and acceptance of imaging errors in the ML model, removal of low-quality radiographs from the dataset before building the model, and application of image enhancement methods prior to model development. The criteria for determining PAN image quality varied widely across studies and were prone to bias. CONCLUSIONS This study revealed significant inconsistencies in the management of PAN imaging errors in ML research. However, most studies agree that such errors are detrimental when building ML models. More research is needed to understand the impact of low-quality inputs on model performance. Prospective studies may streamline image quality assessment by leveraging DL models, which excel at pattern recognition tasks.
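The second quality-management strategy the review identifies, removing low-quality radiographs before model building, amounts to a simple threshold filter; the 1-5 quality scale and record layout below are hypothetical:

```python
def filter_low_quality(records, min_score=3):
    """Drop radiographs below a quality threshold before model training.
    Each record is (image_id, quality_score) on a hypothetical 1-5 scale."""
    kept = [r for r in records if r[1] >= min_score]
    dropped = len(records) - len(kept)
    return kept, dropped

records = [("pan_001", 5), ("pan_002", 2), ("pan_003", 4), ("pan_004", 1)]
kept, dropped = filter_low_quality(records)
```

In practice the score itself would come from a rater protocol or, as the review suggests, from a DL quality classifier; the filter step is the same either way.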
Affiliation(s)
- Eduardo Delamare
- Sydney Dental School, Faculty of Medicine and Health, The University of Sydney, Camperdown, NSW, 2050, Australia
- Digital Health and Data Science, Faculty of Medicine and Health, The University of Sydney, Camperdown, NSW, 2050, Australia
- Xingyue Fu
- School of Computer Science, Faculty of Engineering, The University of Sydney, Camperdown, NSW, 2050, Australia
- Zimo Huang
- School of Computer Science, Faculty of Engineering, The University of Sydney, Camperdown, NSW, 2050, Australia
- Jinman Kim
- School of Computer Science, Faculty of Engineering, The University of Sydney, Camperdown, NSW, 2050, Australia
18
Lee T, Shin W, Lee JH, Lee S, Yeom HG, Yun JP. Resolving the non-uniformity in the feature space of age estimation: A deep learning model based on feature clusters of panoramic images. Comput Med Imaging Graph 2024; 112:102329. [PMID: 38271869 DOI: 10.1016/j.compmedimag.2024.102329] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/13/2023] [Revised: 11/04/2023] [Accepted: 12/13/2023] [Indexed: 01/27/2024]
Abstract
Age estimation is important in forensics, and numerous techniques have been investigated to estimate age based on various parts of the body. Among them, dental tissue is considered reliable for estimating age as it is less influenced by external factors. The advancement in deep learning has led to the development of automatic estimation of age using dental panoramic images. Typically, most of the medical datasets used for model learning are non-uniform in the feature space. This causes the model to be highly influenced by dense feature areas, resulting in adequate estimations; however, relatively poor estimations are observed in other areas. An effective solution to address this issue can be pre-dividing the data by age feature and training each regressor to estimate the age for individual features. In this study, we divide the data based on feature clusters obtained from unsupervised learning. The developed model comprises a classification head and multi-regression head, wherein the former predicts the cluster to which the data belong and the latter estimates the age within the predicted cluster. The visualization results show that the model can focus on a clinically meaningful area in each cluster for estimating age. The proposed model outperforms the models without feature clusters by focusing on the differences within the area. The performance improvement is particularly noticeable in the growth and aging periods. Furthermore, the model can adequately estimate the age even for samples with a high probability of classification error as they are located at the border of two feature clusters.
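The cluster-then-regress design above can be sketched end to end: an unsupervised step forms feature clusters, a classification step picks the cluster, and a per-cluster regressor estimates age. The 1-D feature and ages below are toy values, not the paper's data:

```python
def kmeans_1d(xs, k=2, iters=20):
    """Tiny 1-D k-means to form the feature clusters (unsupervised step)."""
    cents = [min(xs), max(xs)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for x in xs:
            groups[min(range(k), key=lambda j: abs(x - cents[j]))].append(x)
        cents = [sum(g) / len(g) for g in groups]
    return cents

def fit_line(xs, ys):
    """Least-squares line (slope, intercept) for one cluster's regressor."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

def predict(x, cents, heads):
    """Classification head picks the cluster; its regressor estimates age."""
    j = min(range(len(cents)), key=lambda j: abs(x - cents[j]))
    slope, intercept = heads[j]
    return slope * x + intercept

# Toy feature with a growth-period cluster and an adult cluster (hypothetical)
xs = [1.0, 1.2, 1.1, 5.0, 5.3, 5.1]
ys = [8.0, 10.0, 9.0, 40.0, 46.0, 42.0]
cents = kmeans_1d(xs)
heads = []
for c in cents:
    members = [(x, y) for x, y in zip(xs, ys)
               if min(cents, key=lambda z: abs(x - z)) == c]
    heads.append(fit_line([m[0] for m in members], [m[1] for m in members]))
age = predict(1.1, cents, heads)
```

Each regressor only ever sees its own cluster, which is the paper's remedy for dense feature regions dominating a single global model.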
Affiliation(s)
- Taehan Lee
- AI research center for Manufacturing Systems (AIMS), Korea Institute of Industrial Technology (KITECH), Daegu 42994, South Korea
- WooSang Shin
- AI research center for Manufacturing Systems (AIMS), Korea Institute of Industrial Technology (KITECH), Daegu 42994, South Korea; Electronic Engineering Department, Kyungpook National University, Daegu 41566, South Korea
- Jong-Hyeon Lee
- AI research center for Manufacturing Systems (AIMS), Korea Institute of Industrial Technology (KITECH), Daegu 42994, South Korea; Electronic Engineering Department, Kyungpook National University, Daegu 41566, South Korea
- Sangmoon Lee
- Electronic Engineering Department, Kyungpook National University, Daegu 41566, South Korea
- Han-Gyeol Yeom
- Department of Oral and Maxillofacial Radiology and Wonkwang Dental Research Institute, College of Dentistry, Wonkwang University, Iksan 54538, South Korea.
- Jong Pil Yun
- AI research center for Manufacturing Systems (AIMS), Korea Institute of Industrial Technology (KITECH), Daegu 42994, South Korea; University of Science and Technology, Daegu 42994, South Korea.
19
Reduwan NH, Abdul Aziz AA, Mohd Razi R, Abdullah ERMF, Mazloom Nezhad SM, Gohain M, Ibrahim N. Application of deep learning and feature selection technique on external root resorption identification on CBCT images. BMC Oral Health 2024; 24:252. [PMID: 38373931 PMCID: PMC10875886 DOI: 10.1186/s12903-024-03910-w] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/22/2023] [Accepted: 01/17/2024] [Indexed: 02/21/2024] Open
Abstract
BACKGROUND Artificial intelligence has been proven to improve the identification of various maxillofacial lesions. The aim of the current study is two-fold: to assess the performance of four deep learning models (DLM) in external root resorption (ERR) identification and to assess the effect of combining a feature selection technique (FST) with DLMs on their ability in ERR identification. METHODS External root resorption was simulated on 88 extracted premolar teeth using a tungsten bur at different depths (0.5 mm, 1 mm, and 2 mm). All teeth were scanned using cone-beam CT (Carestream Dental, Atlanta, GA). Afterward, training (70%), validation (10%), and test (20%) datasets were established. The performance of four DLMs (Random Forest (RF) + Visual Geometry Group 16 (VGG), RF + EfficientNetB4 (EFNET), Support Vector Machine (SVM) + VGG, and SVM + EFNET) and four hybrid models (DLM + FST: (i) FS + RF + VGG, (ii) FS + RF + EFNET, (iii) FS + SVM + VGG, and (iv) FS + SVM + EFNET) was compared. Five performance parameters were assessed: classification accuracy, F1-score, precision, specificity, and error rate. FST algorithms (Boruta and Recursive Feature Selection) were combined with the DLMs to assess their performance. RESULTS RF + VGG exhibited the highest performance in identifying ERR, followed by the other tested models. Similarly, FST combined with RF + VGG outperformed the other models, with classification accuracy, F1-score, precision, and specificity of 81.9%, weighted accuracy of 83%, and an area under the curve (AUC) of 96%. The Kruskal-Wallis test revealed a significant difference (p = 0.008) in prediction accuracy among the eight DLMs. CONCLUSION In general, all DLMs have similar performance in ERR identification. However, performance can be improved by combining FST with DLMs.
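Recursive Feature Selection, one of the two FST algorithms used, can be sketched with a toy separation score standing in for a trained model's accuracy (the scoring function, data, and `keep` parameter below are all illustrative):

```python
def separation(features, labels, cols):
    """Toy ranking score for a feature subset: distance between class
    means, summed over the chosen columns (stand-in for model accuracy)."""
    s = 0.0
    for c in cols:
        m0 = [f[c] for f, l in zip(features, labels) if l == 0]
        m1 = [f[c] for f, l in zip(features, labels) if l == 1]
        s += abs(sum(m1) / len(m1) - sum(m0) / len(m0))
    return s

def recursive_elimination(features, labels, keep=1):
    """Recursively drop the least informative column until `keep` remain."""
    cols = list(range(len(features[0])))
    while len(cols) > keep:
        worst = min(cols, key=lambda c: separation(features, labels, [c]))
        cols.remove(worst)
    return cols

# Column 0 separates deep vs. shallow resorption; column 1 is noise
X = [[0.9, 0.5], [0.8, 0.4], [0.1, 0.5], [0.2, 0.6]]
y = [1, 1, 0, 0]
selected = recursive_elimination(X, y)
```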
Affiliation(s)
- Nor Hidayah Reduwan
- Department of Oral and Maxillofacial Clinical Sciences, Faculty of Dentistry, Universiti Malaya, Kuala Lumpur, 50603, Malaysia
- Centre of Oral and Maxillofacial Diagnostic and Medicine Studies, Faculty of Dentistry, University Teknologi MARA, Sungai Buloh, 47000, Malaysia
- Azwatee Abdul Abdul Aziz
- Department of Restorative Dentistry, Faculty of Dentistry, Universiti Malaya, Kuala Lumpur, 50603, Malaysia
- Roziana Mohd Razi
- Department of Pediatric Dentistry and Orthodontic, Faculty of Dentistry, Universiti Malaya, Kuala Lumpur, 50603, Malaysia
- Erma Rahayu Mohd Faizal Abdullah
- Department of Artificial Intelligence, Faculty of Computer Science and Information Technology, Universiti Malaya, Kuala Lumpur, 50603, Malaysia.
- Seyed Matin Mazloom Nezhad
- Department of Artificial Intelligence, Faculty of Computer Science and Information Technology, Universiti Malaya, Kuala Lumpur, 50603, Malaysia
- Meghna Gohain
- Department of Oral and Maxillofacial Clinical Sciences, Faculty of Dentistry, Universiti Malaya, Kuala Lumpur, 50603, Malaysia
- Norliza Ibrahim
- Department of Oral and Maxillofacial Clinical Sciences, Faculty of Dentistry, Universiti Malaya, Kuala Lumpur, 50603, Malaysia.
20
Kang J, Le VNT, Lee DW, Kim S. Diagnosing oral and maxillofacial diseases using deep learning. Sci Rep 2024; 14:2497. [PMID: 38291068 PMCID: PMC10827796 DOI: 10.1038/s41598-024-52929-0] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/23/2023] [Accepted: 01/25/2024] [Indexed: 02/01/2024] Open
Abstract
The classification and localization of odontogenic lesions from panoramic radiographs is a challenging task due to the positional biases and class imbalances of the lesions. To address these challenges, a novel neural network, DOLNet, is proposed that uses mutually influencing hierarchical attention across different image scales to jointly learn the global representation of the entire jaw and the local discrepancy between normal tissue and lesions. The proposed approach uses local attention to learn representations within a patch. From the patch-level representations, we generate inter-patch, i.e., global, attention maps to represent the positional prior of lesions in the whole image. Global attention enables the reciprocal calibration of patch-level representations by considering non-local information from other patches, thereby improving the generation of whole-image-level representation. To address class imbalances, we propose an effective data augmentation technique that involves merging lesion crops with normal images, thereby synthesizing new abnormal cases for effective model training. Our approach outperforms recent studies, enhancing the classification performance by up to 42.4% and 44.2% in recall and F1 scores, respectively, and ensuring robust lesion localization with respect to lesion size variations and positional biases. Our approach further outperforms human expert clinicians in classification by 10.7% and 10.8% in recall and F1 score, respectively.
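The paper's augmentation, merging lesion crops with normal images, reduces at its core to pasting a patch into a host image; a minimal sketch on toy pixel arrays (a real pipeline would also blend edges and reposition the crop randomly):

```python
def paste_lesion(normal, lesion, top, left):
    """Merge a lesion crop into a normal radiograph (2-D lists of pixel
    values) to synthesize an abnormal training case."""
    out = [row[:] for row in normal]  # copy so the original stays untouched
    for i, row in enumerate(lesion):
        for j, px in enumerate(row):
            out[top + i][left + j] = px
    return out

normal = [[0] * 6 for _ in range(4)]   # toy 4x6 "normal" image
lesion = [[9, 9], [9, 9]]              # toy 2x2 lesion crop
aug = paste_lesion(normal, lesion, top=1, left=2)
```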
Affiliation(s)
- Van Nhat Thang Le
- Faculty of Odonto-Stomatology, Hue University of Medicine and Pharmacy, Hue University, Hue, 49120, Vietnam
- Dae-Woo Lee
- The Department of Pediatric Dentistry, Jeonbuk National University, Jeonju, 54896, Korea.
- Biomedical Research Institute of Jeonbuk National University Hospital, Jeonbuk National University, Jeonju, 54896, Korea.
- Research Institute of Clinical Medicine of Jeonbuk National University, Jeonju, 54896, Korea.
- Sungchan Kim
- The Department of Computer Science and Artificial Intelligence, Jeonbuk National University, Jeonju, 54896, Korea.
- Center for Advanced Image Information Technology, Jeonbuk National University, Jeonju, 54896, Korea.
21
Xu L, Qiu K, Li K, Ying G, Huang X, Zhu X. Automatic segmentation of ameloblastoma on ct images using deep learning with limited data. BMC Oral Health 2024; 24:55. [PMID: 38195496 PMCID: PMC10775495 DOI: 10.1186/s12903-023-03587-7] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/03/2023] [Accepted: 10/27/2023] [Indexed: 01/11/2024] Open
Abstract
BACKGROUND Ameloblastoma, a common benign tumor found in the jaw bone, necessitates accurate localization and segmentation for effective diagnosis and treatment. However, the traditional manual segmentation method is plagued with inefficiencies and drawbacks. Hence, the implementation of an AI-based automatic segmentation approach is crucial to enhance clinical diagnosis and treatment procedures. METHODS We collected CT images from 79 patients diagnosed with ameloblastoma and employed a deep learning neural network model for training and testing purposes. Specifically, we utilized the Mask R-CNN neural network structure and implemented image preprocessing and enhancement techniques. During the testing phase, cross-validation methods were employed for evaluation, and the experimental results were verified using an external validation set. Finally, we obtained an additional dataset comprising 200 CT images of ameloblastoma from a different dental center to evaluate the model's generalization performance. RESULTS During extensive testing and evaluation, our model successfully demonstrated the capability to automatically segment ameloblastoma. The DICE index achieved an impressive value of 0.874. Moreover, when the IoU threshold ranged from 0.5 to 0.95, the model's AP was 0.741. For a specific IoU threshold of 0.5, the model achieved an AP of 0.914, and for another IoU threshold of 0.75, the AP was 0.826. Our validation using external data confirms the model's strong generalization performance. CONCLUSION In this study, we successfully applied a neural network model based on deep learning that effectively performs automatic segmentation of ameloblastoma. The proposed method offers notable advantages in terms of efficiency, accuracy, and speed, rendering it a promising tool for clinical diagnosis and treatment.
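The DICE index and the IoU thresholds behind the AP figures above are both overlap measures on segmentation masks; a minimal sketch on toy binary masks represented as pixel-coordinate sets:

```python
def dice_iou(pred, truth):
    """Dice coefficient and IoU between two binary masks, each given as
    a set of (row, col) pixel coordinates."""
    inter = len(pred & truth)
    union = len(pred | truth)
    dice = 2 * inter / (len(pred) + len(truth))
    return dice, inter / union

pred = {(r, c) for r in range(4) for c in range(4)}         # 16-px square
truth = {(r, c) for r in range(1, 5) for c in range(1, 5)}  # same, shifted
d, i = dice_iou(pred, truth)
```

Note that Dice is always at least as large as IoU for the same pair of masks, which is why a Dice of 0.874 and AP at IoU 0.5-0.95 are not directly comparable numbers.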
Affiliation(s)
- Liang Xu: The First Affiliated Hospital of Fujian Medical University, Fuzhou, China; Department of Stomatology, National Regional Medical Center, Binhai Campus of the First Affiliated Hospital, Fujian Medical University, Fuzhou, China
- Kaixi Qiu: Fuzhou First General Hospital, Fuzhou, China
- Kaiwang Li: School of Aeronautics and Astronautics, Tsinghua University, Beijing, China
- Ge Ying: Jianning County General Hospital, Fuzhou, China
- Xiaohong Huang: The First Affiliated Hospital of Fujian Medical University, Fuzhou, China
- Xiaofeng Zhu: The First Affiliated Hospital of Fujian Medical University, Fuzhou, China; Department of Stomatology, National Regional Medical Center, Binhai Campus of the First Affiliated Hospital, Fujian Medical University, Fuzhou, China

22
Rašić M, Tropčić M, Karlović P, Gabrić D, Subašić M, Knežević P. Detection and Segmentation of Radiolucent Lesions in the Lower Jaw on Panoramic Radiographs Using Deep Neural Networks. MEDICINA (KAUNAS, LITHUANIA) 2023; 59:2138. [PMID: 38138241 PMCID: PMC10744511 DOI: 10.3390/medicina59122138] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/14/2023] [Revised: 11/29/2023] [Accepted: 12/07/2023] [Indexed: 12/24/2023]
Abstract
Background and Objectives: The purpose of this study was to develop and evaluate a deep learning model capable of autonomously detecting and segmenting radiolucent lesions in the lower jaw by utilizing You Only Look Once (YOLO) v8. Materials and Methods: This study involved the analysis of 226 lesions present in panoramic radiographs captured between 2013 and 2023 at the Clinical Hospital Dubrava and the School of Dental Medicine, University of Zagreb. Panoramic radiographs included radiolucent lesions such as radicular cysts, ameloblastomas, odontogenic keratocysts (OKC), dentigerous cysts and residual cysts. To enhance the database, we applied techniques such as translation, scaling, rotation, horizontal flipping and mosaic effects. We employed the deep neural network to tackle our detection and segmentation objectives. Also, to improve our model's generalization capabilities, we conducted five-fold cross-validation. The assessment of the model's performance was carried out through metrics such as Intersection over Union (IoU), precision, recall and mean average precision (mAP@50 and mAP@50-95). Results: In the detection task, the precision, recall, mAP@50 and mAP@50-95 scores without augmentation were 91.8%, 57.1%, 75.8% and 47.3%, respectively, while with augmentation they were 95.2%, 94.4%, 97.5% and 68.7%. Similarly, in the segmentation task, the precision, recall, mAP@50 and mAP@50-95 values achieved without augmentation were 76%, 75.5%, 75.1% and 48.3%, respectively. Augmentation techniques led to an improvement of these scores to 100%, 94.5%, 96.6% and 72.2%. Conclusions: Our study confirmed that the model developed using the advanced YOLOv8 has the remarkable capability to automatically detect and segment radiolucent lesions in the mandible. With its continual evolution and integration into various medical fields, the deep learning model holds the potential to revolutionize patient care.
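The mAP@50 and mAP@50-95 figures reported above both hinge on the Intersection over Union (IoU) between a predicted box and the ground truth: a detection counts as a hit only when IoU clears the threshold (0.5 for mAP@50; averaged over thresholds 0.5 to 0.95 for mAP@50-95). A minimal illustrative IoU computation, not taken from the study's code:

```python
def box_iou(a, b):
    """Intersection over Union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

# A prediction shifted half a box-width from the ground truth:
pred, truth = (5, 0, 15, 10), (0, 0, 10, 10)
iou = box_iou(pred, truth)
print(round(iou, 3))  # 0.333
print(iou >= 0.5)     # False: would not count as a hit for mAP@50
```

This is why mAP@50-95 is always the lower number: the same predictions must survive progressively stricter overlap thresholds.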
Affiliation(s)
- Mario Rašić: Clinic for Tumors, Clinical Hospital Center “Sisters of Mercy”, Ilica 197, 10000 Zagreb, Croatia
- Mario Tropčić: Faculty of Electrical Engineering and Computing, University of Zagreb, Unska Ulica 3, 10000 Zagreb, Croatia
- Pjetra Karlović: Department of Maxillofacial and Oral Surgery, Dubrava University Hospital, Avenija Gojka Šuška 6, 10000 Zagreb, Croatia
- Dragana Gabrić: Department of Oral Surgery, School of Dental Medicine, University of Zagreb, Gundulićeva 5, 10000 Zagreb, Croatia
- Marko Subašić: Faculty of Electrical Engineering and Computing, University of Zagreb, Unska Ulica 3, 10000 Zagreb, Croatia
- Predrag Knežević: Department of Maxillofacial and Oral Surgery, Dubrava University Hospital, Avenija Gojka Šuška 6, 10000 Zagreb, Croatia

23
Farajollahi M, Safarian MS, Hatami M, Esmaeil Nejad A, Peters OA. Applying artificial intelligence to detect and analyse oral and maxillofacial bone loss-A scoping review. AUST ENDOD J 2023; 49:720-734. [PMID: 37439465 DOI: 10.1111/aej.12775] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/19/2023] [Revised: 07/03/2023] [Accepted: 07/04/2023] [Indexed: 07/14/2023]
Abstract
Radiographic evaluation of bone changes is one of the main tools in the diagnosis of many oral and maxillofacial diseases. However, this approach has limitations, including inaccuracy, inconsistency, and comparatively low diagnostic efficiency. Recently, artificial intelligence (AI)-based algorithms such as deep learning networks have been introduced as a solution to overcome these challenges. Based on recent studies, AI can improve an expert clinician's detection accuracy for periapical pathology, periodontal diseases and their prognostication, as well as peri-implant bone loss. AI has also been successfully used to detect and diagnose oral and maxillofacial lesions with a high predictive value. This study aims to review the current evidence on artificial intelligence applications in the detection and analysis of bone loss in the oral and maxillofacial regions.
Affiliation(s)
- Mehran Farajollahi: Iranian Center for Endodontic Research, Research Institute of Dental Sciences, School of Dentistry, Shahid Beheshti University of Medical Sciences, Tehran, Iran
- Mohammad Sadegh Safarian: Iranian Center for Endodontic Research, Research Institute of Dental Sciences, School of Dentistry, Shahid Beheshti University of Medical Sciences, Tehran, Iran
- Masoud Hatami: Iranian Center for Endodontic Research, Research Institute of Dental Sciences, School of Dentistry, Shahid Beheshti University of Medical Sciences, Tehran, Iran
- Azadeh Esmaeil Nejad: Iranian Center for Endodontic Research, Research Institute of Dental Sciences, School of Dentistry, Shahid Beheshti University of Medical Sciences, Tehran, Iran
- Ove A Peters: School of Dentistry, The University of Queensland, Herston, Queensland, Australia

24
Ciconelle ACM, da Silva RLB, Kim JH, Rocha BA, Dos Santos DG, Vianna LGR, Gomes Ferreira LG, Pereira Dos Santos VH, Costa JO, Vicente R. Deep learning for sex determination: Analyzing over 200,000 panoramic radiographs. J Forensic Sci 2023; 68:2057-2064. [PMID: 37746788 DOI: 10.1111/1556-4029.15376] [Citation(s) in RCA: 1] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/23/2023] [Revised: 08/23/2023] [Accepted: 08/24/2023] [Indexed: 09/26/2023]
Abstract
The objective of this study is to assess the performance of an innovative AI-powered tool for sex determination using panoramic radiographs (PR) and to explore factors affecting the performance of the convolutional neural network (CNN). The study involved 207,946 panoramic dental X-rays and their corresponding reports from 15 clinical centers in São Paulo, Brazil. The PRs were acquired with four different devices, and 58% of the patients were female. Data preprocessing included anonymizing the exams, extracting pertinent information from the reports, such as sex, age, type of dentition, and number of missing teeth, and organizing the data into a PostgreSQL database. Two neural network architectures, a standard CNN and a ResNet, were utilized for sex classification, with both undergoing hyperparameter tuning and cross-validation to ensure optimal performance. The CNN model achieved 95.02% accuracy in sex estimation, with image resolution being a significant influencing factor. The ResNet model attained over 86% accuracy in subjects older than 6 years and over 96% in those over 16 years. The algorithm performed better on female images, and the area under the curve (AUC) exceeded 96% for most age groups, except the youngest. Accuracy values were also assessed for different dentition types (deciduous, mixed, and permanent) and missing teeth. This study demonstrates the effectiveness of an AI-driven tool for sex determination using PR and emphasizes the role of image resolution, age, and sex in determining the algorithm's performance.
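The AUC values reported above have a direct probabilistic reading: the chance that the model scores a randomly chosen image from the positive class higher than a randomly chosen image from the other class. A minimal illustrative computation via the Mann-Whitney formulation (not the study's code; the scores and labels below are invented):

```python
def roc_auc(scores, labels):
    """ROC AUC as the probability that a positive outscores a negative
    (ties count as half). Labels are 0/1. O(n^2), fine for a sketch."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0 for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

scores = [0.9, 0.7, 0.4, 0.2]   # hypothetical model outputs
labels = [1, 0, 1, 0]           # hypothetical ground-truth class labels
print(roc_auc(scores, labels))  # 0.75
```

On this reading, the study's AUC exceeding 96% in most age groups means the network ranks a positive-class radiograph above a negative-class one in at least 96% of random pairs.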
Affiliation(s)
- Renan Lucio Berbel da Silva: Department of Stomatology, School of Dentistry, University of São Paulo, São Paulo, Brazil; Department of Oral and Maxillofacial Radiology, School of Dentistry, Seoul National University, Seoul, Republic of Korea
- Jun Ho Kim: Department of Stomatology, School of Dentistry, University of São Paulo, São Paulo, Brazil; Department of Oral and Maxillofacial Radiology, School of Dentistry, Seoul National University, Seoul, Republic of Korea
- Renato Vicente: Institute of Mathematics and Statistics, University of São Paulo, São Paulo, Brazil

25
Zhong NN, Wang HQ, Huang XY, Li ZZ, Cao LM, Huo FY, Liu B, Bu LL. Enhancing head and neck tumor management with artificial intelligence: Integration and perspectives. Semin Cancer Biol 2023; 95:52-74. [PMID: 37473825 DOI: 10.1016/j.semcancer.2023.07.002] [Citation(s) in RCA: 9] [Impact Index Per Article: 9.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/29/2023] [Revised: 07/11/2023] [Accepted: 07/15/2023] [Indexed: 07/22/2023]
Abstract
Head and neck tumors (HNTs) constitute a multifaceted ensemble of pathologies that primarily involve regions such as the oral cavity, pharynx, and nasal cavity. The intricate anatomical structure of these regions poses considerable challenges to efficacious treatment strategies. Despite the availability of myriad treatment modalities, the overall therapeutic efficacy for HNTs continues to remain subdued. In recent years, the deployment of artificial intelligence (AI) in healthcare practices has garnered noteworthy attention. AI modalities, inclusive of machine learning (ML), neural networks (NNs), and deep learning (DL), when amalgamated into the holistic management of HNTs, promise to augment the precision, safety, and efficacy of treatment regimens. The integration of AI within HNT management is intricately intertwined with domains such as medical imaging, bioinformatics, and medical robotics. This article intends to scrutinize the cutting-edge advancements and prospective applications of AI in the realm of HNTs, elucidating AI's indispensable role in prevention, diagnosis, treatment, prognostication, research, and inter-sectoral integration. The overarching objective is to stimulate scholarly discourse and invigorate insights among medical practitioners and researchers to propel further exploration, thereby facilitating superior therapeutic alternatives for patients.
Affiliation(s)
- Nian-Nian Zhong: State Key Laboratory of Oral & Maxillofacial Reconstruction and Regeneration, Key Laboratory of Oral Biomedicine Ministry of Education, Hubei Key Laboratory of Stomatology, School & Hospital of Stomatology, Wuhan University, Wuhan 430079, China
- Han-Qi Wang: State Key Laboratory of Oral & Maxillofacial Reconstruction and Regeneration, Key Laboratory of Oral Biomedicine Ministry of Education, Hubei Key Laboratory of Stomatology, School & Hospital of Stomatology, Wuhan University, Wuhan 430079, China
- Xin-Yue Huang: State Key Laboratory of Oral & Maxillofacial Reconstruction and Regeneration, Key Laboratory of Oral Biomedicine Ministry of Education, Hubei Key Laboratory of Stomatology, School & Hospital of Stomatology, Wuhan University, Wuhan 430079, China
- Zi-Zhan Li: State Key Laboratory of Oral & Maxillofacial Reconstruction and Regeneration, Key Laboratory of Oral Biomedicine Ministry of Education, Hubei Key Laboratory of Stomatology, School & Hospital of Stomatology, Wuhan University, Wuhan 430079, China
- Lei-Ming Cao: State Key Laboratory of Oral & Maxillofacial Reconstruction and Regeneration, Key Laboratory of Oral Biomedicine Ministry of Education, Hubei Key Laboratory of Stomatology, School & Hospital of Stomatology, Wuhan University, Wuhan 430079, China
- Fang-Yi Huo: State Key Laboratory of Oral & Maxillofacial Reconstruction and Regeneration, Key Laboratory of Oral Biomedicine Ministry of Education, Hubei Key Laboratory of Stomatology, School & Hospital of Stomatology, Wuhan University, Wuhan 430079, China
- Bing Liu: State Key Laboratory of Oral & Maxillofacial Reconstruction and Regeneration, Key Laboratory of Oral Biomedicine Ministry of Education, Hubei Key Laboratory of Stomatology, School & Hospital of Stomatology, Wuhan University, Wuhan 430079, China; Department of Oral & Maxillofacial - Head Neck Oncology, School & Hospital of Stomatology, Wuhan University, Wuhan 430079, China
- Lin-Lin Bu: State Key Laboratory of Oral & Maxillofacial Reconstruction and Regeneration, Key Laboratory of Oral Biomedicine Ministry of Education, Hubei Key Laboratory of Stomatology, School & Hospital of Stomatology, Wuhan University, Wuhan 430079, China; Department of Oral & Maxillofacial - Head Neck Oncology, School & Hospital of Stomatology, Wuhan University, Wuhan 430079, China

26
Bonny T, Al Nassan W, Obaideen K, Al Mallahi MN, Mohammad Y, El-damanhoury HM. Contemporary Role and Applications of Artificial Intelligence in Dentistry. F1000Res 2023; 12:1179. [PMID: 37942018 PMCID: PMC10630586 DOI: 10.12688/f1000research.140204.1] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Accepted: 08/24/2023] [Indexed: 11/10/2023] Open
Abstract
Artificial Intelligence (AI) technologies significantly impact various sectors, including healthcare, engineering, sciences, and smart cities, and have the potential to improve the quality of patient care and treatment outcomes while minimizing the risk of human error. AI is transforming the dental industry just as it is revolutionizing other sectors: it is used in dentistry to diagnose dental diseases and provide treatment recommendations, and dental professionals increasingly rely on it to assist in diagnosis, clinical decision-making, treatment planning, and prognosis prediction. One of the most significant advantages of AI in dentistry is its ability to analyze vast amounts of data quickly and accurately, providing dental professionals with valuable insights to enhance their decision-making processes. The purpose of this paper is to identify the artificial intelligence algorithms frequently used in dentistry and assess how well they perform in terms of diagnosis, clinical decision-making, treatment, and prognosis prediction in ten dental specialties: dental public health, endodontics, oral and maxillofacial surgery, oral medicine and pathology, oral and maxillofacial radiology, orthodontics and dentofacial orthopedics, pediatric dentistry, periodontics, prosthodontics, and digital dentistry in general. We also show the pros and cons of using AI in all dental specialties in different ways. Finally, we present the limitations of AI in dentistry that make it incapable of replacing dental personnel; dentists should consider AI a complementary benefit and not a threat.
Affiliation(s)
- Talal Bonny: Department of Computer Engineering, University of Sharjah, Sharjah, 27272, United Arab Emirates
- Wafaa Al Nassan: Department of Computer Engineering, University of Sharjah, Sharjah, 27272, United Arab Emirates
- Khaled Obaideen: Sustainable Energy and Power Systems Research Centre, RISE, University of Sharjah, Sharjah, 27272, United Arab Emirates
- Maryam Nooman Al Mallahi: Department of Mechanical and Aerospace Engineering, United Arab Emirates University, Al Ain City, Abu Dhabi, 27272, United Arab Emirates
- Yara Mohammad: College of Engineering and Information Technology, Ajman University, Ajman, United Arab Emirates
- Hatem M. El-damanhoury: Department of Preventive and Restorative Dentistry, College of Dental Medicine, University of Sharjah, Sharjah, 27272, United Arab Emirates

27
Ramachandran RA, Barão VAR, Ozevin D, Sukotjo C, Srinivasa PP, Mathew M. Early Predicting Tribocorrosion Rate of Dental Implant Titanium Materials Using Random Forest Machine Learning Models. TRIBOLOGY INTERNATIONAL 2023; 187:108735. [PMID: 37720691 PMCID: PMC10503681 DOI: 10.1016/j.triboint.2023.108735] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 09/19/2023]
Abstract
Early detection and prediction of bio-tribocorrosion can avert unexpected damage that may lead to secondary revision surgery and the associated risks of implantable devices. Therefore, this study sought to develop a state-of-the-art prediction technique leveraging machine learning (ML) models to classify and predict the possibility of mechanical degradation in dental implant materials. Key features considered in the study, which involved pure titanium and titanium-zirconium (zirconium = 5, 10, and 15 wt%) alloys, include corrosion potential, acoustic emission (AE) absolute energy, hardness, and weight-loss estimates. The deployed ML prototype models confirm their suitability for tribocorrosion prediction, with accuracy above 90%. The proposed system can evolve into a continuous structural-health monitoring approach as well as a reliable predictive modeling technique for dental implant monitoring.
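The random-forest setup the study describes (a supervised classifier over a handful of physical features) can be sketched as follows. This is an illustration only, not the authors' pipeline: the feature values, class means, and labels below are synthetic placeholders standing in for the four feature types named in the abstract.

```python
# Illustrative sketch: random forest on synthetic stand-ins for the study's
# four features (corrosion potential, AE absolute energy, hardness, weight loss).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 200
# columns: corrosion potential (V), AE absolute energy, hardness (HV), weight loss (mg)
low = rng.normal([-0.30, 1e3, 350.0, 0.5], [0.05, 2e2, 15.0, 0.1], size=(n, 4))
high = rng.normal([-0.45, 5e3, 320.0, 1.5], [0.05, 8e2, 15.0, 0.3], size=(n, 4))
X = np.vstack([low, high])
y = np.array([0] * n + [1] * n)  # 0 = low degradation, 1 = high degradation

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(f"training accuracy: {clf.score(X, y):.2f}")
```

A forest of decision trees is a natural fit here because the features live on very different scales (volts vs. energy units vs. hardness) and trees are insensitive to that, needing no normalization.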
Affiliation(s)
- Valentim A R Barão: Department of Prosthodontics and Periodontology, Piracicaba Dental School, University of Campinas (UNICAMP), Piracicaba, São Paulo, Brazil
- Didem Ozevin: Department of Civil, Materials, and Environmental Engineering, University of Illinois at Chicago, IL, USA
- Cortino Sukotjo: Department of Restorative Dentistry, College of Dentistry, University of Illinois at Chicago, IL, USA
- Pai P Srinivasa: Department of Mechanical Engineering, NMAM IT, Nitte, Karnataka, India
- Mathew Mathew: Department of Biomedical Engineering, University of Illinois at Chicago, IL, USA; Department of Restorative Dentistry, College of Dentistry, University of Illinois at Chicago, IL, USA

28
Orhan K, Aktuna Belgin C, Manulis D, Golitsyna M, Bayrak S, Aksoy S, Sanders A, Önder M, Ezhov M, Shamshiev M, Gusarev M, Shlenskii V. Determining the reliability of diagnosis and treatment using artificial intelligence software with panoramic radiographs. Imaging Sci Dent 2023; 53:199-208. [PMID: 37799743 PMCID: PMC10548159 DOI: 10.5624/isd.20230109] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/24/2023] [Revised: 07/07/2023] [Accepted: 07/10/2023] [Indexed: 10/07/2023] Open
Abstract
Purpose The objective of this study was to evaluate the accuracy and effectiveness of an artificial intelligence (AI) program in identifying dental conditions using panoramic radiographs (PRs), as well as to assess the appropriateness of its treatment recommendations. Materials and Methods PRs from 100 patients (representing 4497 teeth) with known clinical examination findings were randomly selected from a university database. Three dentomaxillofacial radiologists and the Diagnocat AI software evaluated these PRs. The evaluations focused on various dental conditions and treatments, including canal filling, caries, cast post and core, dental calculus, fillings, furcation lesions, implants, lack of interproximal tooth contact, open margins, overhangs, periapical lesions, periodontal bone loss, short fillings, voids in root fillings, overfillings, pontics, root fragments, impacted teeth, artificial crowns, missing teeth, and healthy teeth. Results The AI demonstrated almost perfect agreement (exceeding 0.81) in most of the assessments when compared to the ground truth. Sensitivity was very high (above 0.8) for the evaluation of healthy teeth, artificial crowns, dental calculus, missing teeth, fillings, lack of interproximal contact, periodontal bone loss, and implants. However, sensitivity was low for the assessment of caries, periapical lesions, pontic voids in the root canal, and overhangs. Conclusion Despite the limitations of this study, the synthesized data suggest that AI-based decision support systems can serve as a valuable tool in detecting dental conditions when used with PRs for clinical dental applications.
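The "almost perfect agreement (exceeding 0.81)" wording follows the conventional Landis and Koch benchmarks for Cohen's kappa, a statistic that corrects raw percent agreement for agreement expected by chance. A small illustrative implementation, assuming kappa is the statistic in use; the tooth labels below are invented:

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two equal-length label sequences."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    labels = set(rater_a) | set(rater_b)
    expected = sum((rater_a.count(k) / n) * (rater_b.count(k) / n) for k in labels)
    if expected == 1:
        return 1.0  # degenerate case: both raters used a single identical label
    return (observed - expected) / (1 - expected)

# Invented toy labels for six teeth, one sequence per rater
# (e.g. radiologist vs. software):
radiologist = ["caries", "healthy", "healthy", "filling", "caries", "healthy"]
software    = ["caries", "healthy", "filling", "filling", "caries", "healthy"]
print(round(cohens_kappa(radiologist, software), 2))  # 0.75
```

Here the raters agree on 5 of 6 teeth (83%), but because a third of that agreement would occur by chance, kappa lands at 0.75, which is why chance-corrected agreement is the preferred way to compare a reader (or an AI) against ground truth.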
Affiliation(s)
- Kaan Orhan: Department of Dentomaxillofacial Radiology, Faculty of Dentistry, Ankara University, Ankara, Turkey
- Ceren Aktuna Belgin: Department of Dentomaxillofacial Radiology, Faculty of Dentistry, Hatay Mustafa Kemal University, Hatay, Turkey
- Seval Bayrak: Department of Dentomaxillofacial Radiology, Faculty of Dentistry, Abant İzzet Baysal University, Bolu, Turkey
- Secil Aksoy: Department of Dentomaxillofacial Radiology, Faculty of Dentistry, Near East University, Nicosia, Cyprus
- Merve Önder: Department of Dentomaxillofacial Radiology, Faculty of Dentistry, Ankara University, Ankara, Turkey

29
Cuschieri LA, Schembri-Higgans R, Bezzina N, Betts A, Cortes ARG. Importance of 3-dimensional imaging in the early diagnosis of chondroblastic osteosarcoma. Imaging Sci Dent 2023; 53:247-256. [PMID: 37799747 PMCID: PMC10548150 DOI: 10.5624/isd.20220223] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/26/2022] [Revised: 05/30/2023] [Accepted: 06/15/2023] [Indexed: 10/07/2023] Open
Abstract
The aim of this report is to present a case of chondroblastic osteosarcoma located in the right maxillary premolar region of a 17-year-old female patient. The initial clinical presentation and 2-dimensional (2D) radiographic methods proved inadequate for a definitive diagnosis. However, a cone-beam computed tomography scan revealed a hyperdense, heterogeneous lesion in the right maxillary premolar region, exhibiting a characteristic "sun-ray" appearance. To assess soft tissue involvement, a medical computed tomography scan was subsequently conducted. A positron emission tomography scan detected no metastasis or indications of secondary tumors. T1- and T2-weighted magnetic resonance imaging showed signal heterogeneity within the lesion, including areas of low signal intensity at the periphery. A histological examination conducted after an incisional biopsy confirmed the diagnosis of high-grade chondroblastic osteosarcoma. The patient was then referred to an oncology department for chemotherapy before surgery. In conclusion, these findings suggest that early diagnosis using 3-dimensional imaging can detect chondroblastic osteosarcoma in its early stages, such as before metastasis occurs, thereby improving the patient's prognosis.
Affiliation(s)
- Laura Althea Cuschieri: Department of Dental Surgery, Faculty of Dental Surgery, University of Malta, Msida, Malta
- Nicholas Bezzina: Department of Dental Surgery, Faculty of Dental Surgery, University of Malta, Msida, Malta
- Alexandra Betts: Department of Dental Surgery, Faculty of Dental Surgery, University of Malta, Msida, Malta

30
Shafi I, Fatima A, Afzal H, Díez IDLT, Lipari V, Breñosa J, Ashraf I. A Comprehensive Review of Recent Advances in Artificial Intelligence for Dentistry E-Health. Diagnostics (Basel) 2023; 13:2196. [PMID: 37443594 DOI: 10.3390/diagnostics13132196] [Citation(s) in RCA: 5] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/18/2023] [Revised: 06/14/2023] [Accepted: 06/23/2023] [Indexed: 07/15/2023] Open
Abstract
Artificial intelligence has made substantial progress in medicine. Automated dental imaging interpretation is one of the most prolific areas of research using AI. X-ray and infrared imaging systems have enabled dental clinicians to identify dental diseases since the 1950s. However, the manual process of dental disease assessment is tedious and error-prone when diagnosed by inexperienced dentists. Thus, researchers have employed different advanced computer vision techniques, and machine- and deep-learning models for dental disease diagnoses using X-ray and near-infrared imagery. Despite the notable development of AI in dentistry, certain factors affect the performance of the proposed approaches, including limited data availability, imbalanced classes, and lack of transparency and interpretability. Hence, it is of utmost importance for the research community to formulate suitable approaches, considering the existing challenges and leveraging findings from the existing studies. Based on an extensive literature review, this survey provides a brief overview of X-ray and near-infrared imaging systems. Additionally, a comprehensive insight into challenges faced by researchers in the dental domain has been brought forth in this survey. The article further offers an amalgamative assessment of both performances and methods evaluated on public benchmarks and concludes with ethical considerations and future research avenues.
Affiliation(s)
- Imran Shafi: College of Electrical and Mechanical Engineering, National University of Sciences and Technology (NUST), Islamabad 44000, Pakistan
- Anum Fatima: National Centre for Robotics, National University of Sciences and Technology (NUST), Islamabad 44000, Pakistan
- Hammad Afzal: Military College of Signals (MCS), National University of Sciences and Technology (NUST), Islamabad 44000, Pakistan
- Isabel de la Torre Díez: Department of Signal Theory and Communications and Telematic Engineering, University of Valladolid, Paseo de Belén 15, 47011 Valladolid, Spain
- Vivian Lipari: Research Unit in Food Technologies, Agro-Food Industries and Nutrition, Universidad Europea del Atlántico, Isabel Torres 21, 39011 Santander, Spain; Research Unit in Food Technologies, Agro-Food Industries and Nutrition, Universidad Internacional Iberoamericana, Campeche 24560, Mexico; Research Unit in Food Technologies, Agro-Food Industries and Nutrition, Fundación Universitaria Internacional de Colombia, Bogotá 111311, Colombia
- Jose Breñosa: Research Unit in Food Technologies, Agro-Food Industries and Nutrition, Universidad Europea del Atlántico, Isabel Torres 21, 39011 Santander, Spain; Universidade Internacional do Cuanza, Cuito EN250, Bié, Angola; Research Unit in Food Technologies, Agro-Food Industries and Nutrition, Universidad Internacional Iberoamericana Arecibo, Puerto Rico, PR 00613, USA
- Imran Ashraf: Department of Information and Communication Engineering, Yeungnam University, Gyeongsan 38541, Republic of Korea

31
Stafie CS, Sufaru IG, Ghiciuc CM, Stafie II, Sufaru EC, Solomon SM, Hancianu M. Exploring the Intersection of Artificial Intelligence and Clinical Healthcare: A Multidisciplinary Review. Diagnostics (Basel) 2023; 13:1995. [PMID: 37370890 PMCID: PMC10297646 DOI: 10.3390/diagnostics13121995] [Citation(s) in RCA: 11] [Impact Index Per Article: 11.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/09/2023] [Revised: 05/31/2023] [Accepted: 06/05/2023] [Indexed: 06/29/2023] Open
Abstract
Artificial intelligence (AI) plays an increasingly important role in our everyday lives thanks to advantages such as 24/7 availability, a very low error rate, real-time insights, and fast analysis. AI is increasingly being used in clinical medical and dental healthcare analyses, with valuable applications that include disease diagnosis, risk assessment, treatment planning, and drug discovery. This paper presents a narrative literature review of AI use in healthcare from a multi-disciplinary perspective, specifically in the cardiology, allergology, endocrinology, and dental fields. The paper highlights data from recent research and development efforts in AI for healthcare, as well as challenges and limitations associated with AI implementation, such as data privacy and security considerations, along with ethical and legal concerns. The regulation of responsible design, development, and use of AI in healthcare is still in its early stages due to the rapid evolution of the field. However, it is our duty to carefully consider the ethical implications of implementing AI and to respond appropriately. With the potential to reshape healthcare delivery and enhance patient outcomes, AI systems continue to reveal their capabilities.
Affiliation(s)
- Celina Silvia Stafie: Department of Preventive Medicine and Interdisciplinarity, Grigore T. Popa University of Medicine and Pharmacy Iasi, Universitatii Street 16, 700115 Iasi, Romania
- Irina-Georgeta Sufaru: Department of Periodontology, Grigore T. Popa University of Medicine and Pharmacy Iasi, Universitatii Street 16, 700115 Iasi, Romania
- Cristina Mihaela Ghiciuc: Department of Morpho-Functional Sciences II—Pharmacology and Clinical Pharmacology, Grigore T. Popa University of Medicine and Pharmacy Iasi, Universitatii Street 16, 700115 Iasi, Romania
- Ingrid-Ioana Stafie: Endocrinology Residency Program, Sf. Spiridon Clinical Emergency Hospital, Independentei 1, 700111 Iasi, Romania
- Sorina Mihaela Solomon: Department of Periodontology, Grigore T. Popa University of Medicine and Pharmacy Iasi, Universitatii Street 16, 700115 Iasi, Romania
- Monica Hancianu: Pharmacognosy-Phytotherapy, Grigore T. Popa University of Medicine and Pharmacy Iasi, Universitatii Street 16, 700115 Iasi, Romania

32
Rajaram Mohan K, Mathew Fenn S. Artificial Intelligence and Its Theranostic Applications in Dentistry. Cureus 2023; 15:e38711. [PMID: 37292569 PMCID: PMC10246515 DOI: 10.7759/cureus.38711] [Citation(s) in RCA: 3] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 08/16/2022] [Indexed: 06/10/2023] Open
Abstract
As new technologies emerge, they continue to shape our daily lives, and artificial intelligence (AI) covers a wide range of applications. Advancements in AI now make it possible to analyse large amounts of data, resulting in more accurate information and more effective decision-making. This article explains the fundamentals of AI and examines its development and present use. AI technology has had an impact on the healthcare sector as a result of the need for accurate diagnosis and improved patient care. An overview of the existing AI applications in clinical dentistry is provided. Comprehensive care involving artificial intelligence aims to deliver cutting-edge research and innovations, as well as high-quality patient care, by enabling sophisticated decision support tools. The cornerstone of AI advancement in dentistry is creative inter-professional coordination among medical professionals, scientists, and engineers. Despite potential misconceptions and concerns about patient privacy, artificial intelligence will remain broadly relevant to dentistry, because precise treatment methods and rapid data sharing are both essential in the field. Additionally, these developments will enable patients, academics, and healthcare professionals to exchange large volumes of health data and provide insights that enhance patient care.
Affiliation(s)
- Karthik Rajaram Mohan
- Oral Medicine, Vinayaka Mission's Sankarachariyar Dental College, Vinayaka Mission's Research Foundation (Deemed to be University), Salem, IND
- Saramma Mathew Fenn
- Oral Medicine and Radiology, Vinayaka Mission's Sankarachariyar Dental College, Vinayaka Mission's Research Foundation (Deemed to be University), Salem, IND
33
Vodanović M, Subašić M, Milošević D, Savić Pavičin I. Artificial Intelligence in Medicine and Dentistry. Acta Stomatol Croat 2023; 57:70-84. [PMID: 37288152] [PMCID: PMC10243707] [DOI: 10.15644/asc57/1/8]
Abstract
INTRODUCTION Artificial intelligence has been applied in various fields throughout history, but its integration into daily life is more recent. The first applications of AI were primarily in academia and government research institutions, but as technology has advanced, AI has also been applied in industry, commerce, medicine and dentistry. OBJECTIVE Considering that the possibilities of applying artificial intelligence are developing rapidly and that this field is one of the areas with the greatest increase in the number of newly published articles, the aim of this paper was to provide an overview of the literature and to give an insight into the possibilities of applying artificial intelligence in medicine and dentistry. In addition, the aim was to discuss its advantages and disadvantages. CONCLUSION The possibilities of applying artificial intelligence to medicine and dentistry are just being discovered. Artificial intelligence will greatly contribute to developments in medicine and dentistry, as it is a tool that enables development and progress, especially in terms of personalized healthcare that will lead to much better treatment outcomes.
Affiliation(s)
- Marin Vodanović
- Department of Dental Anthropology, School of Dental Medicine, University of Zagreb, Croatia
- University Hospital Centre Zagreb, Croatia
- Marko Subašić
- Faculty of Electrical Engineering and Computing, University of Zagreb, Croatia
- Denis Milošević
- Faculty of Electrical Engineering and Computing, University of Zagreb, Croatia
- Ivana Savić Pavičin
- Department of Dental Anthropology, School of Dental Medicine, University of Zagreb, Croatia
- University Hospital Centre Zagreb, Croatia
34
Ding H, Wu J, Zhao W, Matinlinna JP, Burrow MF, Tsoi JKH. Artificial intelligence in dentistry—A review. Frontiers in Dental Medicine 2023. [DOI: 10.3389/fdmed.2023.1085251]
Abstract
Artificial Intelligence (AI) is the ability of machines to perform tasks that normally require human intelligence. AI is not a new term; the concept can be dated back to 1950. However, it did not become a practical tool until two decades ago. Owing to the rapid development of the three cornerstones of current AI technology—big data (coming through digital devices), computational power, and AI algorithms—in the past two decades, AI applications have started to provide convenience in people's lives. In dentistry, AI has been adopted in all dental disciplines, i.e., operative dentistry, periodontics, orthodontics, oral and maxillofacial surgery, and prosthodontics. Most AI applications in dentistry concern diagnosis based on radiographic or optical images, while other tasks are less amenable mainly due to the constraints of data availability, data uniformity, and the computational power for handling 3D data. Evidence-based dentistry (EBD) is regarded as the gold standard for the decision-making of dental professionals, while AI machine learning (ML) models learn from human expertise. ML can be seen as another valuable tool to assist dental professionals in multiple stages of clinical cases. This review narrated the history and classification of AI, summarised AI applications in dentistry, discussed the relationship between EBD and ML, and aimed to help dental professionals better understand AI as a tool to assist their routine work with improved efficiency.
35
Chaiprasittikul N, Thanathornwong B, Pornprasertsuk-Damrongsri S, Raocharernporn S, Maponthong S, Manopatanakul S. Application of a Multi-Layer Perceptron in Preoperative Screening for Orthognathic Surgery. Healthc Inform Res 2023; 29:16-22. [PMID: 36792097] [PMCID: PMC9932311] [DOI: 10.4258/hir.2023.29.1.16]
Abstract
OBJECTIVES Orthognathic surgery is used to treat moderate to severe occlusal discrepancies. Examinations and measurements for preoperative screening are essential procedures. A careful analysis is needed to decide whether cases require orthognathic surgery. This study developed screening software using a multi-layer perceptron to determine whether orthognathic surgery is required. METHODS In total, 538 digital lateral cephalometric radiographs were retrospectively collected from a hospital data system. The input data consisted of seven cephalometric variables. All cephalograms were analyzed by the Detectron2 detection and segmentation algorithms. A keypoint region-based convolutional neural network (R-CNN) was used for object detection, and an artificial neural network (ANN) was used for classification. This novel neural network decision support system was created and validated using Keras software. The output data are shown as a number from 0 to 1, with cases requiring orthognathic surgery being indicated by a number approaching 1. RESULTS The screening software demonstrated a diagnostic agreement of 96.3% with specialists regarding the requirement for orthognathic surgery. A confusion matrix showed that only 2 out of 54 cases were misdiagnosed (accuracy = 0.963, sensitivity = 1, precision = 0.93, F-value = 0.963, area under the curve = 0.96). CONCLUSIONS Orthognathic surgery screening with a keypoint R-CNN for object detection and an ANN for classification showed 96.3% diagnostic agreement in this study.
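The headline figures in this abstract can be reproduced from the confusion matrix it describes. The sketch below uses a hypothetical 27/2/0/25 split of the 54 test cases; only the "2 of 54 misdiagnosed" count and the reported rates come from the abstract, so the exact split is an illustrative reconstruction, not the study's data.

```python
# Illustrative reconstruction (not from the paper): a confusion matrix consistent
# with the reported figures -- 2 of 54 cases misdiagnosed, sensitivity = 1 (no
# missed surgical cases), so both errors must be false positives.
tp, fp, fn, tn = 27, 2, 0, 25  # assumed split; only the 2/54 error count is reported

accuracy = (tp + tn) / (tp + fp + fn + tn)
sensitivity = tp / (tp + fn)
precision = tp / (tp + fp)
f1 = 2 * precision * sensitivity / (precision + sensitivity)

print(round(accuracy, 3), sensitivity, round(precision, 2), round(f1, 3))
# -> 0.963 1.0 0.93 0.964  (the reported F-value of 0.963 suggests a slightly
#    different rounding or split in the original analysis)
```

Under this reconstruction the accuracy and precision match the paper exactly, which supports the assumed 27/2/0/25 split.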
Affiliation(s)
- Natkritta Chaiprasittikul
- Department of Advanced General Dentistry, Faculty of Dentistry, Mahidol University, Bangkok, Thailand
- Bhornsawan Thanathornwong
- Department of General Dentistry, Faculty of Dentistry, Srinakharinwirot University, Bangkok, Thailand
- Somchart Raocharernporn
- Department of Oral and Maxillofacial Surgery, Faculty of Dentistry, Mahidol University, Bangkok, Thailand
- Somporn Maponthong
- Department of Oral and Maxillofacial Radiology, Faculty of Dentistry, Mahidol University, Bangkok, Thailand
- Somchai Manopatanakul
- Department of Advanced General Dentistry, Faculty of Dentistry, Mahidol University, Bangkok, Thailand
36
Hung KF, Yeung AWK, Bornstein MM, Schwendicke F. Personalized dental medicine, artificial intelligence, and their relevance for dentomaxillofacial imaging. Dentomaxillofac Radiol 2023; 52:20220335. [PMID: 36472627] [PMCID: PMC9793453] [DOI: 10.1259/dmfr.20220335]
Abstract
Personalized medicine refers to the tailoring of diagnostics and therapeutics to individuals based on one's biological, social, and behavioral characteristics. While personalized dental medicine is still far from being a reality, advanced artificial intelligence (AI) technologies with improved data analytic approaches are expected to integrate diverse data from the individual, setting, and system levels, which may facilitate a deeper understanding of the interaction of these multilevel data and therefore bring us closer to more personalized, predictive, preventive, and participatory dentistry, also known as P4 dentistry. In the field of dentomaxillofacial imaging, a wide range of AI applications, including several commercially available software options, have been proposed to assist dentists in the diagnosis and treatment planning of various dentomaxillofacial diseases, with performance similar or even superior to that of specialists. Notably, the impact of these dental AI applications on treatment decision, clinical and patient-reported outcomes, and cost-effectiveness has so far been assessed sparsely. Such information should be further investigated in future studies to provide patients, providers, and healthcare organizers a clearer picture of the true usefulness of AI in daily dental practice.
Affiliation(s)
- Kuo Feng Hung
- Division of Oral and Maxillofacial Surgery, Faculty of Dentistry, The University of Hong Kong, Hong Kong SAR, China
- Andy Wai Kan Yeung
- Division of Oral and Maxillofacial Radiology, Applied Oral Sciences and Community Dental Care, Faculty of Dentistry, The University of Hong Kong, Hong Kong SAR, China
- Michael M. Bornstein
- Department of Oral Health & Medicine, University Center for Dental Medicine Basel UZB, University of Basel, Basel, Switzerland
- Falk Schwendicke
- Department of Oral Diagnostics, Digital Health and Health Services Research, Charité–Universitätsmedizin Berlin, Berlin, Germany
37
Hung KF, Ai QYH, Wong LM, Yeung AWK, Li DTS, Leung YY. Current Applications of Deep Learning and Radiomics on CT and CBCT for Maxillofacial Diseases. Diagnostics (Basel) 2022; 13:110. [PMID: 36611402] [PMCID: PMC9818323] [DOI: 10.3390/diagnostics13010110]
Abstract
The increasing use of computed tomography (CT) and cone beam computed tomography (CBCT) in oral and maxillofacial imaging has driven the development of deep learning and radiomics applications to assist clinicians in early diagnosis, accurate prognosis prediction, and efficient treatment planning of maxillofacial diseases. This narrative review aimed to provide an up-to-date overview of the current applications of deep learning and radiomics on CT and CBCT for the diagnosis and management of maxillofacial diseases. Based on current evidence, a wide range of deep learning models on CT/CBCT images have been developed for automatic diagnosis, segmentation, and classification of jaw cysts and tumors, cervical lymph node metastasis, salivary gland diseases, temporomandibular joint (TMJ) disorders, maxillary sinus pathologies, mandibular fractures, and dentomaxillofacial deformities, while CT-/CBCT-derived radiomics applications mainly focused on occult lymph node metastasis in patients with oral cancer, malignant salivary gland tumors, and TMJ osteoarthritis. Most of these models showed high performance, and some of them even outperformed human experts. The models with performance on par with human experts have the potential to serve as clinically practicable tools to achieve the earliest possible diagnosis and treatment, leading to a more precise and personalized approach for the management of maxillofacial diseases. Challenges and issues, including the lack of the generalizability and explainability of deep learning models and the uncertainty in the reproducibility and stability of radiomic features, should be overcome to gain the trust of patients, providers, and healthcare organizers for daily clinical use of these models.
Affiliation(s)
- Kuo Feng Hung
- Oral and Maxillofacial Surgery, Faculty of Dentistry, The University of Hong Kong, Hong Kong SAR, China
- Qi Yong H. Ai
- Health Technology and Informatics, The Hong Kong Polytechnic University, Hong Kong SAR, China
- Lun M. Wong
- Imaging and Interventional Radiology, Faculty of Medicine, The Chinese University of Hong Kong, Hong Kong SAR, China
- Andy Wai Kan Yeung
- Oral and Maxillofacial Radiology, Applied Oral Sciences and Community Dental Care, Faculty of Dentistry, The University of Hong Kong, Hong Kong SAR, China
- Dion Tik Shun Li
- Oral and Maxillofacial Surgery, Faculty of Dentistry, The University of Hong Kong, Hong Kong SAR, China
- Yiu Yan Leung
- Oral and Maxillofacial Surgery, Faculty of Dentistry, The University of Hong Kong, Hong Kong SAR, China
38
Jang WS, Kim S, Yun PS, Jang HS, Seong YW, Yang HS, Chang JS. Accurate detection for dental implant and peri-implant tissue by transfer learning of faster R-CNN: a diagnostic accuracy study. BMC Oral Health 2022; 22:591. [PMID: 36494645] [PMCID: PMC9737962] [DOI: 10.1186/s12903-022-02539-x]
Abstract
BACKGROUND The diagnosis of dental implants and the periapical tissues using periapical radiographs is crucial. Recently, artificial intelligence has shown a rapid advancement in the field of radiographic imaging. PURPOSE This study attempted to detect dental implants and peri-implant tissues by using a deep learning method known as object detection on the implant image of periapical radiographs. METHODS After implant treatment, the periapical images were collected and data were processed by labeling the dental implant and peri-implant tissue together in the images. Next, 300 images of the periapical radiographs were split into 80:20 ratio (i.e. 80% of the data were used for training the model while 20% were used for testing the model). These were evaluated using an object detection model known as Faster R-CNN, which simultaneously performs classification and localization. This model was evaluated on the classification performance using metrics, including precision, recall, and F1 score. Additionally, in order to assess the localization performance, an evaluation through intersection over union (IoU) was utilized, and, Average Precision (AP) was used to assess both the classification and localization performance. RESULTS Considering the classification performance, precision = 0.977, recall = 0.992, and F1 score = 0.984 were derived. The indicator of localization was derived as mean IoU = 0.907. On the other hand, considering the indicators of both classification and localization performance, AP showed an object detection level of AP@0.5 = 0.996 and AP@0.75 = 0.967. CONCLUSION Thus, the implementation of Faster R-CNN model for object detection on 300 periapical radiographic images including dental implants, resulted in high-quality object detection for dental implants and peri-implant tissues.
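The two families of metrics this study reports, intersection over union (IoU) for localization and precision/recall/F1 for classification, can be sketched in a few lines. The boxes and counts below are hypothetical illustrations, not the study's data.

```python
# Minimal sketch of the evaluation metrics used for object detection:
# IoU between a predicted and a ground-truth bounding box (localization),
# and precision/recall/F1 from detection counts (classification).
def iou(box_a, box_b):
    """Boxes as (x1, y1, x2, y2); returns intersection-over-union in [0, 1]."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

def prf1(tp, fp, fn):
    """Precision, recall, and F1 score from true/false positive and false negative counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return precision, recall, 2 * precision * recall / (precision + recall)

# Two 10x10 boxes overlapping in a 5x5 corner: intersection 25, union 175.
print(iou((0, 0, 10, 10), (5, 5, 15, 15)))  # 25 / 175, roughly 0.143
```

A detection such as AP@0.5 then counts a prediction as a true positive only when its IoU with a ground-truth box reaches the 0.5 threshold.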
Affiliation(s)
- Woo Sung Jang
- Department of Artificial Intelligence, College of Engineering, Yonsei University, Seoul, Korea
- Sunjai Kim
- Department of Prosthodontics, Gangnam Severance Dental Hospital, College of Dentistry, Yonsei University, 211 Eonju-ro, Gangnam-gu, 06273 Seoul, Korea
- Pill Sang Yun
- Department of Prosthodontics, Gangnam Severance Dental Hospital, College of Dentistry, Yonsei University, 211 Eonju-ro, Gangnam-gu, 06273 Seoul, Korea
- Han Sol Jang
- Department of Prosthodontics, Gangnam Severance Dental Hospital, College of Dentistry, Yonsei University, 211 Eonju-ro, Gangnam-gu, 06273 Seoul, Korea
- You Won Seong
- Graduate School of Public Policy, The University of Tokyo, Tokyo, Japan
- Hee Soo Yang
- Department of Mechanical Engineering, College of Engineering, Yonsei University, Seoul, Korea
- Jae-Seung Chang
- Department of Prosthodontics, Gangnam Severance Dental Hospital, College of Dentistry, Yonsei University, 211 Eonju-ro, Gangnam-gu, 06273 Seoul, Korea
39
Miloglu O, Guller MT, Tosun ZT. The Use of Artificial Intelligence in Dentistry Practices. Eurasian J Med 2022; 54:34-42. [PMID: 36655443] [PMCID: PMC11163356] [DOI: 10.5152/eurasianjmed.2022.22301]
Abstract
Artificial intelligence can be defined as "understanding human thinking and trying to develop computer processes that will produce a similar structure." Thus, it is an attempt by a programmed computer to think. According to a broader definition, artificial intelligence is a computer equipped with human intelligence-specific capacities such as acquiring information, perceiving, seeing, thinking, and making decisions. Quality demands in dental treatments have constantly been increasing in recent years. In parallel with this, using image-based methods and multimedia-supported explanation systems on the computer is becoming widespread to evaluate the available information. The use of artificial intelligence in dentistry will greatly contribute to the reduction of treatment times and the effort spent by the dentist, reduce the need for a specialist dentist, and give a new perspective to how dentistry is practiced. In this review, we aim to review the studies conducted with artificial intelligence in dentistry and to inform our dentists about the existence of this new technology.
Affiliation(s)
- Ozkan Miloglu
- Department of Oral, Dental and Maxillofacial Radiology, Atatürk University Faculty of Dentistry, Erzurum, Turkey
- Mustafa Taha Guller
- Department of Dentistry Services, Oral and Dental Health Program, Binali Yıldırım University Vocational School of Health Services, Erzincan, Turkey
- Zeynep Turanli Tosun
- Department of Oral, Dental and Maxillofacial Radiology, Atatürk University Faculty of Dentistry, Erzurum, Turkey
40
Tsolakis IA, Tsolakis AI, Elshebiny T, Matthaios S, Palomo JM. Comparing a Fully Automated Cephalometric Tracing Method to a Manual Tracing Method for Orthodontic Diagnosis. J Clin Med 2022; 11:6854. [PMID: 36431331] [PMCID: PMC9693212] [DOI: 10.3390/jcm11226854]
Abstract
Background: This study aims to compare an automated cephalometric analysis based on the latest deep learning method of automatically identifying cephalometric landmarks with a manual tracing method using broadly accepted cephalometric software. Methods: A total of 100 cephalometric X-rays taken using a CS8100SC cephalostat were collected from a private practice. The X-rays were taken in maximum image size (18 × 24 cm lateral image). All cephalometric X-rays were first manually traced using the Dolphin 3D Imaging program version 11.0 and then automatically, using the Artificial Intelligence CS imaging V8 software. The American Board of Orthodontics analysis and the European Board of Orthodontics analysis were used for the cephalometric measurements. This resulted in the identification of 16 cephalometric landmarks, used for 16 angular and 2 linear measurements. Results: All measurements showed great reproducibility with high intra-class reliability (>0.97). The two methods showed great agreement, with an ICC range of 0.70−0.92. Mean values of SNA, SNB, ANB, SN-MP, U1-SN, L1-NB, SNPg, ANPg, SN/ANS-PNS, SN/GoGn, U1/ANS-PNS, L1-APg, U1-NA, and L1-GoGn landmarks had no significant differences between the two methods (p > 0.0027), while the mean values of FMA, L1-MP, ANS-PNS/GoGn, and U1-L1 were statistically significantly different (p < 0.0027). Conclusions: The automatic cephalometric tracing method using CS imaging V8 software is reliable and accurate for all cephalometric measurements.
Affiliation(s)
- Ioannis A. Tsolakis
- Department of Orthodontics, School of Dentistry, Aristotle University of Thessaloniki, 541 24 Thessaloniki, Greece
- Correspondence:
- Apostolos I. Tsolakis
- Department of Orthodontics, School of Dentistry, National and Kapodistrian University of Athens, 157 72 Athens, Greece
- Department of Orthodontics, School of Dental Medicine, Case Western Reserve University, Cleveland, OH 44106, USA
- Tarek Elshebiny
- Department of Orthodontics, School of Dental Medicine, Case Western Reserve University, Cleveland, OH 44106, USA
- Stefanos Matthaios
- Department of Orthodontics, School of Dental Medicine, Case Western Reserve University, Cleveland, OH 44106, USA
- J. Martin Palomo
- Department of Orthodontics, School of Dental Medicine, Case Western Reserve University, Cleveland, OH 44106, USA
41
Kong HJ, Kim JY, Moon HM, Park HC, Kim JW, Lim R, Woo J, Fakhri GE, Kim DW, Kim S. Automation of generative adversarial network-based synthetic data-augmentation for maximizing the diagnostic performance with paranasal imaging. Sci Rep 2022; 12:18118. [PMID: 36302815] [PMCID: PMC9613909] [DOI: 10.1038/s41598-022-22222-z]
Abstract
Thus far, there have been no reported specific rules for systematically determining the appropriate augmented sample size to optimize model performance when conducting data augmentation. In this paper, we report on the feasibility of synthetic data augmentation using generative adversarial networks (GAN) by proposing an automation pipeline to find the optimal multiple of data augmentation to achieve the best deep learning-based diagnostic performance in a limited dataset. We used Waters' view radiographs for patients diagnosed with chronic sinusitis to demonstrate the method developed herein. We demonstrate that our approach produces significantly better diagnostic performance parameters than models trained using conventional data augmentation. The deep learning method proposed in this study could be implemented to assist radiologists in improving their diagnosis. Researchers and industry workers could overcome the lack of training data by employing our proposed automation pipeline approach in GAN-based synthetic data augmentation. This is anticipated to provide new means to overcome the shortage of graphic data for algorithm training.
Affiliation(s)
- Hyoun-Joong Kong
- Transdisciplinary Department of Medicine and Advanced Technology, Seoul National University Hospital, Jongno-Gu, Seoul, 03080 Republic of Korea
- Medical Big Data Research Center, Seoul National University College of Medicine, Jongno-Gu, Seoul, 03080 Republic of Korea
- Department of Biomedical Engineering, Seoul National University College of Medicine, 101 Daehak-Ro, Jongno-Gu, Seoul, 03080 Republic of Korea
- Jin Youp Kim
- Department of Otorhinolaryngology-Head and Neck Surgery, Ilsan Hospital, Dongguk University, Gyeonggi, 10326 Republic of Korea
- Interdisciplinary Program of Medical Informatics, Seoul National University College of Medicine, Seoul, 03080 Republic of Korea
- Hye-Min Moon
- Interdisciplinary Program for Bioengineering, Seoul National University, Jongno-Gu, Seoul, 03080 Republic of Korea
- Hae Chan Park
- Department of Otorhinolaryngology-Head and Neck Surgery, Seoul National University Bundang Hospital, Gyeonggi, 13620 Republic of Korea
- Jeong-Whun Kim
- Department of Otorhinolaryngology-Head and Neck Surgery, Seoul National University Bundang Hospital, Gyeonggi, 13620 Republic of Korea
- Ruth Lim
- Department of Radiology, Massachusetts General Hospital and Harvard Medical School, Boston, MA 02114 USA
- Jonghye Woo
- Department of Radiology, Massachusetts General Hospital and Harvard Medical School, Boston, MA 02114 USA
- Georges El Fakhri
- Department of Radiology, Massachusetts General Hospital and Harvard Medical School, Boston, MA 02114 USA
- Dae Woo Kim
- Department of Otorhinolaryngology-Head and Neck Surgery, Boramae Medical Center, Seoul Metropolitan Government-Seoul National University, 20 Boramae-Ro 5-Gil, Dongjak-Gu, Seoul, 07061 Republic of Korea
- Sungwan Kim
- Transdisciplinary Department of Medicine and Advanced Technology, Seoul National University Hospital, Jongno-Gu, Seoul, 03080 Republic of Korea
- Department of Biomedical Engineering, Seoul National University College of Medicine, 101 Daehak-Ro, Jongno-Gu, Seoul, 03080 Republic of Korea
- Department of Biomedical Engineering, Seoul National University Hospital, Jongno-Gu, Seoul, 03080 Republic of Korea
42
Qian J, Ma R, Qu Y, Deng S, Duan Y, Zuo F, Wang Y, Wu Y. Use and performance of artificial intelligence applications in the diagnosis of chronic apical periodontitis based on cone beam computed tomography imaging. Hua Xi Kou Qiang Yi Xue Za Zhi (West China Journal of Stomatology) 2022; 40:576-581. [PMID: 38596979] [PMCID: PMC9588865] [DOI: 10.7518/hxkq.2022.05.011]
Abstract
OBJECTIVES This study aims to investigate the diagnostic application of an artificial intelligence (AI) computer-aided diagnostic system based on a convolutional neural network algorithm in detecting chronic apical periodontitis in cone beam computed tomography (CBCT) images. METHODS CBCT raw data of 55 single root chronic apical periodontitis taken in 2nd Dental Center of Peking University School and Hospital from 49 patients from January 2017 to December 2021 were collected, and the chronic apical periodontitis areas were identified by experienced clinicians manually and segmented layer by layer in Materialise Mimics Medical Software. Deep learning of lesion characterization was conducted via AI 3D U-Net, and the network segmentation results were compared manually with the test sets in terms of intersection over union (IOU), Dice coefficient, and pixel accuracy (PA). RESULTS In our deep learning algorithm, the IOU for all actual true lesions in test set samples was 92.18%, and the Dice coefficient and the PA index were 95.93% and 99.27%, respectively. Lesion segmentation and volume measurements performed by humans and AI systems showed excellent agreement. CONCLUSIONS AI systems based on deep learning methods can be applied for detecting chronic apical periodontitis on CBCT images in clinical applications.
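The three overlap metrics reported here (IoU, Dice coefficient, and pixel accuracy) compare a predicted binary segmentation mask against the manual ground truth. A minimal sketch on toy masks (not the study's CBCT data) follows; note that Dice and IoU are related by Dice = 2·IoU / (1 + IoU).

```python
import numpy as np

def seg_metrics(pred, gt):
    """IoU, Dice, and pixel accuracy between two binary segmentation masks."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    iou = inter / union
    dice = 2 * inter / (pred.sum() + gt.sum())  # equals 2*IoU / (1 + IoU)
    pa = (pred == gt).mean()                    # fraction of pixels labelled correctly
    return float(iou), float(dice), float(pa)

# Toy example: a 4x4 "lesion" and a prediction shifted down by one row.
gt = np.zeros((8, 8), dtype=bool)
gt[2:6, 2:6] = True        # 16-pixel ground-truth lesion
pred = np.zeros((8, 8), dtype=bool)
pred[3:7, 2:6] = True      # 16-pixel prediction, 12 pixels overlapping

print(seg_metrics(pred, gt))  # IoU = 12/20 = 0.6, Dice = 24/32 = 0.75, PA = 56/64 = 0.875
```

PA is dominated by the background in small-lesion images (here 0.875 despite a visibly shifted mask), which is why the study also reports the overlap-based IoU and Dice.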
Affiliation(s)
- Jun Qian
- Second Clinical Division, Peking University School and Hospital of Stomatology; National Clinical Research Center for Oral Diseases; National Engineering Laboratory for Digital and Material Technology of Stomatology, Beijing 100020, China
- Rui Ma
- Second Clinical Division, Peking University School and Hospital of Stomatology; National Clinical Research Center for Oral Diseases; National Engineering Laboratory for Digital and Material Technology of Stomatology, Beijing 100020, China
- Yan Qu
- Dept. of Stomatology, Beijing Rehabilitation Hospital of Capital Medical University, Beijing 100144, China
- Shaochun Deng
- Second Clinical Division, Peking University School and Hospital of Stomatology; National Clinical Research Center for Oral Diseases; National Engineering Laboratory for Digital and Material Technology of Stomatology, Beijing 100020, China
- Yao Duan
- Second Clinical Division, Peking University School and Hospital of Stomatology; National Clinical Research Center for Oral Diseases; National Engineering Laboratory for Digital and Material Technology of Stomatology, Beijing 100020, China
- Feifei Zuo
- LargeV Instrument Corp. Ltd, Beijing 100084, China
- Yajie Wang
- LargeV Instrument Corp. Ltd, Beijing 100084, China
- Yuwei Wu
- Second Clinical Division, Peking University School and Hospital of Stomatology; National Clinical Research Center for Oral Diseases; National Engineering Laboratory for Digital and Material Technology of Stomatology, Beijing 100020, China
43
Feher B, Kuchler U, Schwendicke F, Schneider L, Cejudo Grano de Oro JE, Xi T, Vinayahalingam S, Hsu TMH, Brinz J, Chaurasia A, Dhingra K, Gaudin RA, Mohammad-Rahimi H, Pereira N, Perez-Pastor F, Tryfonos O, Uribe SE, Hanisch M, Krois J. Emulating Clinical Diagnostic Reasoning for Jaw Cysts with Machine Learning. Diagnostics (Basel) 2022; 12:1968. [PMID: 36010318] [PMCID: PMC9406703] [DOI: 10.3390/diagnostics12081968]
Abstract
The detection and classification of cystic lesions of the jaw is of high clinical relevance and represents a topic of interest in medical artificial intelligence research. The human clinical diagnostic reasoning process uses contextual information, including the spatial relation of the detected lesion to other anatomical structures, to establish a preliminary classification. Here, we aimed to emulate clinical diagnostic reasoning step by step by using a combined object detection and image segmentation approach on panoramic radiographs (OPGs). We used a multicenter training dataset of 855 OPGs (all positives) and an evaluation set of 384 OPGs (240 negatives). We further compared our models to an international human control group of ten dental professionals from seven countries. The object detection model achieved an average precision of 0.42 (intersection over union (IoU): 0.50, maximal detections: 100) and an average recall of 0.394 (IoU: 0.50–0.95, maximal detections: 100). The classification model achieved a sensitivity of 0.84 for odontogenic cysts and 0.56 for non-odontogenic cysts as well as a specificity of 0.59 for odontogenic cysts and 0.84 for non-odontogenic cysts (IoU: 0.30). The human control group achieved a sensitivity of 0.70 for odontogenic cysts, 0.44 for non-odontogenic cysts, and 0.56 for OPGs without cysts as well as a specificity of 0.62 for odontogenic cysts, 0.95 for non-odontogenic cysts, and 0.76 for OPGs without cysts. Taken together, our results show that a combined object detection and image segmentation approach is feasible in emulating the human clinical diagnostic reasoning process in classifying cystic lesions of the jaw.
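The per-class rates reported for the model and the human control group are one-vs-rest sensitivity and specificity. A minimal sketch follows; the counts are hypothetical, chosen only to mirror the model's reported 0.84/0.59 rates for odontogenic cysts, and are not from the paper.

```python
# Minimal sketch of one-vs-rest classification metrics: sensitivity is recall on
# the target class, specificity is the correct-rejection rate on its complement.
def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity from one-vs-rest confusion counts."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts: 84 of 100 odontogenic cysts flagged, and 59 of 100
# non-odontogenic cases correctly rejected.
print(sens_spec(tp=84, fn=16, tn=59, fp=41))  # (0.84, 0.59)
```

Evaluated this way, each class (odontogenic cyst, non-odontogenic cyst, no cyst) gets its own sensitivity/specificity pair, which is how the abstract tabulates both the model and the ten-professional control group.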
Affiliation(s)
- Balazs Feher
- Department of Oral Surgery, University Clinic of Dentistry, Medical University of Vienna, 1090 Vienna, Austria
- Competence Center Oral Biology, University Clinic of Dentistry, Medical University of Vienna, 1090 Vienna, Austria
- Correspondence: ; Tel.: +43-1-40070-2623
- Ulrike Kuchler
- Department of Oral Surgery, University Clinic of Dentistry, Medical University of Vienna, 1090 Vienna, Austria
- Falk Schwendicke
- Department of Oral Diagnostics, Digital Health, and Health Services Research, Charité—University Medicine Berlin, 14197 Berlin, Germany
- Lisa Schneider
- Department of Oral Diagnostics, Digital Health, and Health Services Research, Charité—University Medicine Berlin, 14197 Berlin, Germany
- Jose Eduardo Cejudo Grano de Oro
- Department of Oral Diagnostics, Digital Health, and Health Services Research, Charité—University Medicine Berlin, 14197 Berlin, Germany
- Tong Xi
- Department of Oral and Maxillofacial Surgery, Radboud University Nijmegen Medical Centre, 6525 GA Nijmegen, The Netherlands
- Shankeeth Vinayahalingam
- Department of Oral and Maxillofacial Surgery, Radboud University Nijmegen Medical Centre, 6525 GA Nijmegen, The Netherlands
- Tzu-Ming Harry Hsu
- Computer Science and Artificial Intelligence Laboratory, Massachusetts Institute of Technology, Cambridge, MA 02139, USA
- Janet Brinz
- Department of Restorative Dentistry, Ludwig-Maximilians-University of Munich, 80336 Munich, Germany
- Akhilanand Chaurasia
- Department of Oral Medicine and Radiology, Faculty of Dental Sciences, King George’s Medical University, Lucknow 226003, India
- Kunaal Dhingra
- Periodontics Division, Centre for Dental Education and Research, All India Institute of Medical Sciences, New Delhi 110029, India
- Robert Andre Gaudin
- Department of Oral and Maxillofacial Surgery, Charité—University Medicine Berlin, 14197 Berlin, Germany
- Berlin Institute of Health, 10178 Berlin, Germany
- Hossein Mohammad-Rahimi
- Dentofacial Deformities Research Center, Research Institute of Dental Sciences, Shahid Beheshti University of Medical Sciences, Tehran 1416634793, Iran
- Nielsen Pereira
- Private Practice in Oral and Maxillofacial Radiology, Rio de Janeiro 22430-000, Brazil
- Francesc Perez-Pastor
- Servei Salut Dental, Gerencia Atencio Primaria, Institut Balear de la Salut, 07003 Palma, Spain
- Olga Tryfonos
- Department of Periodontology and Oral Biochemistry, Academic Centre for Dentistry Amsterdam, 1081 LA Amsterdam, The Netherlands
- Sergio E. Uribe
- Department of Conservative Dentistry & Oral Health, Riga Stradins University, LV-1007 Riga, Latvia
- School of Dentistry, Universidad Austral de Chile, Valdivia 5110566, Chile
- Baltic Biomaterials Centre of Excellence, Headquarters at Riga Technical University, LV-1658 Riga, Latvia
- Marcel Hanisch
- Department of Oral and Maxillofacial Surgery, University Clinic Münster, 48143 Münster, Germany
- Joachim Krois
- Department of Oral Diagnostics, Digital Health, and Health Services Research, Charité—University Medicine Berlin, 14197 Berlin, Germany
44
Withdrawal. Artif Organs 2022; 46:1712. [PMID: 34873730 DOI: 10.1111/aor.14128] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/20/2021] [Accepted: 11/22/2021] [Indexed: 11/26/2022]
Abstract
Raveendran, R, Perumbure, S, Nath, SG. Artificial intelligence: A newer vista in dentistry. Artif. Organs. 2021; 00:1-11. https://doi.org/10.1111/aor.14128. The above article, published online on December 6, 2021 in Wiley Online Library (wileyonlinelibrary.com), has been withdrawn by agreement between the journal Editor in Chief, Vakhtang Tchantchaleishvili, and John Wiley and Sons, Inc. The withdrawal has been agreed due to an editorial office error that led to the publication of the article without peer review. The authors were unaware of this error until notified by the editorial team and did not engage in any inappropriate or suspicious publishing conduct.
45
Rasteau S, Ernenwein D, Savoldelli C, Bouletreau P. Artificial intelligence for oral and maxillo-facial surgery: A narrative review. J Stomatol Oral Maxillofac Surg 2022; 123:276-282. [PMID: 35091121 DOI: 10.1016/j.jormas.2022.01.010] [Citation(s) in RCA: 12] [Impact Index Per Article: 6.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 01/23/2022] [Accepted: 01/23/2022] [Indexed: 12/24/2022]
Abstract
Artificial Intelligence (AI) is a set of technologies that simulate human cognition in order to address specific problems. Improvements in computing speed, together with the exponential production and routine collection of data, have led to the rapid development of AI in the health sector. In this review, we aim to provide surgeons with the essential technical elements to help them understand the possibilities offered by AI, and to survey the current applications of AI in oral and maxillofacial surgery (OMFS). The literature reveals a genuine boom in AI research across all fields of OMFS. The algorithms used are based on machine learning, with a strong representation of the convolutional neural networks specific to deep learning. The complex architecture of these networks gives them the capacity to extract and process the elementary features of an image, making them particularly suited to diagnostic tasks on medical images or facial photographs. We identified representative articles in which AI algorithms provide assistance in diagnosis, therapeutic decision-making, preoperative planning, or the prediction and evaluation of outcomes. Thanks to their learning, classification, prediction, and detection capabilities, AI algorithms complement human skills while limiting human imperfections. However, these algorithms should be subject to rigorous clinical evaluation, and ethical reflection on data protection should be conducted systematically.
Affiliation(s)
- Simon Rasteau
- Maxillo-Facial Surgery, Facial Plastic Surgery, Stomatology and Oral Surgery, Hospices Civils de Lyon, Lyon-Sud Hospital - Claude-Bernard Lyon 1 University, 165 Chemin du Grand-Revoyet, Pierre-Bénite 69310, France
- Didier Ernenwein
- Department of Pediatric Oral & Maxillofacial & Plastic Surgery, Children's Hospital Robert-Debré, Paris-Diderot University, Paris, France
- Charles Savoldelli
- University Institute of the Face and Neck, Côte d'Azur University, Nice University Hospital, 31 Avenue de Valombrose, Nice 06100, France
- Pierre Bouletreau
- Maxillo-Facial Surgery, Facial Plastic Surgery, Stomatology and Oral Surgery, Hospices Civils de Lyon, Lyon-Sud Hospital - Claude-Bernard Lyon 1 University, 165 Chemin du Grand-Revoyet, Pierre-Bénite 69310, France
46
Potential and impact of artificial intelligence algorithms in dento-maxillofacial radiology. Clin Oral Investig 2022; 26:5535-5555. [PMID: 35438326 DOI: 10.1007/s00784-022-04477-y] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/20/2021] [Accepted: 03/25/2022] [Indexed: 12/20/2022]
Abstract
OBJECTIVES Novel artificial intelligence (AI) learning algorithms in dento-maxillofacial radiology (DMFR) are continuously being developed and improved using advanced convolutional neural networks. This review provides an overview of the potential and impact of AI algorithms in DMFR. MATERIALS AND METHODS A narrative review was conducted on the literature on AI algorithms in DMFR. RESULTS In the field of DMFR, AI algorithms were mainly proposed for (1) automated detection of dental caries, periapical pathologies, root fracture, periodontal/peri-implant bone loss, and maxillofacial cysts/tumors; (2) classification of mandibular third molars, skeletal malocclusion, and dental implant systems; (3) localization of cephalometric landmarks; and (4) improvement of image quality. Data insufficiency, overfitting, and the lack of interpretability are the main issues in the development and use of image-based AI algorithms. Several strategies have been suggested to address these issues, such as data augmentation, transfer learning, semi-supervised training, few-shot learning, and gradient-weighted class activation mapping. CONCLUSIONS Further integration of relevant AI algorithms into one fully automatic end-to-end intelligent system for possible multi-disciplinary applications is very likely to be a field of increased interest in the future. CLINICAL RELEVANCE This review provides dental practitioners and researchers with a comprehensive understanding of the current development, performance, issues, and prospects of image-based AI algorithms in DMFR.
47
Gokdeniz ST, Kamburoğlu K. Artificial intelligence in dentomaxillofacial radiology. World J Radiol 2022; 14:55-59. [PMID: 35432776 PMCID: PMC8966498 DOI: 10.4329/wjr.v14.i3.55] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 03/20/2021] [Revised: 09/05/2021] [Accepted: 02/23/2022] [Indexed: 02/06/2023] Open
Abstract
Artificial intelligence (AI) has the potential to revolutionize healthcare and dentistry. Recently, there has been much interest in the development of AI applications. Dentomaxillofacial radiology (DMFR) is within the scope of these applications because of its compatibility with image-processing methods. Studies using radiological images include the classification and segmentation of teeth; automatic marking of anatomical structures and cephalometric analysis; detection of early dental, gingival, and periodontal diseases and evaluation of risk groups; and diagnosis of certain conditions, such as osteoporosis, that can be detected on jaw radiographs. Further research in the field of AI will make great contributions to DMFR. We aim to discuss the most recent AI-based studies in the field of DMFR.
Affiliation(s)
- Seyide Tugce Gokdeniz
- Department of Dentomaxillofacial Radiology, Ankara University Faculty of Dentistry, Ankara 06500, Turkey
- Kıvanç Kamburoğlu
- Department of Dentomaxillofacial Radiology, Ankara University Faculty of Dentistry, Ankara 06500, Turkey
48
Chai ZK, Mao L, Chen H, Sun TG, Shen XM, Liu J, Sun ZJ. Improved Diagnostic Accuracy of Ameloblastoma and Odontogenic Keratocyst on Cone-Beam CT by Artificial Intelligence. Front Oncol 2022; 11:793417. [PMID: 35155194 PMCID: PMC8828501 DOI: 10.3389/fonc.2021.793417] [Citation(s) in RCA: 20] [Impact Index Per Article: 10.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/12/2021] [Accepted: 12/30/2021] [Indexed: 11/13/2022] Open
Abstract
OBJECTIVE The purpose of this study was to utilize a convolutional neural network (CNN) to make preoperative differential diagnoses between ameloblastoma (AME) and odontogenic keratocyst (OKC) on cone-beam CT (CBCT). METHODS The CBCT images of 178 AMEs and 172 OKCs were retrospectively retrieved from the Hospital of Stomatology, Wuhan University. The dataset was randomly split into a training set of 272 cases and a testing set of 78 cases. Slices containing lesions were retained and then cropped to suitable patches for training. The Inception v3 deep learning algorithm was utilized, and its diagnostic performance was compared with that of oral and maxillofacial surgeons. RESULTS The sensitivity, specificity, accuracy, and F1 score of the model were 87.2%, 82.1%, 84.6%, and 85.0%, respectively. The corresponding average scores for 7 senior oral and maxillofacial surgeons were 60.0%, 71.4%, 65.7%, and 63.6%, and those for 30 junior oral and maxillofacial surgeons were 63.9%, 53.2%, 58.5%, and 60.7%. CONCLUSION The deep learning model differentiated these two lesions with better diagnostic accuracy than the clinical surgeons. The results indicate that the CNN may provide assistance for clinical diagnosis, especially for inexperienced surgeons.
Affiliation(s)
- Zi-Kang Chai
- The State Key Laboratory Breeding Base of Basic Science of Stomatology (Hubei-MOST) & Key Laboratory of Oral Biomedicine, Ministry of Education, School and Hospital of Stomatology, Wuhan University, Wuhan, China
- Liang Mao
- The State Key Laboratory Breeding Base of Basic Science of Stomatology (Hubei-MOST) & Key Laboratory of Oral Biomedicine, Ministry of Education, School and Hospital of Stomatology, Wuhan University, Wuhan, China
- Department of Oral Maxillofacial-Head Neck Oncology, School and Hospital of Stomatology, Wuhan University, Wuhan, China
- Hua Chen
- Institute of Artificial Intelligence, School of Computer Science, Wuhan University, Wuhan, China
- Ting-Guan Sun
- The State Key Laboratory Breeding Base of Basic Science of Stomatology (Hubei-MOST) & Key Laboratory of Oral Biomedicine, Ministry of Education, School and Hospital of Stomatology, Wuhan University, Wuhan, China
- Xue-Meng Shen
- The State Key Laboratory Breeding Base of Basic Science of Stomatology (Hubei-MOST) & Key Laboratory of Oral Biomedicine, Ministry of Education, School and Hospital of Stomatology, Wuhan University, Wuhan, China
- Juan Liu
- Institute of Artificial Intelligence, School of Computer Science, Wuhan University, Wuhan, China
- Zhi-Jun Sun
- The State Key Laboratory Breeding Base of Basic Science of Stomatology (Hubei-MOST) & Key Laboratory of Oral Biomedicine, Ministry of Education, School and Hospital of Stomatology, Wuhan University, Wuhan, China
- Department of Oral Maxillofacial-Head Neck Oncology, School and Hospital of Stomatology, Wuhan University, Wuhan, China
49
Yeung AWK. Radiolucent Lesions of the Jaws: An Attempted Demonstration of the Use of Co-Word Analysis to List Main Similar Pathologies. Int J Environ Res Public Health 2022; 19:1933. [PMID: 35206118 PMCID: PMC8872104 DOI: 10.3390/ijerph19041933] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 01/24/2022] [Revised: 02/06/2022] [Accepted: 02/07/2022] [Indexed: 02/01/2023]
Abstract
(1) Background: Many radiolucent jaw lesions exist, and they often resemble one another radiographically, making diagnosis challenging. Closely related lesions should be frequently mentioned together in the academic literature, which might help junior practitioners determine their differential diagnoses. The usefulness of bibliometric analysis in this respect has yet to be demonstrated. (2) Methods: This study evaluated academic publications on radiolucent jaw lesions, as indexed by the Web of Science Core Collection database. Mentions of radiolucent jaw lesions were extracted from the complete bibliographic records of the publications, and co-word analyses were conducted with the aid of VOSviewer. (3) Results: Based on 1897 papers, visualization maps were synthesized to evaluate co-occurrences of the radiolucent jaw lesions. Ameloblastoma was frequently mentioned together with odontogenic keratocyst, dentigerous cyst, and radicular cyst. Osseous dysplasia was co-mentioned with osteomyelitis, ossifying fibroma, odontoma, fibrous dysplasia, and apical periodontitis. (4) Conclusions: Co-word analysis, a form of bibliometric analysis, demonstrated a relatedness among radiolucent jaw lesions that could be considered during differential diagnosis.
Affiliation(s)
- Andy Wai Kan Yeung
- Oral and Maxillofacial Radiology, Applied Oral Sciences and Community Dental Care, Faculty of Dentistry, University of Hong Kong, Hong Kong, China
50
Deep learning based diagnosis for cysts and tumors of jaw with massive healthy samples. Sci Rep 2022; 12:1855. [PMID: 35115624 PMCID: PMC8814152 DOI: 10.1038/s41598-022-05913-5] [Citation(s) in RCA: 4] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/15/2021] [Accepted: 01/14/2022] [Indexed: 11/09/2022] Open
Abstract
We aimed to develop an explainable and reliable deep learning method to diagnose cysts and tumors of the jaw using a massive set of panoramic radiographs of healthy people, since collecting and labeling large numbers of lesion samples is time-consuming, and existing deep learning-based methods lack explainability. Based on 872 collected lesion samples and 10,000 healthy samples, a two-branch network was proposed for classifying cysts and tumors of the jaw. The two-branch network is first pretrained on the massive set of panoramic radiographs of healthy people, and then trained to classify the sample categories and segment the lesion areas. In total, 200 healthy samples and 87 lesion samples were included in the testing stage. The average accuracy, precision, sensitivity, specificity, and F1 score of classification were 88.72%, 65.81%, 66.56%, 92.66%, and 66.14%, respectively, rising to 90.66%, 85.23%, 84.27%, 93.50%, and 84.74% when only distinguishing lesion samples from healthy samples. The proposed method showed encouraging performance in the diagnosis of cysts and tumors of the jaw. The classified categories and segmented lesion areas serve as the basis for further diagnosis, providing a reliable tool for diagnosing jaw tumors and cysts.