1
García-Mejido JA, Solis-Martín D, Martín-Morán M, Fernández-Conde C, Fernández-Palacín F, Sainz-Bueno JA. Applicability of Deep Learning to Dynamically Identify the Different Organs of the Pelvic Floor in the Midsagittal Plane. Int Urogynecol J 2024. [PMID: 38913129] [DOI: 10.1007/s00192-024-05841-0]
Abstract
INTRODUCTION AND HYPOTHESIS The objective was to create and validate the usefulness of a convolutional neural network (CNN) for identifying different organs of the pelvic floor in the midsagittal plane via dynamic ultrasound. METHODS This observational and prospective study included 110 patients. Transperineal ultrasound scans were performed by an expert pelvic floor sonographer. For each patient, a video was made that captured the midsagittal plane of the pelvic floor at rest and the change in the pelvic structures during the Valsalva maneuver. After saving the captured videos, we manually labeled the different organs in each video. Three architectures (UNet, FPN, and LinkNet) were tested to determine which CNN model best recognized the anatomical structures. The best model was trained on the 86 training cases for the number of epochs determined by the stopping criterion via cross-validation. The Dice Similarity Index (DSI) was used for CNN validation. RESULTS Eighty-six patients were used to train the CNN and 24 to test it. After applying the trained CNN to the 24 test videos, we did not observe any failed segmentation, and we obtained a median DSI of 0.79 (95% CI: 0.73-0.82) across the 24 test videos. When we studied the organs independently, we observed differences in the DSI of each organ. The poorest DSIs were obtained for the bladder (0.71 [95% CI: 0.70-0.73]) and uterus (0.70 [95% CI: 0.68-0.74]), whereas the highest DSIs were obtained for the anus (0.81 [95% CI: 0.80-0.86]) and levator ani muscle (0.83 [95% CI: 0.82-0.83]). CONCLUSIONS Our results show that it is possible to apply deep learning, using a trained CNN, to identify different pelvic floor organs in the midsagittal plane via dynamic ultrasound.
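The Dice Similarity Index used for validation above is a standard overlap measure; a minimal sketch of its usual computation on binary masks (the abstract does not give the authors' exact implementation, so this is the conventional definition):

```python
import numpy as np

def dice_index(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice similarity index between two binary masks: 2|A∩B| / (|A| + |B|)."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    denom = pred.sum() + truth.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return float(2.0 * np.logical_and(pred, truth).sum() / denom)
```

A per-organ DSI such as the reported 0.83 for the levator ani muscle would be this value computed on that organ's mask and summarized over the test videos.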
Affiliation(s)
- José Antonio García-Mejido
- Department of Obstetrics and Gynecology, Valme University Hospital, Seville, Spain.
- Department of Surgery, Faculty of Medicine, University of Seville, Seville, Spain.
- David Solis-Martín
- Department of Computer Science and Artificial Intelligence, Faculty of Mathematics, University of Seville, Seville, Spain
- Marina Martín-Morán
- Department of Obstetrics and Gynecology, Valme University Hospital, Seville, Spain
- José Antonio Sainz-Bueno
- Department of Obstetrics and Gynecology, Valme University Hospital, Seville, Spain
- Department of Surgery, Faculty of Medicine, University of Seville, Seville, Spain
2
Resta S, De Vito M, Patelli C, Lu JLA, Gabrielli G, Chiodo E, Mappa I, Rizzo G. Validation of an automated software (Smartpelvic™) in assessing hiatal area from three dimensional transperineal pelvic volumes of pregnant women: comparison with manual analysis. J Perinat Med 2024; 52:165-170. [PMID: 37938105] [DOI: 10.1515/jpm-2023-0323]
Abstract
OBJECTIVES The aim of this investigation was to evaluate the agreement between a manual and an automatic technique in assessing the levator hiatus area (LHA) during pregnancy from three-dimensional (3D) pelvic floor volumes obtained by transperineal ultrasound (TPUS). METHODS 3D volumes were acquired at rest, at maximum pelvic floor contraction, and during the Valsalva maneuver from 66 pregnant women. Manual selection of the LHA and automatic software (Smart Pelvic™) were applied to each TPUS volume starting from a C-plane view. To evaluate intra- and inter-observer variability, LHA measurements were performed twice by the same operator and once by a second sonographer. Reference hiatal contours obtained manually by the first operator were compared with the automated ones. Reproducibility was evaluated by intraclass correlation coefficients (ICC) and Bland-Altman plots. RESULTS LHA measurement using the automatic software achieved excellent intra-observer and inter-observer reproducibility in pregnant women, both at rest and in the dynamic analyses (ICC>0.9). Furthermore, there was excellent agreement between manual selection of the LHA and automatic imaging (ICC>0.9). The average time taken to obtain the LHA manually was significantly longer than with the automatic analysis (p≤0.0001). CONCLUSIONS The Smart Pelvic software proved to be a reliable method for automatically measuring the LHA, showing high reproducibility and accuracy.
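Bland-Altman analysis, one of the two agreement statistics used above, reduces paired measurements to a bias and limits of agreement; a minimal sketch (the 1.96·SD limits are the conventional choice, not a detail given in the abstract):

```python
import numpy as np

def bland_altman(a, b):
    """Bias and 95% limits of agreement between two paired measurement series."""
    diff = np.asarray(a, float) - np.asarray(b, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)  # sample standard deviation of the paired differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd
```

Here `a` and `b` would be, for example, the manual and automatic LHA measurements for the same volumes; a bias near zero with narrow limits indicates the agreement reported in the abstract.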
Affiliation(s)
- Serena Resta
- Department of Obstetrics and Gynecology, Università di Roma Tor Vergata, Rome, Italy
- Marika De Vito
- Department of Obstetrics and Gynecology, Università di Roma Tor Vergata, Rome, Italy
- Chiara Patelli
- Department of Obstetrics and Gynecology, Università di Verona, Verona, Italy
- Jia Li Angela Lu
- Department of Obstetrics and Gynecology, Università di Roma Tor Vergata, Rome, Italy
- Gianluca Gabrielli
- Department of Obstetrics and Gynecology, Università di Roma Tor Vergata, Rome, Italy
- Erika Chiodo
- Department of Obstetrics and Gynecology, Università di Roma Tor Vergata, Rome, Italy
- Ilenia Mappa
- Department of Obstetrics and Gynecology, Università di Roma Tor Vergata, Rome, Italy
- Giuseppe Rizzo
- Department of Obstetrics and Gynecology, Università di Roma Tor Vergata, Rome, Italy
3
Deslandes A, Avery J, Chen H, Leonardi M, Condous G, Hull ML. Artificial intelligence as a teaching tool for gynaecological ultrasound: A systematic search and scoping review. Australas J Ultrasound Med 2024; 27:5-11. [PMID: 38434541] [PMCID: PMC10902831] [DOI: 10.1002/ajum.12368]
Abstract
Purpose The aim of this study was to investigate the current application of artificial intelligence (AI) tools in the teaching of ultrasound skills as they pertain to gynaecological ultrasound. Methods A scoping review was performed. Eight databases (MEDLINE, EMBASE, EMCARE, CINAHL, Scopus, Web of Science, IEEE Xplore and ACM Digital Library) were searched in December 2022 using predefined keywords. All types of publications were eligible for inclusion so long as they reported the use of an AI tool, included reference to or discussion of teaching or the improvement of ultrasound skills, and pertained to gynaecological ultrasound. Conference abstracts and non-English-language papers that could not be adequately translated into English were excluded. Results The initial database search returned 481 articles. After screening against our inclusion and exclusion criteria, two were deemed to meet the inclusion criteria. Neither of the included articles reported original research (one was a systematic review and one a review article). Neither explicitly provided details of specific tools developed for teaching ultrasound skills for gynaecological imaging, but both highlighted similar applications within the field of obstetrics that could potentially be expanded. Conclusion Artificial intelligence can potentially assist in the training of sonographers and other ultrasound operators, including in the field of gynaecological ultrasound. However, this scoping review revealed that, to date, no original research has been published reporting the use or development of such a tool specifically for gynaecological ultrasound.
Affiliation(s)
- Alison Deslandes
- Robinson Research Institute, University of Adelaide, Adelaide, South Australia, Australia
- Jodie Avery
- Robinson Research Institute, University of Adelaide, Adelaide, South Australia, Australia
- Hsiang-Ting Chen
- School of Computer and Mathematical Sciences, University of Adelaide, Adelaide, South Australia, Australia
- Mathew Leonardi
- Robinson Research Institute, University of Adelaide, Adelaide, South Australia, Australia
- Department of Obstetrics and Gynecology, McMaster University, Hamilton, Ontario, Canada
- George Condous
- Robinson Research Institute, University of Adelaide, Adelaide, South Australia, Australia
- M. Louise Hull
- Robinson Research Institute, University of Adelaide, Adelaide, South Australia, Australia
4
Płotka SS, Grzeszczyk MK, Szenejko PI, Żebrowska K, Szymecka-Samaha NA, Łęgowik T, Lipa MA, Kosińska-Kaczyńska K, Brawura-Biskupski-Samaha R, Išgum I, Sánchez CI, Sitek A. Deep learning for estimation of fetal weight throughout the pregnancy from fetal abdominal ultrasound. Am J Obstet Gynecol MFM 2023; 5:101182. [PMID: 37821009] [DOI: 10.1016/j.ajogmf.2023.101182]
Abstract
BACKGROUND Fetal weight is currently estimated from fetal biometry parameters using heuristic mathematical formulas. Fetal biometry requires measurements of the fetal head, abdomen, and femur. However, this examination is prone to inter- and intraobserver variability because of factors such as the experience of the operator, image quality, maternal characteristics, or fetal movements. Our study tested the hypothesis that a deep learning method can estimate fetal weight based on a video scan of the fetal abdomen and gestational age with performance similar to the full biometry-based estimations provided by clinical experts. OBJECTIVE This study aimed to develop and test a deep learning method to automatically estimate fetal weight from fetal abdominal ultrasound video scans. STUDY DESIGN A dataset of 900 routine fetal ultrasound examinations was used. Among those examinations, 800 retrospective ultrasound video scans of the fetal abdomen from 700 pregnant women between 15 6/7 and 41 0/7 weeks of gestation were used to train the deep learning model. After the training phase, the model was evaluated on an external, prospectively acquired test set of 100 scans from 100 pregnant women between 16 2/7 and 38 0/7 weeks of gestation. The deep learning model was trained to directly estimate fetal weight from ultrasound video scans of the fetal abdomen. The deep learning estimations were compared with manual measurements on the test set made by 6 human readers with varying levels of expertise. Human readers used the standard 3 measurements made on the standard planes of the head, abdomen, and femur and a heuristic formula to estimate fetal weight. Bland-Altman analysis, mean absolute percentage error, and the intraclass correlation coefficient were used to evaluate the performance and robustness of the deep learning method in comparison with the human readers. RESULTS Bland-Altman analysis did not show systematic deviations between readers and deep learning. The mean and standard deviation of the mean absolute percentage error between the 6 human readers and the deep learning approach was 3.75%±2.00%. Excluding the junior readers (residents), the mean absolute percentage error between the 4 experts and the deep learning approach was 2.59%±1.11%. The intraclass correlation coefficients reflected excellent reliability and varied between 0.9761 and 0.9865. CONCLUSION This study reports the use of deep learning to estimate fetal weight using only ultrasound video of the fetal abdomen from fetal biometry scans. Our experiments demonstrated similar performance of human measurements and deep learning on prospectively acquired test data. Deep learning is a promising approach to directly estimating fetal weight from ultrasound video scans of the fetal abdomen.
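The mean absolute percentage error used above to compare readers with the model has the standard definition below; a minimal sketch (the variable names and example weights are illustrative, not taken from the paper):

```python
import numpy as np

def mape(estimate, reference) -> float:
    """Mean absolute percentage error of weight estimates against a reference, in %."""
    estimate = np.asarray(estimate, float)
    reference = np.asarray(reference, float)
    return float(np.mean(np.abs(estimate - reference) / reference) * 100.0)
```

Applied per case to each reader's (or the model's) estimated fetal weight against the reference weight, then averaged, this yields figures comparable to the reported 3.75%±2.00%.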
Affiliation(s)
- Szymon S Płotka
- Sano Centre for Computational Medicine, Cracow, Poland (Messrs Płotka and Grzeszczyk); Informatics Institute, University of Amsterdam, Amsterdam, The Netherlands (Mr Płotka and Drs Išgum and Sánchez); Department of Biomedical Engineering and Physics, Amsterdam University Medical Center, University of Amsterdam, The Netherlands (Mr Płotka and Drs Išgum and Sánchez)
- Michal K Grzeszczyk
- Sano Centre for Computational Medicine, Cracow, Poland (Messrs Płotka and Grzeszczyk)
- Paula I Szenejko
- First Department of Obstetrics and Gynecology, Medical University of Warsaw, Warsaw, Poland (Drs Szenejko and Lipa); Doctoral School of Translational Medicine, Centre of Postgraduate Medical Education, Warsaw, Poland (Dr Szenejko)
- Kinga Żebrowska
- Department of Obstetrics, Perinatology, and Neonatology, Centre of Postgraduate Medical Education, Warsaw, Poland (Drs Żebrowska, Szymecka-Samaha, Kosińska-Kaczyńska, and Brawura-Biskupski-Samaha)
- Natalia A Szymecka-Samaha
- Department of Obstetrics, Perinatology, and Neonatology, Centre of Postgraduate Medical Education, Warsaw, Poland (Drs Żebrowska, Szymecka-Samaha, Kosińska-Kaczyńska, and Brawura-Biskupski-Samaha)
- Michał A Lipa
- First Department of Obstetrics and Gynecology, Medical University of Warsaw, Warsaw, Poland (Drs Szenejko and Lipa)
- Katarzyna Kosińska-Kaczyńska
- Department of Obstetrics, Perinatology, and Neonatology, Centre of Postgraduate Medical Education, Warsaw, Poland (Drs Żebrowska, Szymecka-Samaha, Kosińska-Kaczyńska, and Brawura-Biskupski-Samaha)
- Robert Brawura-Biskupski-Samaha
- Department of Obstetrics, Perinatology, and Neonatology, Centre of Postgraduate Medical Education, Warsaw, Poland (Drs Żebrowska, Szymecka-Samaha, Kosińska-Kaczyńska, and Brawura-Biskupski-Samaha)
- Ivana Išgum
- Informatics Institute, University of Amsterdam, Amsterdam, The Netherlands (Mr Płotka and Drs Išgum and Sánchez); Department of Biomedical Engineering and Physics, Amsterdam University Medical Center, University of Amsterdam, The Netherlands (Mr Płotka and Drs Išgum and Sánchez); Department of Radiology and Nuclear Medicine, Amsterdam University Medical Center, University of Amsterdam, The Netherlands (Dr Išgum)
- Clara I Sánchez
- Informatics Institute, University of Amsterdam, Amsterdam, The Netherlands (Mr Płotka and Drs Išgum and Sánchez); Department of Biomedical Engineering and Physics, Amsterdam University Medical Center, University of Amsterdam, The Netherlands (Mr Płotka and Drs Išgum and Sánchez)
- Arkadiusz Sitek
- Center for Advanced Medical Computing and Simulation, Massachusetts General Hospital, Harvard Medical School, Boston, MA (Dr Sitek)
5
Jost E, Kosian P, Jimenez Cruz J, Albarqouni S, Gembruch U, Strizek B, Recker F. Evolving the Era of 5D Ultrasound? A Systematic Literature Review on the Applications for Artificial Intelligence Ultrasound Imaging in Obstetrics and Gynecology. J Clin Med 2023; 12:6833. [PMID: 37959298] [PMCID: PMC10649694] [DOI: 10.3390/jcm12216833]
Abstract
Artificial intelligence (AI) has gained prominence in medical imaging, particularly in obstetrics and gynecology (OB/GYN), where ultrasound (US) is the preferred imaging method. US is considered cost-effective and easily accessible but is time-consuming and hindered by the need for specialized training. To overcome these limitations, AI models have been proposed for automated plane acquisition, anatomical measurements, and pathology detection. This study aims to provide an overview of recent literature on AI applications in OB/GYN US imaging, highlighting their benefits and limitations. For the methodology, a systematic literature search was performed in the PubMed and Cochrane Library databases. Matching abstracts were screened based on the PICOS (Participants, Intervention or Exposure, Comparison, Outcome, Study type) scheme. Articles with full-text copies were assigned to the OB/GYN sections and their research topics. As a result, this review includes 189 articles published from 1994 to 2023. Among these, 148 focus on obstetrics and 41 on gynecology. AI-assisted US applications span fetal biometry, echocardiography, and neurosonography, as well as the identification of adnexal and breast masses and assessment of the endometrium and pelvic floor. To conclude, the applications for AI-assisted US in OB/GYN are abundant, especially in the subspecialty of obstetrics. However, while most studies focus on common application fields such as fetal biometry, this review outlines emerging and still experimental fields to promote further research.
Affiliation(s)
- Elena Jost
- Department of Obstetrics and Gynecology, University Hospital Bonn, Venusberg Campus 1, 53127 Bonn, Germany
- Philipp Kosian
- Department of Obstetrics and Gynecology, University Hospital Bonn, Venusberg Campus 1, 53127 Bonn, Germany
- Jorge Jimenez Cruz
- Department of Obstetrics and Gynecology, University Hospital Bonn, Venusberg Campus 1, 53127 Bonn, Germany
- Shadi Albarqouni
- Department of Diagnostic and Interventional Radiology, University Hospital Bonn, Venusberg Campus 1, 53127 Bonn, Germany
- Helmholtz AI, Helmholtz Munich, Ingolstädter Landstraße 1, 85764 Neuherberg, Germany
- Ulrich Gembruch
- Department of Obstetrics and Gynecology, University Hospital Bonn, Venusberg Campus 1, 53127 Bonn, Germany
- Brigitte Strizek
- Department of Obstetrics and Gynecology, University Hospital Bonn, Venusberg Campus 1, 53127 Bonn, Germany
- Florian Recker
- Department of Obstetrics and Gynecology, University Hospital Bonn, Venusberg Campus 1, 53127 Bonn, Germany
6
Szentimrey Z, Ameri G, Hong CX, Cheung RYK, Ukwatta E, Eltahawi A. Automated segmentation and measurement of the female pelvic floor from the mid-sagittal plane of 3D ultrasound volumes. Med Phys 2023; 50:6215-6227. [DOI: 10.1002/mp.16389]
Abstract
BACKGROUND Transperineal ultrasound (TPUS) is a valuable imaging tool for evaluating patients with pelvic floor disorders, including pelvic organ prolapse (POP). Currently, measurements of anatomical structures in the mid-sagittal plane of 2D and 3D US volumes are obtained manually, which is time-consuming, has high intra-rater variability, and requires an expert in pelvic floor US interpretation. Manual segmentation and biometric measurement can take 15 min per 2D mid-sagittal image for an expert operator. An automated segmentation method would provide quantitative data relevant to pelvic floor disorders and improve the efficiency and reproducibility of segmentation-based biometric methods. PURPOSE To develop a fast, reproducible, and automated method of acquiring biometric measurements and organ segmentations from the mid-sagittal plane of female 3D TPUS volumes. METHODS Our method used a nnU-Net segmentation model to segment the pubic symphysis, urethra, bladder, rectum, rectal ampulla, and anorectal angle in the mid-sagittal plane of female 3D TPUS volumes. We developed an algorithm to extract relevant biometrics from the segmentations. Our dataset included 248 3D TPUS volumes (126/122 rest/Valsalva split) from 135 patients. System performance was assessed by comparing the automated results with manual ground-truth data using the Dice similarity coefficient (DSC) and average absolute difference (AD). The intra-class correlation coefficient (ICC) and time difference were used to compare the reproducibility and efficiency of the manual and automated methods, respectively. A high ICC, low AD, and a reduction in time would indicate an accurate and reliable automated system, making TPUS an efficient alternative for POP assessment. A paired t-test and the non-parametric Wilcoxon signed-rank test were conducted, with p < 0.05 determining significance. RESULTS The nnU-Net segmentation model achieved average DSCs, with p values (in brackets) compared to the next best tested model, of 87.4% (<0.0001), 68.5% (<0.0001), 61.0% (0.1), 54.6% (0.04), 49.2% (<0.0001), and 33.7% (0.02) for the bladder, rectum, urethra, pubic symphysis, anorectal angle, and rectal ampulla, respectively. The average ADs for the bladder neck position, bladder descent, rectal ampulla descent, and retrovesical angle were 3.2 mm, 4.5 mm, 5.3 mm, and 27.3°, respectively. The biometric algorithm had an ICC > 0.80 for the bladder neck position, bladder descent, and rectal ampulla descent when compared to manual measurements, indicating high reproducibility. The proposed algorithms required approximately 1.27 s to analyze one image. The manual ground truths were produced by a single expert operator; in addition, given the high operator dependency of TPUS image collection, further studies with images collected by multiple operators are needed. CONCLUSIONS Based on our search of scientific databases (Web of Science, IEEE Xplore Digital Library, Elsevier ScienceDirect, and PubMed), this is the first reported automated segmentation and biometric measurement system for the mid-sagittal plane of 3D TPUS volumes. The proposed algorithm pipeline improves efficiency (1.27 s compared with 15 min manually) and has high reproducibility (high ICC values) compared with manual TPUS analysis for pelvic floor disorder diagnosis. Further studies are needed to verify the system's viability with multiple TPUS operators and multiple experts performing the manual segmentation and biometric extraction.
Affiliation(s)
- Christopher X Hong
- Department of Obstetrics & Gynaecology, University of Michigan, Ann Arbor, Michigan, USA
- Rachel Y K Cheung
- Department of Obstetrics & Gynaecology, Faculty of Medicine, The Chinese University of Hong Kong, Hong Kong
- Eranga Ukwatta
- School of Engineering, University of Guelph, Guelph, Ontario, Canada
- Ahmed Eltahawi
- Cosm Medical, Toronto, Ontario, Canada
- Information System Department, Faculty of Computers and Informatics, Suez Canal University, Ismailia, Egypt
7
Chen Y, Lin X, Zhang M, Qu E, Huang D, Mao Y, Huang Z, Zhang X. Validation of an automatic method for reconstruction, delineation, and measurement of levator hiatus in clinical practice. Neurourol Urodyn 2023; 42:1547-1554. [PMID: 37358312] [DOI: 10.1002/nau.25231]
Abstract
OBJECTIVES To evaluate the concordance between an automatic software program and manual evaluation in reconstructing, delineating, and measuring the levator hiatus (LH) on maximal Valsalva maneuver. METHODS This was a retrospective study analyzing archived raw ultrasound imaging data of 100 patients who underwent transperineal ultrasound (TPUS) examination. Each dataset was assessed by the automatic Smart Pelvic System software program and by manual evaluation. The Dice similarity index (DSI), mean absolute distance (MAD), and Hausdorff distance (HDD) were calculated to quantify the delineation accuracy of the LH. Agreement between automatic and manual measurement of the levator hiatus area was assessed by the intraclass correlation coefficient (ICC) and the Bland-Altman method. RESULTS The satisfaction rate of automatic reconstruction was 94%. Six images were judged unsatisfactorily reconstructed because of gas in the rectum and anal canal. Compared with satisfactorily reconstructed images, the DSI of unsatisfactorily reconstructed images was lower, and the MAD and HDD were larger (p = 0.001, p = 0.001, and p = 0.006, respectively). The ICC was up to 0.987 in the 94 satisfactorily reconstructed images. CONCLUSIONS The Smart Pelvic System software program performed well in the reconstruction, delineation, and measurement of the LH on maximal Valsalva maneuver in clinical practice, despite occasional misidentification of the posterior border of the LH due to gas in the rectum.
Affiliation(s)
- Ying Chen
- Department of Ultrasound, The Third Affiliated Hospital of Sun Yat-Sen University, Guangzhou, Guangdong Province, China
- Xin Lin
- Department of Ultrasound, The Third Affiliated Hospital of Sun Yat-Sen University, Guangzhou, Guangdong Province, China
- Man Zhang
- Department of Ultrasound, The Third Affiliated Hospital of Sun Yat-Sen University, Guangzhou, Guangdong Province, China
- Enze Qu
- Department of Ultrasound, The Third Affiliated Hospital of Sun Yat-Sen University, Guangzhou, Guangdong Province, China
- Dongmei Huang
- Department of Ultrasound, The Third Affiliated Hospital of Sun Yat-Sen University, Guangzhou, Guangdong Province, China
- Yongjiang Mao
- Department of Ultrasound, The Third Affiliated Hospital of Sun Yat-Sen University, Guangzhou, Guangdong Province, China
- Zeping Huang
- Department of Ultrasound, The Third Affiliated Hospital of Sun Yat-Sen University, Guangzhou, Guangdong Province, China
- Xinling Zhang
- Department of Ultrasound, The Third Affiliated Hospital of Sun Yat-Sen University, Guangzhou, Guangdong Province, China
8
Rabbat N, Qureshi A, Hsu KT, Asif Z, Chitnis P, Shobeiri SA, Wei Q. Automated Segmentation of Levator Ani Muscle from 3D Endovaginal Ultrasound Images. Bioengineering (Basel) 2023; 10:894. [PMID: 37627779] [PMCID: PMC10451809] [DOI: 10.3390/bioengineering10080894]
Abstract
Levator ani muscle (LAM) avulsion is a common complication of vaginal childbirth and is linked to several pelvic floor disorders. Diagnosing and treating these conditions require imaging of the pelvic floor and examination of the obtained images, a time-consuming process subject to operator variability. In our study, we proposed using deep learning (DL) to automate the segmentation of the LAM from 3D endovaginal ultrasound (EVUS) images to improve diagnostic accuracy and efficiency. Over one thousand images extracted from the 3D EVUS data of healthy subjects and patients with pelvic floor disorders were utilized for the automated LAM segmentation. A U-Net model was implemented, with the Intersection over Union (IoU) and Dice metrics used for model performance evaluation. The model achieved a mean Dice score of 0.86, demonstrating better performance than existing works. The mean IoU was 0.76, indicative of a high degree of overlap between the automated and manual segmentations of the LAM. Three other models, Attention U-Net, FD-UNet, and Dense-UNet, were also applied to the same images and showed comparable results. Our study demonstrated the feasibility and accuracy of DL segmentation with the U-Net architecture to automate LAM segmentation, reducing the time and resources required for manual segmentation of 3D EVUS images. The proposed method could become an important component of AI-based diagnostic tools, particularly in low-socioeconomic regions where access to healthcare resources is limited. By improving the management of pelvic floor disorders, our approach may contribute to better patient outcomes in these underserved areas.
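The two metrics reported above are tied by a fixed identity, Dice = 2J / (1 + J) for IoU (Jaccard) value J; a minimal sketch of both on binary masks (a conventional formulation, not the authors' code):

```python
import numpy as np

def iou(pred: np.ndarray, truth: np.ndarray) -> float:
    """Intersection over Union (Jaccard index) between two binary masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    union = np.logical_or(pred, truth).sum()
    if union == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return float(np.logical_and(pred, truth).sum() / union)

def dice_from_iou(j: float) -> float:
    """Dice and IoU measure the same overlap: Dice = 2J / (1 + J)."""
    return 2.0 * j / (1.0 + j)
```

For a single mask pair, the reported mean IoU of 0.76 maps to a Dice of about 0.86, consistent with the pair of scores in the abstract (the identity holds exactly per image, only approximately for means).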
Affiliation(s)
- Nada Rabbat
- Department of Bioengineering, George Mason University, Fairfax, VA 22030, USA
- Amad Qureshi
- Department of Bioengineering, George Mason University, Fairfax, VA 22030, USA
- Ko-Tsung Hsu
- Department of Bioengineering, George Mason University, Fairfax, VA 22030, USA
- Zara Asif
- Department of Bioengineering, George Mason University, Fairfax, VA 22030, USA
- Parag Chitnis
- Department of Bioengineering, George Mason University, Fairfax, VA 22030, USA
- Seyed Abbas Shobeiri
- Department of Bioengineering, George Mason University, Fairfax, VA 22030, USA
- Inova Fairfax Hospital, Fairfax, VA 22042, USA
- Qi Wei
- Department of Bioengineering, George Mason University, Fairfax, VA 22030, USA
9
van den Noort F, Manzini C, Hofsteenge M, Sirmacek B, van der Vaart CH, Slump CH. Unsupervised convolutional autoencoders for 4D transperineal ultrasound classification. J Med Imaging (Bellingham) 2023; 10:014004. [PMID: 36785585] [PMCID: PMC9921518] [DOI: 10.1117/1.jmi.10.1.014004]
Abstract
Purpose 4D transperineal ultrasound (TPUS) is used to examine female pelvic floor disorders. Muscle movement, such as performing a muscle contraction or a Valsalva maneuver, can be captured on TPUS. Our work investigates the possibility of unsupervised analysis and classification of TPUS data. Approach An unsupervised 3D-convolutional autoencoder is trained to compress TPUS volume frames into a latent feature vector (LFV) of 128 elements. The (co)variance of the features is analyzed, and statistical tests are performed to determine how the features contribute to storing contraction and Valsalva information. Further dimensionality reduction (principal component analysis or a 2D-convolutional autoencoder) is applied to the LFVs of the frames of a TPUS movie to compress the data and analyze interframe movement. Clustering algorithms (K-means clustering and Gaussian mixture models) are applied to this representation of the data to investigate the possibilities of unsupervised classification. Results The majority of the features show a significant difference between contraction and Valsalva. The (co)variance of the features from the LFVs was investigated, and the features most prominent in capturing muscle movement were identified. Furthermore, the first principal component of the frames from a single TPUS movie can be used to identify movement between the frames. The best classification results were obtained by applying principal component analysis and Gaussian mixture models to the LFVs of the TPUS movies, yielding 91.2% accuracy. Conclusion Unsupervised analysis and classification of TPUS data yield relevant information about the type and amount of muscle movement present.
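The best-performing pipeline above (PCA followed by a Gaussian mixture on the 128-element latent feature vectors) can be sketched with scikit-learn; the synthetic stand-in data, cluster means, and group sizes below are illustrative assumptions, not values from the study:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

# Synthetic stand-in for autoencoder outputs: 128-element latent feature
# vectors forming two well-separated "movement type" groups of 40 frames each.
rng = np.random.default_rng(0)
lfv = np.vstack([rng.normal(0.0, 1.0, (40, 128)),
                 rng.normal(3.0, 1.0, (40, 128))])

# Reduce the LFVs with PCA, then fit a two-component Gaussian mixture and
# read off the unsupervised class assignments.
reduced = PCA(n_components=2).fit_transform(lfv)
labels = GaussianMixture(n_components=2, random_state=0).fit_predict(reduced)
```

On real TPUS movies the clusters overlap far more than in this toy setup; the study reports 91.2% accuracy for the PCA + Gaussian-mixture combination.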
Affiliation(s)
- Frieda van den Noort
- University of Twente, Technical Medical Centre, Robotics and Mechatronics, Faculty of Electrical Engineering Mathematics and Computer Science, Enschede, The Netherlands
- Claudia Manzini
- University Medical Centre Utrecht, Department of Obstetrics and Gynecology, Utrecht, The Netherlands
- Merijn Hofsteenge
- University of Twente, Technical Medical Centre, Robotics and Mechatronics, Faculty of Electrical Engineering Mathematics and Computer Science, Enschede, The Netherlands
- Beril Sirmacek
- Saxion University of Applied Sciences, School of Creative Technology, Smart Cities Group, Enschede, The Netherlands
- Carl H. van der Vaart
- University Medical Centre Utrecht, Department of Obstetrics and Gynecology, Utrecht, The Netherlands
- Cornelis H. Slump
- University of Twente, Technical Medical Centre, Robotics and Mechatronics, Faculty of Electrical Engineering Mathematics and Computer Science, Enschede, The Netherlands
10
Chen TT, Fang JH, Long CY. Re: Automatic identification and segmentation of slice of minimal hiatal dimensions in transperineal ultrasound volumes. Ultrasound Obstet Gynecol 2022; 60:588-589. [PMID: 36183345] [DOI: 10.1002/uog.26055]
Affiliation(s)
- T-T Chen
- Department of Obstetrics and Gynecology, Kaohsiung Medical University Hospital, Kaohsiung Medical University, Kaohsiung, Taiwan
- J-H Fang
- Department of Obstetrics and Gynecology, Kaohsiung Municipal Siaogang Hospital, Kaohsiung Medical University, Kaohsiung, Taiwan
- C-Y Long
- Department of Obstetrics and Gynecology, Kaohsiung Medical University Hospital, Kaohsiung Medical University, Kaohsiung, Taiwan
- Department of Obstetrics and Gynecology, Kaohsiung Municipal Siaogang Hospital, Kaohsiung Medical University, Kaohsiung, Taiwan
11
van den Noort F, Manzini C, van der Vaart CH, van Limbeek MAJ, Slump CH, Grob ATM. Reply. Ultrasound Obstet Gynecol 2022; 60:589-590. [PMID: 36183346] [DOI: 10.1002/uog.26056]
Affiliation(s)
- F van den Noort
- Robotics and Mechatronics, Faculty of Electrical Engineering Mathematics and Computer Science, Technical Medical Centre, University of Twente, Enschede, The Netherlands
- C Manzini
- Department of Obstetrics and Gynecology, University Medical Centre Utrecht, Utrecht, The Netherlands
- C H van der Vaart
- Department of Obstetrics and Gynecology, University Medical Centre Utrecht, Utrecht, The Netherlands
- M A J van Limbeek
- Dynamics of Complex Fluids, Max Planck Institute for Dynamics and Self-Organization, Göttingen, Germany
- C H Slump
- Robotics and Mechatronics, Faculty of Electrical Engineering Mathematics and Computer Science, Technical Medical Centre, University of Twente, Enschede, The Netherlands
- A T M Grob
- Multi-Modality Medical Imaging, Faculty of Science and Technology, Technical Medical Centre, University of Twente, Enschede, The Netherlands