1
Yeung C, Ungi T, Hu Z, Jamzad A, Kaufmann M, Walker R, Merchant S, Engel CJ, Jabs D, Rudan J, Mousavi P, Fichtinger G. From quantitative metrics to clinical success: assessing the utility of deep learning for tumor segmentation in breast surgery. Int J Comput Assist Radiol Surg 2024; 19:1193-1201. [PMID: 38642296] [DOI: 10.1007/s11548-024-03133-y] [Received: 01/19/2024] [Accepted: 03/28/2024] [Indexed: 04/22/2024]
Abstract
PURPOSE: Preventing positive margins is essential for favorable patient outcomes following breast-conserving surgery (BCS). Deep learning has the potential to enable this by automatically contouring the tumor and guiding resection in real time. However, such models must be evaluated against pathology outcomes before they can be translated into clinical practice.

METHODS: Sixteen deep learning models based on established architectures in the literature were trained on 7318 ultrasound images from 33 patients. An expert ranked the models by the quality of the contours they generated on images in our test set. The generated contours were also combined with recorded cautery trajectories from five navigated BCS cases to predict margin status, and the predicted margins were compared with pathology reports.

RESULTS: The model that performed best under both quantitative evaluation and our visual ranking framework achieved a mean Dice score of 0.959, and quantitative metrics were positively associated with expert visual rankings. However, the predictive value of the generated contours was limited, with a sensitivity of 0.750 and a specificity of 0.433 when tested against pathology reports.

CONCLUSION: We present a clinical evaluation of deep learning models trained for intraoperative tumor segmentation in breast-conserving surgery. We demonstrate that, despite high performance on quantitative metrics, automatic contouring remains limited in predicting pathology margins.
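The Dice score reported above is a standard overlap metric between a predicted and a ground-truth segmentation mask. A minimal sketch in pure Python (the masks and values below are illustrative, not taken from the study):

```python
def dice(pred, truth):
    """Dice similarity coefficient between two flat binary masks.

    dice = 2 * |A intersect B| / (|A| + |B|); defined as 1.0 when
    both masks are empty (perfect agreement on "no tumor").
    """
    inter = sum(p and t for p, t in zip(pred, truth))
    total = sum(pred) + sum(truth)
    return 1.0 if total == 0 else 2.0 * inter / total

# Toy 1-D example: 3 overlapping pixels, 4 predicted and 4 true.
pred  = [1, 1, 1, 1, 0, 0]
truth = [0, 1, 1, 1, 1, 0]
print(dice(pred, truth))  # 2*3 / (4+4) = 0.75
```

For 2-D ultrasound masks the same formula applies after flattening; a score of 0.959, as reported above, indicates near-complete overlap between the generated and reference contours.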
Affiliation(s)
- Chris Yeung
- School of Computing, Queen's University, Kingston, ON, Canada
- Tamas Ungi
- School of Computing, Queen's University, Kingston, ON, Canada
- Zoe Hu
- School of Medicine, Queen's University, Kingston, ON, Canada
- Amoon Jamzad
- School of Computing, Queen's University, Kingston, ON, Canada
- Martin Kaufmann
- Department of Surgery, Queen's University, Kingston, ON, Canada
- Ross Walker
- Department of Surgery, Queen's University, Kingston, ON, Canada
- Shaila Merchant
- Department of Surgery, Queen's University, Kingston, ON, Canada
- Cecil Jay Engel
- Department of Surgery, Queen's University, Kingston, ON, Canada
- Doris Jabs
- Department of Radiology, Queen's University, Kingston, ON, Canada
- John Rudan
- Department of Surgery, Queen's University, Kingston, ON, Canada
- Parvin Mousavi
- School of Computing, Queen's University, Kingston, ON, Canada
2
Park TY, Koh H, Lee W, Park SH, Chang WS, Kim H. Real-Time Acoustic Simulation Framework for tFUS: A Feasibility Study Using Navigation System. Neuroimage 2023; 282:120411. [PMID: 37844771] [DOI: 10.1016/j.neuroimage.2023.120411] [Received: 06/04/2023] [Revised: 10/10/2023] [Accepted: 10/13/2023] [Indexed: 10/18/2023]
Abstract
Transcranial focused ultrasound (tFUS), in which acoustic energy is focused on a small region in the brain through the skull, is a non-invasive therapeutic method with high spatial resolution and depth penetration. Image-guided navigation has been widely utilized to visualize the location of the acoustic focus in the cranial cavity. However, such systems are often inaccurate because of the significant aberrations caused by the skull. Acoustic simulations using a numerical solver have therefore been widely adopted to compensate for this inaccuracy. Although simulation can predict the intracranial acoustic pressure field, real-time application during tFUS treatment is almost impossible due to the high computational cost. In this study, we propose a neural network-based real-time acoustic simulation framework and test its feasibility by implementing a simulation-guided navigation (SGN) system. Real-time acoustic simulation is performed using a 3D conditional generative adversarial network (3D-cGAN) model featuring residual blocks and multiple loss functions. This network was trained on data generated by the conventional numerical acoustic simulation program k-Wave. The SGN system was then implemented by integrating real-time acoustic simulation with a conventional image-guided navigation system. The proposed system can provide simulation results at a frame rate of 5 Hz (i.e., about 0.2 s per frame), including all processing times. In numerical validation (3D-cGAN vs. k-Wave), the average peak intracranial pressure error was 6.8 ± 5.5%, and the average acoustic focus position error was 5.3 ± 7.7 mm. In experimental validation using a skull phantom (3D-cGAN vs. actual measurement), the average peak intracranial pressure error was 4.5%, and the average acoustic focus position error was 6.6 mm. These results demonstrate that the SGN system can predict the intracranial acoustic field according to transducer placement in real time.
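The two validation metrics reported above, peak-pressure error and focus position error, reduce to simple arithmetic: a percentage deviation of peak pressure and a Euclidean distance between focus coordinates. A minimal sketch with invented sample values (none of the numbers below come from the study):

```python
import math

def peak_pressure_error(p_pred, p_ref):
    """Percentage error of the predicted peak intracranial
    pressure relative to the reference solver's value."""
    return abs(p_pred - p_ref) / p_ref * 100.0

def focus_position_error(f_pred, f_ref):
    """Euclidean distance (e.g., in mm) between predicted
    and reference acoustic focus positions."""
    return math.dist(f_pred, f_ref)

# Illustrative values only.
print(peak_pressure_error(95.0, 100.0))            # 5.0 (%)
print(focus_position_error((0, 0, 0), (3, 4, 0)))  # 5.0 (mm)
```

Averaging these two quantities over a set of transducer placements yields summary figures of the kind quoted above (e.g., 6.8 ± 5.5% and 5.3 ± 7.7 mm for the numerical validation).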
Affiliation(s)
- Tae Young Park
- Bionics Research Center, Biomedical Research Division, Korea Institute of Science and Technology, Seoul 02792, Republic of Korea; Division of Bio-Medical Science and Technology, KIST School, Korea University of Science and Technology, Seoul 02792, Republic of Korea
- Heekyung Koh
- Bionics Research Center, Biomedical Research Division, Korea Institute of Science and Technology, Seoul 02792, Republic of Korea
- Wonhye Lee
- Bionics Research Center, Biomedical Research Division, Korea Institute of Science and Technology, Seoul 02792, Republic of Korea; Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, Boston, MA, USA
- So Hee Park
- Department of Neurosurgery, Yeungnam University Medical Center, Daegu 42415, Republic of Korea
- Won Seok Chang
- Department of Neurosurgery, Brain Research Institute, Yonsei University College of Medicine, Seoul 04527, Republic of Korea
- Hyungmin Kim
- Bionics Research Center, Biomedical Research Division, Korea Institute of Science and Technology, Seoul 02792, Republic of Korea; Division of Bio-Medical Science and Technology, KIST School, Korea University of Science and Technology, Seoul 02792, Republic of Korea
3
Recent Advances in Intraoperative Lumpectomy Margin Assessment for Breast Cancer. Current Breast Cancer Reports 2022. [DOI: 10.1007/s12609-022-00451-5] [Indexed: 10/18/2022]
4
Hu Z, Nasute Fauerbach PV, Yeung C, Ungi T, Rudan J, Engel CJ, Mousavi P, Fichtinger G, Jabs D. Real-time automatic tumor segmentation for ultrasound-guided breast-conserving surgery navigation. Int J Comput Assist Radiol Surg 2022; 17:1663-1672. [PMID: 35588339] [DOI: 10.1007/s11548-022-02658-4] [Received: 01/10/2022] [Accepted: 04/22/2022] [Indexed: 11/05/2022]
Abstract
PURPOSE: Ultrasound-based navigation is a promising method in breast-conserving surgery, but tumor contouring often requires a radiologist at the time of surgery. Our goal is to develop a real-time automatic neural network-based tumor contouring process for intraoperative guidance. Segmentation accuracy is evaluated by both pixel-based metrics and expert visual rating.

METHODS: This retrospective study includes 7318 intraoperative ultrasound images acquired from 33 breast cancer patients, randomly split 80:20 for training and testing. We implement a U-Net architecture to label each pixel on ultrasound images as either tumor or healthy breast tissue. Quantitative metrics are calculated to evaluate the model's accuracy. Contour quality and usability are also assessed by fellowship-trained breast radiologists and surgical oncologists. Additionally, the viability of using our U-Net model in an existing surgical navigation system is evaluated by measuring the segmentation frame rate.

RESULTS: The mean Dice similarity coefficient of our U-Net model is 0.78, with an area under the receiver-operating characteristic curve of 0.94, a sensitivity of 0.95, and a specificity of 0.67. Expert visual ratings are positive, with 93% of responses rating tumor contour quality at or above 7/10, and 75% of responses rating contour quality at or above 8/10. Real-time tumor segmentation achieved a frame rate of 16 frames per second, sufficient for clinical use.

CONCLUSION: Neural networks trained with intraoperative ultrasound images provide consistent tumor segmentations that are well received by clinicians. These findings suggest that neural networks are a promising adjunct for alleviating radiologist workload and improving the efficiency of breast-conserving surgery navigation systems.
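The pixel-wise sensitivity and specificity quoted above follow directly from confusion-matrix counts over all labeled pixels. A minimal sketch (the counts below are invented for illustration and merely chosen to reproduce the reported 0.95 and 0.67):

```python
def sensitivity_specificity(tp, fp, tn, fn):
    """Sensitivity (fraction of true tumor pixels labeled tumor)
    and specificity (fraction of true healthy-tissue pixels
    labeled healthy) from confusion-matrix counts."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return sens, spec

# Invented counts: 95 of 100 tumor pixels found,
# 67 of 100 healthy pixels correctly left unlabeled.
sens, spec = sensitivity_specificity(tp=95, fp=33, tn=67, fn=5)
print(sens, spec)  # 0.95 0.67
```

A high sensitivity paired with a lower specificity, as reported here, means the model tends to over-contour: it rarely misses tumor pixels but labels a fair amount of healthy tissue as tumor.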
Affiliation(s)
- Zoe Hu
- School of Medicine, Queen's University, 88 Stuart Street, Kingston, ON, K7L 3N6, Canada
- Chris Yeung
- School of Computing, Queen's University, Kingston, ON, Canada
- Tamas Ungi
- School of Computing, Queen's University, Kingston, ON, Canada
- John Rudan
- Department of Surgery, Queen's University, Kingston, ON, Canada
- Cecil Jay Engel
- Department of Surgery, Queen's University, Kingston, ON, Canada
- Parvin Mousavi
- School of Computing, Queen's University, Kingston, ON, Canada
- Doris Jabs
- Department of Radiology, Queen's University, Kingston, ON, Canada
5
Poole M, Ungi T, Fichtinger G, Zevin B. Training in soft tissue resection using real-time visual computer navigation feedback from the Surgery Tutor: A randomized controlled trial. Surgery 2021; 172:89-95. [PMID: 34969526] [DOI: 10.1016/j.surg.2021.11.037] [Received: 08/02/2021] [Revised: 10/13/2021] [Accepted: 11/29/2021] [Indexed: 10/19/2022]
Abstract
BACKGROUND: In competency-based medical education, surgery trainees are often required to learn procedural skills in a simulated setting before proceeding to the clinical environment. The Surgery Tutor computer navigation platform allows for real-time proctor-less assessment of open soft tissue resection skills; however, the use of this platform as an aid in the acquisition of procedural skills has yet to be explored.

METHODS: In this prospective randomized controlled trial, 20 final-year medical students were randomized to receive either training with real-time computer navigation feedback (Intervention, n = 10) or simulation training without navigation feedback (Control, n = 10) during resection of simulated non-palpable soft tissue tumors. Real-time computer navigation feedback allowed participants to visualize the position of their scalpel relative to the tumor. Computer navigation feedback was removed for the postintervention assessment. The primary outcome was positive margin rate. Secondary outcomes were procedure time, mass of tissue excised, number of scalpel motions, and distance traveled by the scalpel.

RESULTS: Training with real-time computer navigation resulted in a significantly lower positive margin rate than training without navigation feedback (0% vs 40%, P = .025). All other performance metrics did not differ significantly between the 2 groups. Participants in the Intervention group showed a significant improvement in positive margin rate from baseline to final assessment (80% vs 0%, P < .01), whereas participants in the Control group did not.

CONCLUSION: Real-time visual computer navigation feedback from the Surgery Tutor resulted in superior acquisition of procedural skills compared to training without navigation feedback.
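The headline comparison above (0% vs 40% positive margins, 10 participants per arm) is a 2x2 contingency comparison. The abstract does not state which exact test produced P = .025, so the sketch below shows a generic two-sided Fisher exact test on those counts purely as an illustration of how such small-sample margin-rate comparisons are tested; it need not reproduce the published value:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]]:
    the sum of hypergeometric probabilities of all tables with the same
    margins that are no more likely than the observed table."""
    row1, col1, n = a + b, a + c, a + b + c + d

    def prob(x):  # hypergeometric probability of x in the top-left cell
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)

    p_obs = prob(a)
    lo, hi = max(0, row1 + col1 - n), min(row1, col1)
    return sum(prob(x) for x in range(lo, hi + 1) if prob(x) <= p_obs + 1e-12)

# Positive margins: Intervention 0/10, Control 4/10.
print(round(fisher_exact_two_sided(0, 10, 4, 6), 3))  # 0.087
```

In practice one would reach for a vetted implementation such as `scipy.stats.fisher_exact` rather than hand-rolling the enumeration; the point here is only that the comparison reduces to a small exact test over fixed margins.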
Affiliation(s)
- Meredith Poole
- Kingston Health Sciences Center, Queen's University, Kingston, Ontario, Canada
- Tamas Ungi
- Kingston Health Sciences Center, Queen's University, Kingston, Ontario, Canada
- Gabor Fichtinger
- Kingston Health Sciences Center, Queen's University, Kingston, Ontario, Canada
- Boris Zevin
- Kingston Health Sciences Center, Queen's University, Kingston, Ontario, Canada