1
Chen YL, Chiang HK. Development of Single-Channel Dual-Element Custom-Made Ultrasound Scanner with Miniature Optical Position Tracker for Freehand Imaging. Biosensors 2023;13:431. PMID: 37185505; PMCID: PMC10136573; DOI: 10.3390/bios13040431.
Abstract
Handheld ultrasound has great potential in resource-limited areas, and can improve healthcare for rural populations. Single-channel ultrasound has been widely used in many clinical ultrasound applications, and optical tracking is considered accurate and reliable. In this study, we developed a 10 MHz lead magnesium niobate-lead titanate (PMN-PT) dual-element ultrasound transducer combined with a miniature optical position tracker, and then measured the rectus femoris of the thigh, upper arm, and cheek muscles. Compared to single-element transducers, dual-element transducers improve the contrast of near-field signals, effectively reduce noise, and are suitable for measuring curved surfaces. The purpose of position tracking is to calculate the location of the ultrasound transducer during the measurement process. By utilizing positioning information, 2D ultrasound imaging can be achieved while maintaining structural integrity. The dual-element ultrasound scanner presented in this study can enable continuous scanning over a large area without a scanning width limitation. The custom-made dual-element ultrasound scanner has the advantage of being a portable, reliable, and low-cost ultrasound device, and is helpful in popularizing medical care for remote villages.
Affiliation(s)
- Yen-Lung Chen
- Department of BioMedical Engineering, National Yang Ming Chiao Tung University, Taipei 112, Taiwan
- Huihua Kenny Chiang
- Department of BioMedical Engineering, National Yang Ming Chiao Tung University, Taipei 112, Taiwan
2
Cardan R, Covington EL, Popple R. A Holographic Augmented Reality Guidance System for Patient Alignment: A Feasibility Study. Cureus 2021;13:e14695. PMID: 34055539; PMCID: PMC8153089; DOI: 10.7759/cureus.14695.
Abstract
Purpose: To evaluate the accuracy of an augmented reality holographic guidance system for potential use in patient alignment in radiotherapy applications. Methods: A cubic phantom was scanned on a CT simulator and a 3D mesh was extracted using the Eclipse Scripting API. An application was created for the Microsoft HoloLens to allow users to see the scanned mesh as a hologram overlaid in the treatment vault. Six therapists were equipped with the HoloLens glasses and instructed to move the real phantom to align with the perceived spatial hologram using only couch controls. The initial couch coordinates were recorded, and coordinates were then recorded at each step as the therapist moved the phantom to each new location. The application varied the position of the virtual phantom to 10 preprogrammed locations within a 40-cm cubic volume in a combination of vertical, longitudinal, and lateral axis shifts. The absolute position difference between the holographic world and the real-world phantom was recorded at each step, along with the relative position from one location to the next. Results: Fifty shifts were collected across the six therapists. The mean difference between the physical position and the instructed holographic position was 0.58 ± 0.31 cm for relative shifts and 0.51 ± 0.33 cm for absolute position. The maximum difference between the holographic position and the actual post-shift position was 1.53 cm for relative and 1.58 cm for absolute. Conclusion: Holographic augmented reality guidance using the Microsoft HoloLens provides adequate accuracy for initial treatment alignment but lacks the fine alignment accuracy of X-ray imaging systems.
Affiliation(s)
- Rex Cardan
- Radiation Oncology, University of Alabama at Birmingham, Birmingham, USA
- Richard Popple
- Radiation Oncology, University of Alabama at Birmingham, Birmingham, USA
3
Fotouhi J, Mehrfard A, Song T, Johnson A, Osgood G, Unberath M, Armand M, Navab N. Development and Pre-Clinical Analysis of Spatiotemporal-Aware Augmented Reality in Orthopedic Interventions. IEEE Trans Med Imaging 2021;40:765-778. PMID: 33166252; PMCID: PMC8317976; DOI: 10.1109/tmi.2020.3037013.
Abstract
Suboptimal interaction with patient data and the challenge of mastering 3D anatomy from ill-posed 2D interventional images are central concerns in image-guided therapies. Augmented reality (AR) has been introduced into operating rooms over the last decade; however, in image-guided interventions it has often been treated merely as a visualization device layered onto traditional workflows. As a consequence, the technology has not yet gained the maturity required to redefine procedures, user interfaces, and interactions. The main contribution of this paper is to show how exemplary workflows can be redefined by taking full advantage of head-mounted displays that remain co-registered with the imaging system at all times. The system's awareness of the geometric and physical characteristics of X-ray imaging allows the exploration of different human-machine interfaces. Our system achieved an error of 4.76 ± 2.91 mm for K-wire placement in a fracture management procedure, and yielded errors of 1.57 ± 1.16° and 1.46 ± 1.00° in the abduction and anteversion angles, respectively, for total hip arthroplasty (THA). We compared these results with outcomes from baseline standard operative and non-immersive AR procedures, which yielded errors of [4.61 mm, 4.76°, 4.77°] and [5.13 mm, 1.78°, 1.43°], respectively, for wire placement and for abduction and anteversion during THA. We hope that this holistic approach to improving the surgical interface not only augments the surgeon's capabilities but also improves the surgical team's experience in carrying out effective interventions with reduced complications, and provides novel ways of documenting procedures for training purposes.
4
Ma L, Fei B. Comprehensive review of surgical microscopes: technology development and medical applications. J Biomed Opt 2021;26:010901. PMID: 33398948; PMCID: PMC7780882; DOI: 10.1117/1.jbo.26.1.010901.
Abstract
SIGNIFICANCE: Surgical microscopes provide adjustable magnification, bright illumination, and clear visualization of the surgical field, and have been increasingly used in operating rooms. State-of-the-art surgical microscopes are integrated with various imaging modalities, such as optical coherence tomography (OCT), fluorescence imaging, and augmented reality (AR) for image-guided surgery. AIM: This comprehensive review is based on a literature of over 500 papers covering the technology development and applications of surgical microscopy over the past century. The aim of this review is threefold: (i) to provide a comprehensive technical overview of surgical microscopes, (ii) to provide critical references for microscope selection and system development, and (iii) to provide an overview of various medical applications. APPROACH: More than 500 references were collected and reviewed. A timeline of important milestones in the evolution of the surgical microscope is provided, along with an in-depth technical overview of the optical system, mechanical system, illumination, visualization, and integration with advanced imaging modalities. Various medical applications of surgical microscopes in neurosurgery and spine surgery, ophthalmic surgery, ear-nose-throat (ENT) surgery, endodontics, and plastic and reconstructive surgery are described. RESULTS: Surgical microscopy has advanced significantly in high-end optics, bright and shadow-free illumination, stable and flexible mechanical design, and versatile visualization. New imaging modalities, such as hyperspectral imaging, OCT, fluorescence imaging, photoacoustic microscopy, and laser speckle contrast imaging, are being integrated with surgical microscopes. Advanced visualization and AR are being added to surgical microscopes as new features that are changing clinical practice in the operating room.
CONCLUSIONS: The combination of new imaging technologies and surgical microscopy will enable surgeons to perform challenging procedures and improve surgical outcomes. With advanced visualization and improved ergonomics, the surgical microscope has become a powerful tool in neurosurgery, spine, ENT, ophthalmic, and plastic and reconstructive surgery.
Affiliation(s)
- Ling Ma
- University of Texas at Dallas, Department of Bioengineering, Richardson, Texas, United States
- Baowei Fei
- University of Texas at Dallas, Department of Bioengineering, Richardson, Texas, United States
- University of Texas Southwestern Medical Center, Department of Radiology, Dallas, Texas, United States
5
Dallas-Orr D, Penev Y, Schultz R, Courtier J. Comparing Computed Tomography-Derived Augmented Reality Holograms to a Standard Picture Archiving and Communication Systems Viewer for Presurgical Planning: Feasibility Study. JMIR Perioper Med 2020;3:e18367. PMID: 33393933; PMCID: PMC7709855; DOI: 10.2196/18367.
Abstract
Background: Picture archiving and communication systems (PACS) are ubiquitously used to store, share, and view radiological information for preoperative planning across surgical specialties. Although traditional PACS software has proven reliable in terms of display accuracy and ease of use, it remains limited by its inherent representation of medical imaging in 2 dimensions. Augmented reality (AR) systems present an exciting opportunity to complement traditional PACS capabilities. Objective: This study aims to evaluate the technical feasibility of using a novel AR platform, with holograms derived from computed tomography (CT) imaging, as a supplement to traditional PACS for presurgical planning in complex surgical procedures. Methods: Independent readers measured objects of predetermined, anthropomorphically correlated sizes using the circumference and angle tools of standard-of-care PACS software and a newly developed augmented reality presurgical planning system (ARPPS). Results: Measurements taken with the standard PACS and the ARPPS showed no statistically significant differences. Bland-Altman analysis showed a mean difference of 0.08% (95% CI -4.20% to 4.36%) for measurements taken with the PACS versus ARPPS circumference tools and -1.84% (95% CI -6.17% to 2.14%) for measurements with the systems' angle tools. Lin's concordance correlation coefficients were 1.00 and 0.98 for the circumference and angle measurements, respectively, indicating almost perfect agreement between ARPPS and PACS. Intraclass correlation showed no statistically significant difference between the readers for either measurement tool on each system. Conclusions: ARPPS can be an effective, accurate, and precise means of 3D visualization and measurement of CT-derived holograms in the presurgical care timeline.
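The Bland-Altman analysis reported above reduces to a mean pairwise percent difference and its 95% limits of agreement (mean ± 1.96 SD). A minimal sketch of that computation; the paired measurement arrays below are hypothetical stand-ins, not the study's data:

```python
from statistics import mean, stdev

def bland_altman(a, b):
    """Mean pairwise percent difference and 95% limits of agreement
    (mean +/- 1.96 SD), the quantities a Bland-Altman plot summarizes."""
    diffs = [100.0 * (x - y) / ((x + y) / 2.0) for x, y in zip(a, b)]
    m, s = mean(diffs), stdev(diffs)
    return m, (m - 1.96 * s, m + 1.96 * s)

# Hypothetical paired circumference readings (cm), not the study's data
pacs = [10.2, 15.1, 20.3, 25.0, 30.4]
arpps = [10.1, 15.3, 20.2, 25.2, 30.1]
m, (lo, hi) = bland_altman(pacs, arpps)
print(f"mean difference {m:+.2f}%, 95% limits of agreement [{lo:.2f}%, {hi:.2f}%]")
```

A mean difference near zero with limits of agreement spanning zero, as in the study's 0.08% (95% CI -4.20% to 4.36%), indicates no systematic bias between the two tools.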
Affiliation(s)
- David Dallas-Orr
- Department of Bioengineering and Therapeutic Sciences, University of California, San Francisco, San Francisco, CA, United States; Department of Bioengineering, University of California, Berkeley, Berkeley, CA, United States
- Yordan Penev
- Department of Bioengineering and Therapeutic Sciences, University of California, San Francisco, San Francisco, CA, United States; Department of Bioengineering, University of California, Berkeley, Berkeley, CA, United States
- Robert Schultz
- Department of Bioengineering and Therapeutic Sciences, University of California, San Francisco, San Francisco, CA, United States; Department of Bioengineering, University of California, Berkeley, Berkeley, CA, United States
- Jesse Courtier
- Department of Radiology, Mission Bay Hospital, University of California, San Francisco, San Francisco, CA, United States; Department of Radiology and Biomedical Imaging, University of California, San Francisco, San Francisco, CA, United States
6
Wen T, Wang C, Zhang Y, Zhou S. A Novel Ultrasound Probe Spatial Calibration Method Using a Combined Phantom and Stylus. Ultrasound Med Biol 2020;46:2079-2089. PMID: 32446677; DOI: 10.1016/j.ultrasmedbio.2020.03.018.
Abstract
Intra-operative ultrasound (US) is a popular imaging modality for its non-radiative and real-time advantages. However, it is still challenging to perform an interventional procedure under two-dimensional (2-D) US image guidance. Accordingly, the trend has been to perform three-dimensional (3-D) US image guidance by equipping the US probe with a spatial position tracking device, which requires accurate probe calibration for determining the spatial position between the B-scan image and the tracked probe. In this report, we propose a novel probe spatial calibration method by developing a calibration phantom combined with the tracking stylus. The calibration phantom is custom-designed to simplify the alignment between the stylus tip and the B-scan image plane. The spatial position of the stylus tip is tracked in real time, and its 2-D image pixel location is extracted and collected simultaneously. Gaussian distribution is used to model the spatial position of the stylus tip and the iterative closest point-based optimization algorithm is used to estimate the spatial transformation that matches these two point sets. Once the probe is calibrated, its trajectory and the B-scan image are collected and used for the volume reconstruction in our freehand 3-D US imaging system. Experimental results demonstrate that the probe calibration approach results in less than 1-mm mean point reconstruction accuracy. It requires less than 5 min for an inexperienced user to complete the probe calibration procedure with minimal training. The mockup test shows that the 3-D images are geometrically correct with 0.28°-angle accuracy and 0.40-mm distance accuracy.
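The calibration described above estimates a rigid transformation matching two point sets; each iteration of an ICP-style optimization solves this with a closed-form least-squares fit once correspondences are fixed. A minimal sketch of that inner step (the Kabsch/SVD solution, on synthetic noise-free points; the paper's exact pipeline and data are not reproduced here):

```python
import numpy as np

def rigid_transform(src, dst):
    """Closed-form least-squares rigid transform (R, t) mapping src to dst
    via the Kabsch/SVD method, the inner step of an ICP iteration once
    closest-point correspondences have been fixed."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)                 # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t

# Synthetic points: dst is src rotated 30 degrees about z, then shifted
theta = np.radians(30.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([5.0, -2.0, 1.0])
src = np.random.default_rng(0).uniform(-1.0, 1.0, (20, 3))
dst = src @ R_true.T + t_true
R, t = rigid_transform(src, dst)
print(np.allclose(R, R_true), np.allclose(t, t_true))
```

With noise-free correspondences the true transform is recovered exactly; in the paper's setting, repeated closest-point matching and re-fitting handles noisy, initially unmatched point sets.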
Affiliation(s)
- Tiexiang Wen
- Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, P.R. China; Key Laboratory of Health Informatics, Chinese Academy of Sciences, Shenzhen, P.R. China; University of Chinese Academy of Sciences, Beijing, P.R. China
- Cheng Wang
- Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, P.R. China
- Yi Zhang
- Center of Interventional Radiology & Vascular Surgery, Department of Radiology, Zhongda Hospital, Medical School, Southeast University, Nanjing, P.R. China
- Shoujun Zhou
- Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, P.R. China; Key Laboratory of Health Informatics, Chinese Academy of Sciences, Shenzhen, P.R. China
8
Chatrasingh M, Suthakorn J. A Novel Design of N-Fiducial Phantom for Automatic Ultrasound Calibration. J Med Phys 2019;44:191-200. PMID: 31576067; PMCID: PMC6764176; DOI: 10.4103/jmp.jmp_92_18.
Abstract
Background: Freehand ultrasound (US) is a technique used to acquire three-dimensional (3D) US images using a tracked 2D US probe. Calibrating the probe with a proper calibration phantom improves the precision of the technique and enables several applications in computer-assisted surgery. The N-fiducial phantom is widely used because it can be fabricated precisely and is convenient to use. In principle, the design supports single-frame calibration by providing at least three noncollinear points in 3D space at once. To meet this requirement, most designs contain multiple N-fiducials in unpatterned, noncollinear arrangements. The unpatterned multiple N-fiducials appear as scattered dots in the US image that are difficult to extract, and the extracted data are usually contaminated with noise. In practice, extraction has mostly relied on manual intervention, and calibration with N-fiducial phantoms has not yet achieved high accuracy with single- or few-frame calibration due to noise contamination. Aims: In this article, we propose a novel design of N-fiducial US calibration phantom that enables automatic feature extraction with accuracy comparable to multiple-frame calibration. Materials and Methods: Along with the design, the Random Sample Consensus (RANSAC) algorithm was used for feature extraction with both 2D and 3D model estimation. The RANSAC feature extraction algorithm was combined with a closed-form calibration method to achieve automatic calibration. Results: The accuracy, precision, and shape reconstruction errors of the calibration obtained in the experiment matched previous literature reports. Conclusions: The results showed that the proposed method performs automatic feature extraction efficiently compared with conventional manual extraction.
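The RANSAC model-estimation step described above can be illustrated with a minimal 2-D line fit that separates collinear fiducial dots from outliers; the dot coordinates, tolerance, and iteration count below are hypothetical, not taken from the paper:

```python
import random

def ransac_line(points, n_iter=200, tol=0.5, seed=0):
    """Fit a 2-D line to noisy dot detections with RANSAC and return the
    largest consensus (inlier) set. Illustrates the model-estimation step
    used to pick out N-fiducial intersections among scattered dots."""
    rng = random.Random(seed)
    best_inliers = []
    for _ in range(n_iter):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        a, b, c = y2 - y1, x1 - x2, x2 * y1 - x1 * y2   # line ax + by + c = 0
        norm = (a * a + b * b) ** 0.5
        if norm == 0.0:
            continue  # degenerate sample
        inliers = [p for p in points
                   if abs(a * p[0] + b * p[1] + c) / norm < tol]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    return best_inliers

# Synthetic image: 8 collinear fiducial dots (y = 2x + 1) plus 4 outliers
line_pts = [(float(x), 2.0 * x + 1.0) for x in range(8)]
outliers = [(1.0, 9.0), (3.0, -4.0), (5.0, 0.0), (6.5, 20.0)]
inl = ransac_line(line_pts + outliers)
print(len(inl))
```

The consensus set recovers exactly the eight collinear dots, since every outlier lies farther than the distance tolerance from the fitted line.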
Affiliation(s)
- Maria Chatrasingh
- Department of Biomedical Engineering, Center for Biomedical and Robotics Technology (BART LAB), Faculty of Engineering, Mahidol University, Salaya, Thailand
- Jackrit Suthakorn
- Department of Biomedical Engineering, Center for Biomedical and Robotics Technology (BART LAB), Faculty of Engineering, Mahidol University, Salaya, Thailand
9
Duraes M, Crochet P, Pagès E, Grauby E, Lasch L, Rebel L, Van Meer F, Rathat G. Surgery of nonpalpable breast cancer: first step to a virtual per-operative localization? Breast J 2019;25:874-879. DOI: 10.1111/tbj.13379.
Affiliation(s)
- Martha Duraes
- Department of Gynaecological Surgery, Montpellier University Hospital, Montpellier, France
- Patrice Crochet
- Department of Gynaecological Surgery, Montpellier University Hospital, Montpellier, France
- Emmanuelle Pagès
- Department of Radiology, Montpellier University Hospital, Montpellier, France
- Elsa Grauby
- Department of Gynaecological Surgery, Montpellier University Hospital, Montpellier, France
- Lidia Lasch
- Department of Gynaecological Surgery, Montpellier University Hospital, Montpellier, France
- Lucie Rebel
- Department of Gynaecological Surgery, Montpellier University Hospital, Montpellier, France
- Gauthier Rathat
- Department of Gynaecological Surgery, Montpellier University Hospital, Montpellier, France
10
Karmonik C, Boone TB, Khavari R. Workflow for Visualization of Neuroimaging Data with an Augmented Reality Device. J Digit Imaging 2019;31:26-31. PMID: 28685319; DOI: 10.1007/s10278-017-9991-4.
Abstract
Commercial availability of three-dimensional (3D) augmented reality (AR) devices has increased interest in using this novel technology for visualizing neuroimaging data. Here, a technical workflow and algorithm for importing 3D surface-based segmentations derived from magnetic resonance imaging data into a head-mounted AR device is presented and illustrated on selected examples: the pial cortical surface of the human brain, fMRI BOLD maps, reconstructed white matter tracts, and a brain network of functional connectivity.
Affiliation(s)
- Christof Karmonik
- MRI core, Houston Methodist Hospital Research Institute, 6565 Fannin Street, Houston, TX, 77030, USA.
- Timothy B Boone
- Department of Urology, Houston Methodist Hospital, Houston, TX, USA
- Rose Khavari
- Department of Urology, Houston Methodist Hospital, Houston, TX, USA
11
Karmonik C, Elias SN, Zhang JY, Diaz O, Klucznik RP, Grossman RG, Britz GW. Augmented Reality with Virtual Cerebral Aneurysms: A Feasibility Study. World Neurosurg 2018;119:e617-e622. DOI: 10.1016/j.wneu.2018.07.222.
12
Lin MA, Siu AF, Bae JH, Cutkosky MR, Daniel BL. HoloNeedle: Augmented Reality Guidance System for Needle Placement Investigating the Advantages of Three-Dimensional Needle Shape Reconstruction. IEEE Robot Autom Lett 2018. DOI: 10.1109/lra.2018.2863381.
13
Gerard IJ, Kersten-Oertel M, Drouin S, Hall JA, Petrecca K, De Nigris D, Di Giovanni DA, Arbel T, Collins DL. Combining intraoperative ultrasound brain shift correction and augmented reality visualizations: a pilot study of eight cases. J Med Imaging (Bellingham) 2018;5:021210. PMID: 29392162; DOI: 10.1117/1.jmi.5.2.021210.
Abstract
We present our work investigating the feasibility of combining intraoperative ultrasound for brain shift correction with augmented reality (AR) visualization for intraoperative interpretation of patient-specific models in image-guided neurosurgery (IGNS) of brain tumors. Throughout surgical interventions, AR was used to assess different surgical strategies using three-dimensional (3-D) patient-specific models of the patient's cortex, vasculature, and lesion. Ultrasound imaging was acquired intraoperatively, and preoperative images and models were registered to the intraoperative data. The quality and reliability of the AR views were evaluated with both qualitative and quantitative metrics. A pilot study of eight patients demonstrates that the two technologies can be feasibly combined and that their features are complementary. In each case, the AR visualizations enabled the surgeon to accurately visualize the anatomy and pathology of interest for an extended period of the intervention. Inaccuracies associated with misregistration, brain shift, and AR were reduced in all cases. These results demonstrate the potential of combining ultrasound-based registration with AR to become a useful tool for neurosurgeons, improving intraoperative patient-specific planning by improving the understanding of complex 3-D medical imaging data and prolonging the reliable use of IGNS.
Affiliation(s)
- Ian J Gerard
- McGill University, Montreal Neurological Institute and Hospital, Department of Biomedical Engineering, Montreal, Québec, Canada
- Marta Kersten-Oertel
- Concordia University, PERFORM Centre, Department of Computer Science and Software Engineering, Montreal, Québec, Canada
- Simon Drouin
- McGill University, Montreal Neurological Institute and Hospital, Department of Biomedical Engineering, Montreal, Québec, Canada
- Jeffery A Hall
- McGill University, Montreal Neurological Institute and Hospital, Department of Neurology and Neurosurgery, Montreal, Québec, Canada
- Kevin Petrecca
- McGill University, Montreal Neurological Institute and Hospital, Department of Neurology and Neurosurgery, Montreal, Québec, Canada
- Dante De Nigris
- McGill University, Centre for Intelligent Machines, Department of Electrical and Computer Engineering, Montreal, Québec, Canada
- Daniel A Di Giovanni
- McGill University, Montreal Neurological Institute and Hospital, Department of Neurology and Neurosurgery, Montreal, Québec, Canada
- Tal Arbel
- McGill University, Centre for Intelligent Machines, Department of Electrical and Computer Engineering, Montreal, Québec, Canada
- D Louis Collins
- McGill University, Montreal Neurological Institute and Hospital, Department of Biomedical Engineering, Montreal, Québec, Canada; McGill University, Montreal Neurological Institute and Hospital, Department of Neurology and Neurosurgery, Montreal, Québec, Canada; McGill University, Centre for Intelligent Machines, Department of Electrical and Computer Engineering, Montreal, Québec, Canada
14
Mozaffari MH, Lee WS. Freehand 3-D Ultrasound Imaging: A Systematic Review. Ultrasound Med Biol 2017;43:2099-2124. PMID: 28716431; DOI: 10.1016/j.ultrasmedbio.2017.06.009.
Abstract
Two-dimensional ultrasound (US) imaging has been successfully used in clinical applications as a low-cost, portable and non-invasive image modality for more than three decades. Recent advances in computer science and technology illustrate the promise of the 3-D US modality as a medical imaging technique that is comparable to other prevalent modalities and that overcomes certain drawbacks of 2-D US. This systematic review covers freehand 3-D US imaging between 1970 and 2017, highlighting the current trends in research fields, the research methods, the main limitations, the leading researchers, standard assessment criteria and clinical applications. Freehand 3-D US systems are more prevalent in the academic environment, whereas in clinical applications and industrial research, most studies have focused on 3-D US transducers and improvement of hardware performance. This topic is still an interesting active area for researchers, and there remain many unsolved problems to be addressed.
Affiliation(s)
- Mohammad Hamed Mozaffari
- School of Electrical Engineering and Computer Science (EECS), University of Ottawa, Ottawa, Ontario, Canada.
- Won-Sook Lee
- School of Electrical Engineering and Computer Science (EECS), University of Ottawa, Ottawa, Ontario, Canada
15
Ghaderi MA, Heydarzadeh M, Nourani M, Gupta G, Tamil L. Augmented reality for breast tumors visualization. Annu Int Conf IEEE Eng Med Biol Soc 2016;2016:4391-4394. PMID: 28269251; DOI: 10.1109/embc.2016.7591700.
Abstract
3D visualization of breast tumors has been shown to be effective by previous studies. In this paper, we introduce a new augmented reality application that can help doctors and surgeons obtain a more accurate visualization of breast tumors; the system uses a marker-based image-processing technique to render a 3D model of the tumors on the body. The model can be created by experts from 3D breast mammography. We have tested the system using an Android smartphone and a head-mounted device. This proof of concept could help oncologists screen more effectively and surgeons plan surgery.
16
Segmentation of the spinous process and its acoustic shadow in vertebral ultrasound images. Comput Biol Med 2016;72:201-211. DOI: 10.1016/j.compbiomed.2016.03.018.
17
Jiang WW, Li C, Li AH, Zheng YP. Clinical Evaluation of a 3-D Automatic Annotation Method for Breast Ultrasound Imaging. Ultrasound Med Biol 2016;42:870-881. PMID: 26725169; DOI: 10.1016/j.ultrasmedbio.2015.11.028.
Abstract
The routine clinical breast ultrasound annotation method is limited by the time it consumes and by inconsistent, inaccurate, and incomplete notation. A novel 3-D automatic annotation method for breast ultrasound imaging has been developed that uses a spatial sensor to track and record conventional B-mode scanning, providing more objective annotation. The aim of the study described here was to test the feasibility of the automatic annotation method in clinical breast ultrasound scanning. An ultrasound scanning procedure using the new method was established, and the new method and the conventional manual annotation method were compared in 46 breast cancer patients (49 ± 12 y). The time used for scanning a patient was recorded and compared for the two methods. Intra-observer and inter-observer experiments were performed, and intra-class correlation coefficients (ICCs) were calculated to analyze system reproducibility. The results revealed that the new annotation method reduced average scanning time by 36 s (42.9%) relative to the conventional method. There were high correlations between the results of the two annotation methods (r = 0.933, p < 0.0001 for distance; r = 0.995, p < 0.0001 for radial angle). Intra-observer and inter-observer reproducibility was excellent, with all ICCs > 0.92. These results indicate that the 3-D automatic annotation method is reliable for clinical breast ultrasound scanning and can greatly reduce scanning time. Although large-scale clinical studies are still needed, this work verified that the new annotation method has the potential to be a valuable tool in breast ultrasound examination.
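The reproducibility figures above rest on the intra-class correlation coefficient. A minimal sketch of one common consistency form, the Shrout-Fleiss ICC(3,1); the exact variant used in the study is not specified here, and the observer data below are hypothetical:

```python
def icc_3_1(ratings):
    """Two-way mixed, consistency, single-rater ICC(3,1) (Shrout & Fleiss).
    `ratings` is a list of subjects, each a list of k raters' values."""
    n, k = len(ratings), len(ratings[0])
    grand = sum(sum(r) for r in ratings) / (n * k)
    subj_means = [sum(r) / k for r in ratings]
    rater_means = [sum(r[j] for r in ratings) / n for j in range(k)]
    ss_total = sum((x - grand) ** 2 for r in ratings for x in r)
    ss_subj = k * sum((m - grand) ** 2 for m in subj_means)     # between subjects
    ss_rater = n * sum((m - grand) ** 2 for m in rater_means)   # between raters
    ss_err = ss_total - ss_subj - ss_rater                      # residual
    ms_subj = ss_subj / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_subj - ms_err) / (ms_subj + (k - 1) * ms_err)

# Hypothetical repeated annotations (e.g. distance in cm) by two observers
obs = [[1.0, 1.1], [2.0, 2.0], [3.0, 3.1], [4.0, 3.9], [5.0, 5.05]]
icc = icc_3_1(obs)
print(f"ICC(3,1) = {icc:.3f}")
```

Values approaching 1 indicate that between-subject variation dominates measurement error, which is the sense in which the study's ICCs > 0.92 denote excellent reproducibility.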
Affiliation(s)
- Wei-Wei Jiang
- Interdisciplinary Division of Biomedical Engineering, Hong Kong Polytechnic University, Kowloon, Hong Kong, China
- Cheng Li
- Department of Ultrasound, State Key Laboratory of Oncology in Southern China, Sun Yat-Sen University Cancer Center, Guangzhou, China; Department of Ultrasound, Hospital of Traditional Chinese Medicine of Zhongshan, Zhongshan, China
| | - An-Hua Li
- Department of Ultrasound, State Key Laboratory of Oncology in Southern China, Sun Yat-Sen University Cancer Center, Guangzhou, China
| | - Yong-Ping Zheng
- Interdisciplinary Division of Biomedical Engineering, Hong Kong Polytechnic University, Kowloon, Hong Kong, China.
| |
Collapse
|
18
|
New simple image overlay system using a tablet PC for pinpoint identification of the appropriate site for anastomosis in peripheral arterial reconstruction. Surg Today 2016; 46:1387-1393. [PMID: 26988854 DOI: 10.1007/s00595-016-1326-4] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/12/2015] [Accepted: 02/12/2016] [Indexed: 10/22/2022]
Abstract
PURPOSE To evaluate the accuracy and utility of a new image overlay system using a tablet PC for patients undergoing peripheral arterial reconstruction. METHODS Eleven limbs treated with distal bypass surgery were studied. Three-dimensional images obtained by processing a preoperative contrast-enhanced computed tomography scan were superimposed onto the back-camera images of a tablet PC. We used this system to pinpoint a planned distal anastomotic site preoperatively and to make a precise incision directly above it during surgery. We used a branch artery near the distal anastomotic site as a reference point and the accuracy of the system was validated by comparing its results with the intraoperative findings. The precision of the system was also compared with that of a preoperative ultrasonographic examination. RESULTS Both the image overlay system and ultrasonography (US) accurately identified the target branch artery in all except one limb. In that limb, which had a very small reference branch artery, preoperative US wrongly identified another branch, whereas the image overlay system located the target branch with an error of 10 mm. CONCLUSIONS Our image overlay system was easy to use and allowed us to precisely identify a target artery preoperatively. Therefore, this system could be helpful for pinpointing the most accurate incision site during surgery.
|
19
|
Assessment of MRI image quality for various setup positions used in breast radiotherapy planning. Radiother Oncol 2016; 119:57-60. [PMID: 26970675 DOI: 10.1016/j.radonc.2016.02.024] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/16/2015] [Revised: 01/27/2016] [Accepted: 02/19/2016] [Indexed: 11/20/2022]
Abstract
This study investigates breast magnetic resonance imaging (MRI) image quality for 3 different breast radiotherapy positions (prone, supine flat and supine inclined) and associated choice of breast coils. Supine breast MRI has comparable image quality to prone breast MRI for the purposes of radiotherapy delineation for T2-weighted sequences.
|
20
|
Maas S, Ingler M, Overhoff HM. Using smart glasses for ultrasound diagnostics. CURRENT DIRECTIONS IN BIOMEDICAL ENGINEERING 2015. [DOI: 10.1515/cdbme-2015-0049] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/15/2022] Open
Abstract
Ultrasound has been established as a diagnostic tool in a wide range of applications. Especially for beginners, however, relating the sectional images to the patient's spatial anatomy can be cumbersome. Keeping a direct view of the patient's anatomy while observing the ultrasound images could make the examination more ergonomic.
To address these issues, an affordable augmented reality system using smart glasses was created that displays a (virtual) ultrasound image beneath a (real) ultrasound transducer.
Affiliation(s)
- Stefan Maas
- Westfälische Hochschule, Department Electrical Engineering and Applied Natural Sciences, Neidenburger Straße 43, 45877 Gelsenkirchen, Germany
- Marvin Ingler
- Westfälische Hochschule, Department Electrical Engineering and Applied Natural Sciences, Neidenburger Straße 43, 45877 Gelsenkirchen, Germany
- Heinrich Martin Overhoff
- Westfälische Hochschule, Department Electrical Engineering and Applied Natural Sciences, Neidenburger Straße 43, 45877 Gelsenkirchen, Germany
|
21
|
Bø LE, Hofstad EF, Lindseth F, Hernes TAN. Versatile robotic probe calibration for position tracking in ultrasound imaging. Phys Med Biol 2015; 60:3499-513. [DOI: 10.1088/0031-9155/60/9/3499] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/12/2022]
|
22
|
Chan WY, Heng PA. Visualization of needle access pathway and a five-DoF evaluation. IEEE J Biomed Health Inform 2014; 18:643-53. [PMID: 24608064 DOI: 10.1109/jbhi.2013.2275741] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/10/2022]
Abstract
It is common practice nowadays to plan needle access pathways to volumetric organs before performing surgery, and a large number of needle access planning systems have been proposed in recent years. Recent works mainly focus on system usability or target accessibility; visualization of the planned access pathways has drawn little attention, and its effect on insertion quality has been left unexamined. We aim to address this problem by introducing an all-round evaluation framework that links human motion with computer graphics. Our evaluation framework provides an objective and quantitative analysis of the illustrativeness of needle access pathway visualization techniques over five degrees of freedom. Our experimental results show that the visualization method adopted greatly influences insertion accuracy. Based on this finding, we propose a new visualization technique that intuitively conveys placement and orientation information. We also show that our method better conveys pathway orientation and thus enables a higher quality of insertion.
|
23
|
Schlosser J, Kirmizibayrak C, Shamdasani V, Metz S, Hristov D. Automatic 3D ultrasound calibration for image guided therapy using intramodality image registration. Phys Med Biol 2013; 58:7481-96. [PMID: 24099806 DOI: 10.1088/0031-9155/58/21/7481] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/11/2022]
Abstract
Many real time ultrasound (US) guided therapies can benefit from management of motion-induced anatomical changes with respect to a previously acquired computerized anatomy model. Spatial calibration is a prerequisite to transforming US image information to the reference frame of the anatomy model. We present a new method for calibrating 3D US volumes using intramodality image registration, derived from the 'hand-eye' calibration technique. The method is fully automated by implementing data rejection based on sensor displacements, automatic registration over overlapping image regions, and a self-consistency error metric evaluated continuously during calibration. We also present a novel method for validating US calibrations based on measurement of physical phantom displacements within US images. Both calibration and validation can be performed on arbitrary phantoms. Results indicate that normalized mutual information and localized cross correlation produce the most accurate 3D US registrations for calibration. Volumetric image alignment is more accurate and reproducible than point selection for validating the calibrations, yielding <1.5 mm root mean square error, a significant improvement relative to previously reported hand-eye US calibration results. Comparison of two different phantoms for calibration and for validation revealed significant differences for validation (p = 0.003) but not for calibration (p = 0.795).
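Of the registration metrics the study compares, normalized mutual information has a compact definition, NMI(A,B) = (H(A) + H(B)) / H(A,B). A generic histogram-based sketch of that metric (an illustration, not the authors' implementation):

```python
import numpy as np

def normalized_mutual_information(a, b, bins=32):
    """NMI(A,B) = (H(A) + H(B)) / H(A,B), from a joint intensity
    histogram; a and b are same-shape intensity arrays. NMI is 2 for
    identical images and approaches 1 for independent ones."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()          # joint probability table
    px = pxy.sum(axis=1)               # marginal of A
    py = pxy.sum(axis=0)               # marginal of B
    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log(p))
    return (entropy(px) + entropy(py)) / entropy(pxy)
```

In a registration loop, one image is resampled under each candidate transform and the transform maximizing NMI over the overlap region is kept.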
Affiliation(s)
- Jeffrey Schlosser
- Department of Mechanical Engineering, Stanford University, Stanford, CA 94305, USA. Department of Bioengineering, Stanford University, Stanford, CA 94305, USA
|
24
|
Utility of augmented reality system in hepatobiliary surgery. JOURNAL OF HEPATO-BILIARY-PANCREATIC SCIENCES 2013; 20:249-53. [PMID: 22399157 DOI: 10.1007/s00534-012-0504-z] [Citation(s) in RCA: 28] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 02/06/2023]
Abstract
BACKGROUND/PURPOSE The aim of this study was to evaluate the utility of an image display system for augmented reality in hepatobiliary surgery under laparotomy. METHODS An overlay display of organs, vessels, or tumor was obtained using a video see-through system as a display system developed at our institute. Registration between visceral organs and the surface-rendering image reconstructed by preoperative computed tomography (CT) was carried out with an optical location sensor. Using this system, we performed laparotomy for a patient with benign biliary stricture, a patient with gallbladder carcinoma, and a patient with hepatocellular carcinoma. RESULTS The operative procedures performed consisted of choledochojejunostomy, right hepatectomy, and microwave coagulation therapy. All the operations were carried out safely using images of the site of tumor, preserved organs, and resection aspect overlaid onto the operation field images observed on the monitors. The position of each organ in the overlaid image closely corresponded with that of the actual organ. Intraoperative information generated from this system provided us with useful navigation. However, several problems such as registration error and lack of depth knowledge were noted. CONCLUSION The image display system appeared to be useful in performing hepatobiliary surgery under laparotomy. Further improvement of the system with individualized function for each operation will be essential, with feedback from clinical trials in the future.
|
25
|
Kersten-Oertel M, Jannin P, Collins DL. The state of the art of visualization in mixed reality image guided surgery. Comput Med Imaging Graph 2013; 37:98-112. [PMID: 23490236 DOI: 10.1016/j.compmedimag.2013.01.009] [Citation(s) in RCA: 106] [Impact Index Per Article: 9.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/30/2012] [Revised: 01/04/2013] [Accepted: 01/23/2013] [Indexed: 11/26/2022]
Abstract
This paper presents a review of the state of the art of visualization in mixed reality image guided surgery (IGS). We used the DVV (data, visualization processing, view) taxonomy to classify a large unbiased selection of publications in the field. The goal of this work was not only to give an overview of current visualization methods and techniques in IGS but more importantly to analyze the current trends and solutions used in the domain. In surveying the current landscape of mixed reality IGS systems, we identified a strong need to assess which of the many possible data sets should be visualized at particular surgical steps, to focus on novel visualization processing techniques and interface solutions, and to evaluate new systems.
Affiliation(s)
- Marta Kersten-Oertel
- Department of Biomedical Engineering, McGill University, McConnell Brain Imaging Center, Montreal Neurological Institute, Montréal, Canada.
|
26
|
A pilot study on magnetic navigation for transcatheter aortic valve implantation using dynamic aortic model and US image guidance. Int J Comput Assist Radiol Surg 2013; 8:677-90. [PMID: 23307285 DOI: 10.1007/s11548-012-0809-z] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/29/2012] [Accepted: 12/20/2012] [Indexed: 10/27/2022]
Abstract
PURPOSE In this paper, we propose a pilot study for transcatheter aortic valve implantation guided by an augmented magnetic tracking system (MTS) with a dynamic aortic model and intra-operative ultrasound (US) images. METHODS The dynamic 3D aortic model is constructed from the preoperative 4D computed tomography, which is animated according to the real-time electrocardiograph (ECG) input of patient. Before the procedure, the US probe calibration is performed to map the US image coordinate to the tracked device coordinate. A temporal alignment is performed to synchronize the ECG signals, the intra-operative US image and the tracking information. Thereafter, with the assistance of synchronized ECG signals, the spatial registration is performed by using a feature-based registration. Then the augmented MTS guides the surgeon to confidently position and deploy the transcatheter aortic valve prosthesis to the target. RESULTS The approach was validated by US probe calibration evaluation and animal study. The US calibration accuracy achieved [Formula: see text], whereas in the animal study on three porcine subjects, fiducial, target, deployment distance and tilting errors reached [Formula: see text], [Formula: see text], [Formula: see text] and [Formula: see text], respectively. CONCLUSION Our pilot study has revealed that the proposed approach is feasible and accurate for delivery and deployment of transcatheter aortic valve prosthesis.
|
27
|
Bayu MZ, Arshad H, Ali NM. Nutritional Information Visualization Using Mobile Augmented Reality Technology. ACTA ACUST UNITED AC 2013. [DOI: 10.1016/j.protcy.2013.12.208] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/25/2022]
|
28
|
Luo Z, Cai J, Wang S, Zhao Q, Peters TM, Gu L. Magnetic navigation for thoracic aortic stent-graft deployment using ultrasound image guidance. IEEE Trans Biomed Eng 2012. [PMID: 23193229 DOI: 10.1109/tbme.2012.2206388] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/07/2022]
Abstract
We propose a system for thoracic aortic stent-graft deployment that employs a magnetic tracking system (MTS) and intraoperative ultrasound (US). A preoperative plan is first performed using a GPU-accelerated cardiac modeling method to determine the target position of the stent-graft. During the surgery, an MTS is employed to track sensors embedded in the catheter, cannula, and the US probe, while a fiducial-landmark-based registration is used to map the patient coordinate system to the image coordinate system. The surgical target is tracked in real time via a calibrated intraoperative US image. Under the guidance of the MTS integrated with the real-time US images, the stent-graft can be deployed to the target position without the use of ionizing radiation. This navigation approach was validated using both phantom and animal studies. In the phantom study, we demonstrate a US calibration accuracy of 1.5 ± 0.47 mm, and a deployment error of 1.4 ± 0.16 mm. In the animal study, we performed experiments on five porcine subjects and recorded fiducial, target, and deployment errors of 2.5 ± 0.32, 4.2 ± 0.78, and 2.43 ± 0.69 mm, respectively. These results demonstrate that delivery and deployment of a thoracic stent-graft under MTS-guided navigation using US imaging is feasible and appropriate for clinical application.
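The fiducial-landmark registration step, mapping the patient coordinate system to the image coordinate system, is in its usual formulation a rigid point-set alignment solved in closed form by SVD (the Kabsch method), with the fiducial registration error (FRE) as the residual. A generic sketch of that step (illustrative, not the authors' code):

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (R, t) mapping src -> dst,
    where src and dst are (N, 3) arrays of corresponding fiducials."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)   # cross-covariance SVD
    # Guard against a reflection in the least-squares solution
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

def fre(src, dst, R, t):
    """Root-mean-square fiducial registration error of the fit."""
    residuals = dst - (src @ R.T + t)
    return float(np.sqrt((residuals ** 2).sum(axis=1).mean()))
```

Target registration error at non-fiducial points is the clinically relevant figure; FRE only reports how well the fiducials themselves were fit.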
Affiliation(s)
- Zhe Luo
- Image Guided Surgery and Therapy Laboratory, School of Biomedical Engineering, Shanghai Jiao Tong University, Shanghai 200030, China.
|
29
|
Enhanced Targeting in Breast Tissue Using a Robotic Tissue Preloading-Based Needle Insertion System. IEEE T ROBOT 2012. [DOI: 10.1109/tro.2012.2183055] [Citation(s) in RCA: 24] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/06/2022]
|
30
|
Abstract
PURPOSE OF REVIEW Advancements in surgery are progressing at a rapid rate; however, there are still limitations, including the ability to accurately visualize the target organ, in particular during laparoscopic surgery. Augmented reality visualization is a novel technique that has been developed to allow the fusion of three-dimensional medical images, such as those from transrectal ultrasound or computed tomography/MRI, with live camera images in real-time. In this review, we describe the current advancements and future directions of augmented reality and its application to laparoscopic surgery. RECENT FINDINGS Geometrically correct superimposed images can be generated by tracking of the laparoscope and registration of the target organ. The fused image between the live laparoscopic images and the reconstructed three-dimensional organ model aids the surgeon in his or her understanding of anatomical structures. Laparoscopic and robot-assisted surgeries in both general surgery and urology have been performed with technical success to date. The primary limitation of the current augmented reality systems is their infancy in dynamic tracking of organ motion or deformation. Recently, augmented reality systems with organ tracking based on real-time image analysis were developed. Further improvement and/or development of such new technologies would resolve these issues. SUMMARY Augmented reality visualization is a significant advancement, improving the precision of laparoscopic/endoscopic surgery. New technologies to improve the dynamic tracking of organ motion or deformation are currently under investigation.
|
31
|
Siegler P, Holloway CMB, Causer P, Thevathasan G, Plewes DB. Supine breast MRI. J Magn Reson Imaging 2011; 34:1212-7. [PMID: 21928381 DOI: 10.1002/jmri.22605] [Citation(s) in RCA: 26] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/23/2010] [Accepted: 03/09/2011] [Indexed: 11/10/2022] Open
Abstract
PURPOSE To achieve high-quality unilateral supine breast magnetic resonance imaging (MRI) as a step to facilitate image aiding of clinical applications, which are often performed in the supine position. Contrast-enhanced breast MRI is a powerful tool for the diagnosis of cancer. However, prone patient positioning typically used for breast MRI hinders its use for image aiding. MATERIALS AND METHODS A fixture and a flexible four-element receive coil were designed for patient-specific shaping and placement of the coil in close conformity to the supine breast. A 3D spoiled gradient sequence was modified to incorporate compensation of respiratory motion. The entire setup was tested in volunteer experiments and in a pilot patient study. RESULTS The flexible coil design and the motion compensation produced supine breast MR images of high diagnostic value. Variations in breast shape and in tissue morphology within the breast were observed between a supine and a diagnostic prone MRI of a patient. CONCLUSION The presented supine breast MRI achieved an image quality comparable to diagnostic breast MRI. Since supine positioning is common in many clinical applications such as ultrasound-guided breast biopsy or breast-conserving surgery, the registration of the supine images will aid these applications.
Affiliation(s)
- Peter Siegler
- Sunnybrook Health Sciences Centre, Division of Imaging Research, Toronto, ON, Canada.
|
32
|
Tagaya N, Aoyagi H, Nakagawa A, Abe A, Iwasaki Y, Tachibana M, Kubota K. A novel approach for sentinel lymph node identification using fluorescence imaging and image overlay navigation surgery in patients with breast cancer. World J Surg 2011; 35:154-8. [PMID: 20931198 DOI: 10.1007/s00268-010-0811-y] [Citation(s) in RCA: 45] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/06/2023]
Abstract
BACKGROUND We previously reported a novel technique for sentinel lymph node (SLN) identification using fluorescence imaging after indocyanine green injection. To achieve safe and accurate identification of the SLN during surgery, we introduce image overlay navigation surgery and evaluate its efficacy. METHODS This study enrolled 50 patients with tumors <2 cm in diameter. Initially, we obtained three-dimensional (3-D) images from multidetector-row computed tomography (MD-CT) by volume rendering. These were projected onto the patient's operative field through a projector, clearly visualizing the lymph nodes (LNs). Indocyanine green (ICG) dye was then injected subdermally into the areola. Subcutaneous lymphatic channels draining from the areola to the axilla immediately became visible on fluorescence imaging, and the lymphatic flow reached the LNs shown on the 3-D images. After incising the axillary skin at the mapped LN location, the SLN was dissected under the guidance of fluorescence imaging, with appropriate sensitivity adjustment, and 3-D imaging. RESULTS Lymphatic channels and SLNs were successfully identified by the Photodynamic Eye (PDE) in all patients, and the skin incision sites coincided with the LNs demonstrated on 3-D imaging in all patients. The mean number of SLNs was 3.7. With image overlay navigation surgery, the location of the SLN was easy to identify visually from the axillary skin. There were no intra- or postoperative complications associated with SLN identification. CONCLUSIONS This combined navigation with fluorescence and 3-D imaging proved easier and more effective for detecting SLNs intraoperatively than fluorescence imaging alone.
Affiliation(s)
- Nobumi Tagaya
- Second Department of Surgery, Dokkyo Medical University, 880 Kitakobayashi, Mibu, Tochigi, 321-0293, Japan.
|
33
|
Image Visualization. Med Image Anal 2011. [DOI: 10.1002/9780470918548.ch13] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/11/2022]
|
34
|
Image Registration. Med Image Anal 2011. [DOI: 10.1002/9780470918548.ch12] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/06/2022]
|
35
|
Luan K, Liao H, Ohya T, Kobayashi E, Sakuma I. Automatic and Robust Freehand Ultrasound Calibration Using a Tracked Pointer. ACTA ACUST UNITED AC 2011. [DOI: 10.5759/jscas.13.437] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/17/2022]
Affiliation(s)
- Kuan Luan
- Graduate School of Engineering, The University of Tokyo
- College of Automation, Harbin Engineering University
- Hongen Liao
- Graduate School of Engineering, The University of Tokyo
- Takashi Ohya
- Graduate School of Engineering, The University of Tokyo
- Department of Oral and Maxillofacial Surgery, Yokohama City University Graduate School of Medicine
- Ichiro Sakuma
- Graduate School of Engineering, The University of Tokyo
|
36
|
Yim Y, Wakid M, Kirmizibayrak C, Bielamowicz S, Hahn J. Registration of 3D CT Data to 2D Endoscopic Image using a Gradient Mutual Information based Viewpoint Matching for Image-Guided Medialization Laryngoplasty. ACTA ACUST UNITED AC 2010. [DOI: 10.5626/jcse.2010.4.4.368] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/03/2022]
|
37
|
Alderliesten T, Loo C, Paape A, Muller S, Rutgers E, Peeters MJV, Gilhuijs K. On the feasibility of MRI-guided navigation to demarcate breast cancer for breast-conserving surgery. Med Phys 2010; 37:2617-26. [PMID: 20632573 DOI: 10.1118/1.3429048] [Citation(s) in RCA: 30] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/07/2022] Open
Abstract
PURPOSE The aim of this study was to investigate the feasibility of image-guided navigation approaches to demarcate breast cancer on the basis of preacquired magnetic resonance (MR) imaging in supine patient orientation. METHODS Strategies were examined to minimize the uncertainty in the instrument-tip position, based on the hypothesis that the release of instrument pressure returns the breast tissue to its predeformed state. For this purpose, four sources of uncertainty were taken into account: (1) U(ligaments): Uncertainty in the reproducibility of the internal mammary gland geometry during repeat patient setup in supine orientation; (2) U(r_breathing): Residual uncertainty in registration of the breast after compensation for breathing motion using an external marker; (3) U(reconstruction): Uncertainty in the reconstructed location of the tip of the needle using an optical image-navigation system (phantom experiments, n = 50); and (4) U(deformation): Uncertainty in displacement of breast tumors due to needle-induced tissue deformations (patients, n = 21). A Monte Carlo study was performed to establish the 95% confidence interval (CI) of the combined uncertainties. This region of uncertainty was subsequently visualized around the reconstructed needle tip as an additional navigational aid in the preacquired MR images. Validation of the system was performed in five healthy volunteers (localization of skin markers only) and in two patients. In the patients, the navigation system was used to monitor ultrasound-guided radioactive seed localization of breast cancer. Nearest distances between the needle tip and the tumor boundary in the ultrasound images were compared to those in the concurrently reconstructed MR images. RESULTS Both U(reconstruction) and U(deformation) were normally distributed with 0.1 +/- 1.2 mm (mean +/- 1 SD) and 0.1 +/- 0.8 mm, respectively. 
Taking prior estimates for U(ligaments) (0.0 +/- 1.5 mm) and U(r_breathing) (-0.1 +/- 0.6 mm) into account, the combined impact resulted in 3.9 mm uncertainty in the position of the needle tip (95% CI) after release of pressure. The volunteer study showed a targeting accuracy comparable to that in the phantom experiments: 2.9 +/- 1.3 versus 2.7 +/- 1.1 mm, respectively. In the patient feasibility study, the deviations were within the 3.9 mm CI. CONCLUSIONS Image-guided navigation to demarcate breast cancer on the basis of preacquired MR images in supine orientation appears feasible if patient breathing is tracked during the navigation procedure, positional uncertainty is visualized and pressure on the localization instrument is released prior to verification of its position.
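The four uncertainty sources above are combined by Monte Carlo simulation to obtain the 95% CI. A generic sketch of such a combination, using the reported per-source means and SDs (the model below simply sums independent 1-D Gaussian displacements, which is an assumption; it yields a half-width near 4.2 mm rather than the paper's 3.9 mm, so the authors' combination model evidently differs in detail):

```python
import numpy as np

# Reported per-source displacement uncertainties, (mean, SD) in mm
SOURCES = {
    "U_ligaments":      (0.0, 1.5),
    "U_r_breathing":    (-0.1, 0.6),
    "U_reconstruction": (0.1, 1.2),
    "U_deformation":    (0.1, 0.8),
}

def combined_ci95_halfwidth(n=500_000, seed=0):
    """Half-width of the central 95% interval of the summed error,
    assuming the four sources are independent 1-D Gaussians."""
    rng = np.random.default_rng(seed)
    total = sum(rng.normal(mu, sd, n) for mu, sd in SOURCES.values())
    lo, hi = np.percentile(total, [2.5, 97.5])
    return (hi - lo) / 2.0
```

For independent Gaussians the half-width converges to 1.96 times the root-sum-square of the SDs, so the Monte Carlo step mainly buys flexibility for non-Gaussian or correlated source models.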
Affiliation(s)
- Tanja Alderliesten
- Department of Radiology, The Netherlands Cancer Institute-Antoni van Leeuwenhoek Hospital (NKI-AVL), P.O. Box 90203, 1006 BE Amsterdam, The Netherlands
|
38
|
Freschi C, Troia E, Ferrari V, Megali G, Pietrabissa A, Mosca F. Ultrasound guided robotic biopsy using augmented reality and human-robot cooperative control. ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY. IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY. ANNUAL INTERNATIONAL CONFERENCE 2010; 2009:5110-3. [PMID: 19963882 DOI: 10.1109/iembs.2009.5332720] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/06/2022]
Abstract
Ultrasound-guided biopsy is an effective minimally invasive approach for tumor staging, but it requires long training and particular manual dexterity and 3D spatial perception from the physician for planning the needle trajectory and executing the procedure. To simplify this difficult task, we have developed an integrated system that provides the clinician with two types of assistance: augmented reality visualization allows accurate and easy planning of the needle trajectory and verification of target reaching; a robot arm with a six-degree-of-freedom force sensor allows precise positioning of the needle holder and lets the clinician adjust the planned trajectory (cooperative control) to compensate for needle deflection and target motion. Preliminary tests executed on an ultrasound phantom showed high precision of the system in static conditions and the utility and usability of the cooperative control in simulated non-rigid conditions.
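Cooperative ("hands-on") control of the kind described, where a six-DoF force sensor lets the clinician nudge the robot-held needle holder, is commonly realized as an admittance law: commanded velocity follows applied force. A per-axis sketch with illustrative gains and units (all parameter values are assumptions, not from the paper):

```python
def admittance_step(force, gain=0.002, deadband=1.5, v_max=0.02):
    """One control cycle of a simple admittance law: for each axis,
    the commanded tool velocity (m/s) follows the force (N) the
    clinician applies, with a deadband to reject sensor noise and a
    saturation limit for safety. Gains are illustrative only."""
    out = []
    for f in force:
        mag = abs(f)
        if mag <= deadband:            # ignore noise / light contact
            out.append(0.0)
        else:
            v = gain * (mag - deadband) * (1.0 if f > 0 else -1.0)
            out.append(max(-v_max, min(v_max, v)))  # saturate
    return out
```

Run at the servo rate, this lets the arm hold position rigidly when untouched yet yield smoothly when the clinician pushes on the needle holder.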
|
39
|
Tamaki Y, Sato Y, Nakamoto M, Sasama T, Sakita I, Sekimoto M, Ohue M, Tomita N, Tamura S, Monden M. Intraoperative Navigation for Breast Cancer Surgery Using 3D Ultrasound Images. ACTA ACUST UNITED AC 2010. [DOI: 10.3109/10929089909148157] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/13/2022]
|
40
|
Betrouni N, Lopes R, Makni N, Dewalle AS, Vermandel M, Rousseau J. Volume quantification by fuzzy logic modelling in freehand ultrasound imaging. ULTRASONICS 2009; 49:646-652. [PMID: 19409591 DOI: 10.1016/j.ultras.2009.03.008] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 12/17/2008] [Revised: 03/23/2009] [Accepted: 03/28/2009] [Indexed: 05/27/2023]
Abstract
INTRODUCTION Many algorithms exist for 3D reconstruction of data from freehand 2D ultrasound slices. These methods are based on interpolation techniques to fill the voxels from the pixels. For quantification purposes, segmentation is involved to delineate the structure of interest. However, speckle and partial volume effect errors can affect quantification. OBJECTIVE This study aimed to assess the effect of the combination of a fuzzy model and 3D reconstruction algorithms of freehand ultrasound images on these errors. METHODS We introduced a fuzzification step to correct the initial segmentation, by weighting the pixels by a distribution function, taking into account the local gray levels, the orientation of the local gradient, and the local contrast-to-noise ratio. We then used two of the most wide-spread reconstruction algorithms (pixel nearest neighbour (PNN) and voxel nearest neighbour (VNN)) to interpolate and create the volume of the structure. Finally, defuzzification was used to estimate the optimal volume. VALIDATION B-scans were acquired using 5 MHz and 8 MHz ultrasound probes on ultrasound tissue-mimicking phantoms. Quantitative evaluation of the reconstructed structures was done by comparing the method output to the real volumes. Comparison was also done with classical PNN and VNN algorithms. RESULTS With the fuzzy model quantification errors were less than 4.3%, whereas with classical algorithms, errors were larger (10.3% using PNN, 17.2% using VNN). Furthermore, for very small structures (0.5 cm(3)), errors reached 24.3% using the classical VNN algorithm, while they were about 9.6% with the fuzzy VNN model. CONCLUSION These experiments prove that the fuzzy model allows volumes to be determined with better accuracy and reproducibility, especially for small structures (<3 cm(3)).
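The PNN interpolation the authors build on has a simple core: each tracked 2-D pixel is dropped into the voxel that contains it, collisions are averaged, and the remaining holes are filled in a later pass. A minimal bin-filling sketch (the paper's fuzzy weighting step is not included):

```python
import numpy as np

def pnn_reconstruct(points, values, grid_shape, voxel_size, origin):
    """Pixel-nearest-neighbour (PNN) bin-filling step. points is an
    (N, 3) array of tracked pixel positions, values the (N,) gray
    levels; returns the voxel volume and a mask of still-empty
    voxels to be filled by a subsequent hole-filling pass."""
    acc = np.zeros(grid_shape)
    cnt = np.zeros(grid_shape)
    idx = np.floor((points - origin) / voxel_size).astype(int)
    inside = np.all((idx >= 0) & (idx < np.array(grid_shape)), axis=1)
    for (i, j, k), v in zip(idx[inside], values[inside]):
        acc[i, j, k] += v              # accumulate colliding pixels
        cnt[i, j, k] += 1
    vol = np.divide(acc, cnt, out=np.zeros_like(acc), where=cnt > 0)
    return vol, cnt == 0
```

VNN reverses the direction of the mapping: each voxel instead looks up the nearest acquired pixel, which avoids holes but can duplicate data.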
Affiliation(s)
- N Betrouni
- INSERM U703, Pavillon Vancostanobel, University Hospital of Lille (CHRU), Lille 59037, France.
|
41
|
Deguchi D, Mori K, Feuerstein M, Kitasaka T, Maurer Jr. CR, Suenaga Y, Takabatake H, Mori M, Natori H. Selective image similarity measure for bronchoscope tracking based on image registration. Med Image Anal 2009; 13:621-33. [DOI: 10.1016/j.media.2009.06.001] [Citation(s) in RCA: 26] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/13/2008] [Revised: 05/29/2009] [Accepted: 06/02/2009] [Indexed: 10/20/2022]
42
43
Solberg OV, Langø T, Tangen GA, Mårvik R, Ystgaard B, Rethy A, Hernes TAN. Navigated ultrasound in laparoscopic surgery. MINIM INVASIV THER 2009; 18:36-53. [PMID: 18855204 DOI: 10.1080/13645700802383975] [Citation(s) in RCA: 20] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/28/2022]
Abstract
Laparoscopic surgery is performed through small incisions that limit free sight and the possibility of palpating organs. Although endoscopes provide an overview of organs inside the body, information beyond the surface of the organs is missing. Ultrasound can provide essential real-time information from inside organs, which is valuable for increased safety and accuracy in the guidance of procedures. We have tested the use of 2D and 3D ultrasound combined with 3D CT data in a prototype navigation system. In our laboratory, micro-positioning sensors were integrated into a flexible intraoperative ultrasound probe, making it possible to measure the position and orientation of the real-time 2D ultrasound image as well as to perform freehand 3D ultrasound acquisitions. Furthermore, we also present a setup with the probe optically tracked from the shaft, with the flexible part locked in one position. We evaluated the accuracy of the 3D laparoscopic ultrasound solution and obtained average values ranging from 1.6% to 3.6% volume deviation from the phantom specifications. Furthermore, we investigated the use of electromagnetic tracking in the operating room. The results showed that the operating room setup disturbs the electromagnetic tracking signal, increasing the root mean square (RMS) distance error from 0.3 mm to 2.3 mm in the center of the measurement volume, but the surgical instruments and the ultrasound probe added no further inaccuracies. Tracked surgical tools, such as endoscopes, pointers, and probes, allowed surgeons to interactively control the display of both registered preoperative medical images and intraoperatively acquired 3D ultrasound data, and have the potential to increase the safety of guided surgical procedures.
Affiliation(s)
- O V Solberg
- Department of Medical Technology, SINTEF Health Research, Trondheim, Norway.
44
Sielhorst T, Feuerstein M, Navab N. Advanced Medical Displays: A Literature Review of Augmented Reality. J DISPLAY TECHNOL 2008. [DOI: 10.1109/jdt.2008.2001575] [Citation(s) in RCA: 201] [Impact Index Per Article: 12.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/09/2022]
45
Evans KD, Sammet S, Ramos Y, Knopp MV. Image Segmentation for Evaluating Axillary Lymph Nodes. JOURNAL OF DIAGNOSTIC MEDICAL SONOGRAPHY 2008. [DOI: 10.1177/8756479308324954] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
A review is provided of the literature published on image segmentation relative to sonography. Manual and automatic techniques for partitioning a sonogram are highlighted. In addition, a preliminary set of results is provided on the interrater reliability of the manual segmentation of sonographically imaged axillary lymph nodes. The correlation between sonographers conducting manual segmentation is very high (r = 0.9, P < .00 at the .01 alpha level). This work provides additional information on lymph node cubic volume and the agreement between manual and automatic segmentation of axillary lymph nodes.
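The interrater figure quoted above is a Pearson correlation over paired manual-segmentation measurements. As a sketch of the computation only (a hypothetical helper, not code from the article):

```python
import numpy as np

def pearson_r(rater_a, rater_b):
    """Pearson correlation coefficient between two raters' paired
    measurements (e.g. lymph node volumes from manual segmentation)."""
    a = np.asarray(rater_a, dtype=float)
    b = np.asarray(rater_b, dtype=float)
    a = a - a.mean()
    b = b - b.mean()
    return float((a @ b) / np.sqrt((a @ a) * (b @ b)))
```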
Affiliation(s)
- Kevin D. Evans
- Ohio State University, School of Allied Medical Professions, Columbus, Ohio
- Steffen Sammet
- Ohio State University, School of Allied Medical Professions, Columbus, Ohio
- Yvette Ramos
- Ohio State University, School of Allied Medical Professions, Columbus, Ohio
- Michael V. Knopp
- Ohio State University, School of Allied Medical Professions, Columbus, Ohio
46
Hsu PW, Treece GM, Prager RW, Houghton NE, Gee AH. Comparison of freehand 3-D ultrasound calibration techniques using a stylus. ULTRASOUND IN MEDICINE & BIOLOGY 2008; 34:1610-1621. [PMID: 18420335 DOI: 10.1016/j.ultrasmedbio.2008.02.015] [Citation(s) in RCA: 11] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/30/2007] [Revised: 02/09/2008] [Accepted: 02/21/2008] [Indexed: 05/26/2023]
Abstract
In a freehand 3-D ultrasound system, a probe calibration is required to find the rigid-body transformation from the corner of the B-scan to the electrical center of the position sensor. The most intuitive way to perform such a calibration is by locating fiducial points in the scan plane directly with a stylus. The main problem with this approach is the difficulty of aligning the tip of the stylus with the scan plane. The thick beamwidth makes the tip of the stylus visible in the B-scan even if the tip is not exactly at the elevational center of the scan plane. We present a novel stylus and phantom that simplify the alignment process for more accurate probe calibration. We also compare our calibration techniques with a range of styli. We show that our stylus and cone phantom are both simple in design and can achieve point reconstruction accuracies of 2.2 mm and 1.8 mm, respectively, an improvement from the 3.2 mm and 3.6 mm obtained with the sharp and spherical styli. The performance of our cone stylus and phantom lies between that of the state-of-the-art Z-phantom and the Cambridge phantom, which achieve accuracies of 2.5 mm and 1.7 mm, respectively.
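Once such a calibration is found, mapping a B-scan pixel into the tracker's world frame is a pair of rigid transforms applied to the scaled pixel coordinates. A hedged sketch, assuming 4x4 homogeneous matrices and pixel-to-mm scale factors sx, sy (all names are illustrative, not from the paper):

```python
import numpy as np

def image_to_world(u, v, sx, sy, T_image_to_sensor, T_sensor_to_world):
    """Map B-scan pixel (u, v) to world coordinates: scale pixels to mm,
    place the point in the image plane (z = 0), then apply the
    calibration (image->sensor) and tracking (sensor->world) 4x4
    homogeneous rigid transforms."""
    p = np.array([sx * u, sy * v, 0.0, 1.0])
    return (T_sensor_to_world @ T_image_to_sensor @ p)[:3]
```

With identity transforms the function reduces to pure pixel scaling, which is a convenient sanity check when validating a calibration pipeline.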
Affiliation(s)
- Po-Wei Hsu
- Department of Engineering, University of Cambridge, Cambridge, United Kingdom.
47
Langø T, Tangen GA, Mårvik R, Ystgaard B, Yavuz Y, Kaspersen JH, Solberg OV, Hernes TAN. Navigation in laparoscopy--prototype research platform for improved image-guided surgery. MINIM INVASIV THER 2008; 17:17-33. [PMID: 18270874 DOI: 10.1080/13645700701797879] [Citation(s) in RCA: 21] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/22/2022]
Abstract
The manipulation of the surgical field in laparoscopic surgery, through small incisions with rigid instruments, reduces free sight, dexterity, and tactile feedback. To help overcome some of these drawbacks, we present a prototype research and development platform, CustusX, for navigation in minimally invasive therapy. The system can also be used for planning and follow-up studies. With this platform we can import and display a range of medical images, including real-time data such as ultrasound and X-ray, during surgery. Tracked surgical tools, such as pointers, video laparoscopes, graspers, and various probes, allow surgeons to interactively control the display of medical images during the procedure. This paper introduces navigation technologies and methods for laparoscopic therapy, and presents our software and hardware research platform. Furthermore, we illustrate the use of the system with examples from two pilots performed during laparoscopic therapy. We also present new developments that are currently being integrated into the system for future use in the operating room. Our initial results from pilot studies using this technology with preoperative images and guidance in the retroperitoneum during laparoscopy are promising. Finally, we briefly describe an ongoing multicenter study using this surgical navigation system platform.
Affiliation(s)
- T Langø
- SINTEF Health Research, Dept. Medical Technology, Trondheim, Norway.
48
Oshiro M, Nishimura T. A contour extraction method using active contour model on ultrasonic images. ANNU INT CONF IEEE ENG MED BIOL SOC 2008; 2007:825-8. [PMID: 18002083 DOI: 10.1109/iembs.2007.4352417] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/06/2022]
Abstract
During excision operations for a breast tumor, navigation that displays three-dimensional models is needed to grasp the position and size of the tumor and to decide the excision area. Speckle noise, which is characteristic of ultrasonic images, is caused by interference of sound waves. The noise lowers the resolution of the region of interest (ROI) and is an obstacle to constructing a recognizable three-dimensional image. To reconstruct a three-dimensional model from two-dimensional ultrasonic tomograms, speckle reduction and contour extraction of the ROI are required. The purpose of this study is contour extraction of the ROI on ultrasonic images. An active contour model using gradient vector flow was employed, and the contour of the lesion area was extracted from the speckle-reduced ultrasonic images.
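The gradient vector flow (GVF) field that drives the active contour is computed by diffusing the edge-map gradient so that the force field reaches into homogeneous regions. A minimal numpy sketch of that iteration (the parameters mu, iters, and dt, and the wrap-around boundary handling, are illustrative choices, not the paper's settings):

```python
import numpy as np

def _laplacian(a):
    # 4-neighbour Laplacian with wrap-around boundaries (adequate for a sketch)
    return (np.roll(a, 1, 0) + np.roll(a, -1, 0) +
            np.roll(a, 1, 1) + np.roll(a, -1, 1) - 4 * a)

def gradient_vector_flow(edge_map, mu=0.2, iters=100, dt=0.25):
    """GVF (Xu & Prince): relax the field (u, v) toward smoothness where
    the edge gradient is weak and toward the gradient itself where it
    is strong, by explicit gradient-descent iterations."""
    fx, fy = np.gradient(edge_map)
    u, v = fx.copy(), fy.copy()
    mag2 = fx ** 2 + fy ** 2
    for _ in range(iters):
        u = u + dt * (mu * _laplacian(u) - mag2 * (u - fx))
        v = v + dt * (mu * _laplacian(v) - mag2 * (v - fy))
    return u, v
```

The resulting (u, v) field replaces the raw image gradient as the external force in the snake evolution, which is what lets the contour converge from farther away despite speckle-reduced, low-contrast edges.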
Affiliation(s)
- Masakuni Oshiro
- Graduate School of Information, Production and Systems, Waseda University, 2-7 Hibikino, Wakamatsu-ku, Kitakyushu, Japan.
49
Nakamoto M, Nakada K, Sato Y, Konishi K, Hashizume M, Tamura S. Intraoperative magnetic tracker calibration using a magneto-optic hybrid tracker for 3-D ultrasound-based navigation in laparoscopic surgery. IEEE TRANSACTIONS ON MEDICAL IMAGING 2008; 27:255-270. [PMID: 18334447 DOI: 10.1109/tmi.2007.911003] [Citation(s) in RCA: 38] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/26/2023]
Abstract
This paper describes a three-dimensional ultrasound (3-D US) system that aims to achieve augmented reality (AR) visualization during laparoscopic surgery, especially of the liver. To acquire 3-D US data of the liver, the tip of a laparoscopic ultrasound probe is tracked inside the abdominal cavity using a magnetic tracker. The accuracy of magnetic trackers, however, is greatly affected by magnetic field distortion that results from the close proximity of metal objects and electronic equipment, which is usually unavoidable in the operating room. In this paper, we describe a calibration method for intraoperative magnetic distortion that can be applied to laparoscopic 3-D US data acquisition; we evaluate the accuracy and feasibility of the method by in vitro and in vivo experiments. Although calibration data can be acquired freehand using a magneto-optic hybrid tracker, two problems are associated with this method: error caused by the time delay between measurements of the optical and magnetic trackers, and instability of the calibration accuracy resulting from the uniformity and density of the calibration data. A temporal calibration procedure is developed to estimate the time delay, which is then integrated into the calibration, and a distortion model is formulated by zeroth- to fourth-degree polynomial fitting to the calibration data. In the in vivo experiment using a pig, the positional error caused by magnetic distortion was reduced from 44.1 to 2.9 mm. The standard deviation of the corrected target positions was less than 1.0 mm. Freehand acquisition of calibration data was performed smoothly using a magneto-optic hybrid sampling tool through a trocar, guided by real-time 3-D monitoring of the tool trajectory; data acquisition time was less than 2 min. The present study suggests that our proposed method can correct for magnetic field distortion inside the patient's abdomen during a laparoscopic procedure within a clinically permissible period of time, as well as enabling an accurate 3-D US reconstruction to be obtained that can be superimposed onto live endoscopic images.
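The polynomial distortion model described above reduces to an ordinary least-squares fit of per-axis polynomials in the magnetic coordinates against the optically tracked ground truth. A hedged sketch of that fit (function names and the degree-2 default are assumptions for illustration, not the authors' code):

```python
import numpy as np

def fit_distortion_correction(magnetic_pts, optical_pts, degree=2):
    """Fit per-axis polynomials in the magnetic coordinates that map
    distorted magnetic readings (N x 3) onto optically tracked positions
    (N x 3) via linear least squares on a polynomial design matrix;
    returns a callable that corrects new magnetic readings."""
    def design(pts):
        x, y, z = np.asarray(pts, dtype=float).T
        cols = [np.ones_like(x)]
        for d in range(1, degree + 1):            # all monomials of total degree d
            for i in range(d + 1):
                for j in range(d - i + 1):
                    cols.append(x ** i * y ** j * z ** (d - i - j))
        return np.stack(cols, axis=1)
    coeffs, *_ = np.linalg.lstsq(design(magnetic_pts), optical_pts, rcond=None)
    return lambda pts: design(pts) @ coeffs
```

In the paper's setting the ground-truth positions come from the optical half of the magneto-optic hybrid tracker, after the temporal calibration has removed the optical/magnetic time offset.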
Affiliation(s)
- Masahiko Nakamoto
- Division of Image Analysis, Osaka University Graduate School of Medicine, Osaka, Japan
50
Hsu PW, Prager RW, Gee AH, Treece GM. Real-time freehand 3D ultrasound calibration. ULTRASOUND IN MEDICINE & BIOLOGY 2008; 34:239-251. [PMID: 17935870 DOI: 10.1016/j.ultrasmedbio.2007.07.020] [Citation(s) in RCA: 17] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 10/08/2006] [Revised: 06/22/2007] [Accepted: 07/27/2007] [Indexed: 05/25/2023]
Abstract
Z-fiducial phantoms allow three-dimensional ultrasound probe calibration with a single B-scan. One of the main difficulties in using this phantom is the need for reliable segmentation of the wires in the ultrasound images, which necessitates manual intervention. In this article, we have shown how we can solve this problem by mounting a thin rubber membrane on top of the phantom. The membrane is segmented automatically and the wires can be easily located, as they are at known positions relative to the membrane. This enables us to segment the wires automatically at the full PAL frame rate of 25 Hz, to produce calibrations in real-time, while achieving accuracies similar to those reported in the literature. We have also devised a technique to improve the estimation of the elevational offset (calibration parameter) by capturing a few images of the planar membrane. If spatial calibration is known, fully automatic wire segmentation allows the fiducials to be tracked in real-time. This also enables temporal calibration to be performed in real-time as the probe is moved away from the phantom. We have evaluated the performance of our phantom by calibrating a probe at 8 cm and 15 cm depth. The precisions of the calibrations are 0.7 mm and 1.2 mm, respectively. The point reconstruction accuracies of fiducial points provided by the same Z-phantom are slightly below 1.5 mm. The point reconstruction accuracies obtained by scanning the end of a wire tip are 2.5 mm and 3.0 mm. These results match the accuracies achieved in the literature. It takes approximately 2 min to set up the experiment, submerge the phantom in the water bath, locate the phantom in space with a pointer and capture six images of the planar membrane. After this, spatial calibration can be performed in less than a second. Temporal calibration can be completed in approximately 3 s.
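For reference, the geometric principle behind a Z-fiducial: the echo of the diagonal wire divides it in the same ratio as the in-image distances to the two parallel-wire echoes, which yields a known 3-D point from a single B-scan. A small sketch of that similarity relation (argument names are illustrative):

```python
import numpy as np

def z_fiducial_point(p1, p2, d1, d2):
    """A Z-fiducial's diagonal wire runs between known phantom points p1
    and p2. The in-image distances d1, d2 from the diagonal-wire echo to
    the two parallel-wire echoes fix the fraction along the diagonal, so
    the imaged 3-D point is p1 + d1 / (d1 + d2) * (p2 - p1)."""
    p1, p2 = np.asarray(p1, dtype=float), np.asarray(p2, dtype=float)
    t = d1 / (d1 + d2)
    return p1 + t * (p2 - p1)
```

Each segmented Z-fiducial thus contributes one image-point/phantom-point pair, and a set of such pairs from a single B-scan is enough to solve for the calibration transform.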
Affiliation(s)
- Po-Wei Hsu
- Department of Engineering, University of Cambridge, Trumpington Street, Cambridge, United Kingdom.