1
Isaieva K, Fauvel M, Weber N, Vuissoz PA, Felblinger J, Oster J, Odille F. A hardware and software system for MRI applications requiring external device data. Magn Reson Med 2022; 88:1406-1418. PMID: 35506503. DOI: 10.1002/mrm.29280.
Abstract
PURPOSE Numerous MRI applications require data from external devices. Such devices are often independent of the MRI system, so synchronizing their data with the MRI data is tedious and typically limited to offline use. In this work, a hardware and software system is proposed for acquiring data from external devices during MR imaging, for use online (in real time) or offline. METHODS The hardware includes a set of external devices - electrocardiography (ECG) devices, respiration sensors, a microphone, MR system electronics, etc. - using various channels for data transmission (analog, digital, optical fibers), all connected to a server through a universal serial bus (USB) hub. The software is based on a flexible client-server architecture, allowing real-time processing pipelines to be configured and executed. Communication protocols and data formats are proposed, in particular for transferring the external device data to an open-source reconstruction software (Gadgetron), enabling online image reconstruction using external physiological data. The system performance is evaluated in terms of the accuracy of the recorded signals and the delays involved in the real-time processing tasks. Its flexibility is demonstrated with various applications. RESULTS The real-time system had low delays and jitters (on the order of 1 ms). Example MRI applications using external devices included prospectively gated cardiac cine imaging, multi-modal acquisition of the vocal tract (image, sound, and respiration), and online image reconstruction with nonrigid motion correction. CONCLUSION The performance of the system and its versatile architecture make it suitable for a wide range of MRI applications requiring online or offline use of external device data.
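The core task this abstract describes, aligning asynchronous device samples with MR acquisition timestamps, can be illustrated with a minimal sketch. The function name and the nearest-neighbour lookup are illustrative assumptions, not the paper's actual Gadgetron-based implementation:

```python
from bisect import bisect_left

def sample_at(times, values, t):
    """Return the device sample closest in time to t.

    times  : sorted list of device timestamps (seconds)
    values : device samples, same length as times
    """
    i = bisect_left(times, t)
    if i == 0:
        return values[0]
    if i == len(times):
        return values[-1]
    # pick whichever neighbour is closer to t
    return values[i] if times[i] - t < t - times[i - 1] else values[i - 1]

# Toy example: ECG samples at ~1 kHz, an MR readout at an arbitrary time
ecg_t = [0.000, 0.001, 0.002, 0.003]
ecg_v = [0.10, 0.40, 0.80, 0.20]
print(sample_at(ecg_t, ecg_v, 0.0016))  # 0.8 (nearest sample is at t=0.002)
```

A real system would also have to correct for transmission delays and clock drift between the device and the scanner, which is part of what the paper's hardware characterization quantifies.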
Affiliation(s)
- Karyna Isaieva
- IADI, Université de Lorraine, INSERM U1254, Nancy, France
- Marc Fauvel
- CIC-IT 1433, Université de Lorraine, INSERM, CHRU de Nancy, Nancy, France
- Nicolas Weber
- IADI, Université de Lorraine, INSERM U1254, Nancy, France
- Jacques Felblinger
- IADI, Université de Lorraine, INSERM U1254, Nancy, France; CIC-IT 1433, Université de Lorraine, INSERM, CHRU de Nancy, Nancy, France
- Julien Oster
- IADI, Université de Lorraine, INSERM U1254, Nancy, France
- Freddy Odille
- IADI, Université de Lorraine, INSERM U1254, Nancy, France; CIC-IT 1433, Université de Lorraine, INSERM, CHRU de Nancy, Nancy, France
2
Madore B, Belsley G, Cheng CC, Preiswerk F, Foley Kijewski M, Wu PH, Martell LB, Pluim JPW, Di Carli M, Moore SC. Ultrasound-based sensors for respiratory motion assessment in multimodality PET imaging. Phys Med Biol 2021; 67. PMID: 34891142. DOI: 10.1088/1361-6560/ac4213.
Abstract
Breathing motion can displace internal organs by up to several cm; as such, it is a primary factor limiting image quality in medical imaging. Motion can also complicate matters when trying to fuse images from different modalities, acquired at different locations and/or on different days. Currently available devices for monitoring breathing motion often do so indirectly, by detecting changes in the outline of the torso rather than the internal motion itself, and these devices are often fixed to floors, ceilings or walls, and thus cannot accompany patients from one location to another. We have developed small ultrasound-based sensors, referred to as 'organ configuration motion' (OCM) sensors, that attach to the skin and provide rich motion-sensitive information. In the present work we tested the ability of OCM sensors to enable respiratory gating during in vivo PET imaging. A motion phantom involving an FDG solution was assembled, and two cancer patients scheduled for a clinical PET/CT exam were recruited for this study. OCM signals were used to help reconstruct phantom and in vivo data into time series of motion-resolved images. As expected, the motion-resolved images captured the underlying motion. In Patient #1, a single large lesion proved to be mostly stationary through the breathing cycle. However, in Patient #2, several small lesions were mobile during breathing, and our proposed new approach captured their breathing-related displacements. In summary, a relatively inexpensive hardware solution was developed here for respiration monitoring. Because the proposed sensors attach to the skin, as opposed to walls or ceilings, they can accompany patients from one procedure to the next, potentially allowing data gathered in different places and at different times to be combined and compared in ways that account for breathing motion.
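Amplitude-based respiratory gating of the kind used to produce motion-resolved images can be sketched as follows; the binning function is a hypothetical simplification of the authors' reconstruction pipeline:

```python
def gate_by_amplitude(signal, n_bins):
    """Assign each time point of a breathing surrogate signal to one of
    n_bins amplitude bins (0..n_bins-1); events acquired in the same bin
    can then be reconstructed together into one motion-resolved image."""
    lo, hi = min(signal), max(signal)
    width = (hi - lo) / n_bins or 1.0  # guard against a flat signal
    return [min(int((s - lo) / width), n_bins - 1) for s in signal]

# Toy breathing trace gated into 2 bins (end-expiration vs end-inspiration)
breathing = [0.0, 0.3, 0.7, 1.0, 0.6, 0.2]
print(gate_by_amplitude(breathing, 2))  # [0, 0, 1, 1, 1, 0]
```

Phase-based gating (binning by position within the breathing cycle rather than by amplitude) is a common alternative; which the authors used here is not stated in the abstract.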
Affiliation(s)
- Bruno Madore
- Harvard Medical School, Department of Radiology, Brigham and Women's Hospital, 75 Francis Street, Boston, Massachusetts, 02115, UNITED STATES
- Gabriela Belsley
- Oxford Centre for Clinical Magnetic Resonance Research, Radcliffe Department of Medicine, University of Oxford, Oxford, OX3 9DU, UNITED KINGDOM OF GREAT BRITAIN AND NORTHERN IRELAND
- Cheng-Chieh Cheng
- Computer Science and Engineering, National Sun Yat-sen University, 70 Lianhai Road, Kaohsiung, 804, TAIWAN
- Frank Preiswerk
- Amazon Robotics, 50 Otis St, Westborough, Massachusetts, 01581, UNITED STATES
- Marie Foley Kijewski
- Harvard Medical School, Department of Radiology, Brigham and Women's Hospital, 75 Francis Street, Boston, Massachusetts, 02115, UNITED STATES
- Pei-Hsin Wu
- Electrical Engineering, National Sun Yat-sen University, 70 Lianhai Road, Kaohsiung, 804, TAIWAN
- Laurel B Martell
- Department of Radiology, Brigham and Women's Hospital, 75 Francis Street, Boston, Massachusetts, 02115, UNITED STATES
- Josien P W Pluim
- Department of Biomedical Engineering, Eindhoven University of Technology, PO Box 513, 5600 MB Eindhoven, NETHERLANDS
- Marcelo Di Carli
- Harvard Medical School, Department of Radiology, Brigham and Women's Hospital, 75 Francis Street, Boston, Massachusetts, 02115, UNITED STATES
- Stephen C Moore
- Department of Radiology, University of Pennsylvania Perelman School of Medicine, 3400 Civic Center Blvd, Philadelphia, Pennsylvania, 19104, UNITED STATES
3
Madore B, Preiswerk F, Bredfeldt JS, Zong S, Cheng CC. Ultrasound-based sensors to monitor physiological motion. Med Phys 2021; 48:3614-3622. PMID: 33999423. DOI: 10.1002/mp.14949.
Abstract
PURPOSE Medical procedures can be difficult to perform on anatomy that is constantly moving. Respiration displaces internal organs by up to several centimeters with respect to the surface of the body, and patients often have limited ability to hold their breath. Strategies to compensate for motion during diagnostic and therapeutic procedures require reliable information to be available. However, current devices often monitor respiration indirectly, through changes on the outline of the body, and they may be fixed to floors or ceilings, and thus unable to follow a given patient through different locations. Here we show that small ultrasound-based sensors referred to as "organ configuration motion" (OCM) sensors can be fixed to the abdomen and/or chest and provide information-rich, breathing-related signals. METHODS By design, the proposed sensors are relatively inexpensive. Breathing waveforms were obtained from tissues at varying depths and/or using different sensor placements. Validation was performed against breathing waveforms derived from magnetic resonance imaging (MRI) and optical tracking signals in five and eight volunteers, respectively. RESULTS Breathing waveforms from different modalities were scaled so they could be directly compared. Differences between waveforms were expressed in the form of a percentage, as compared to the amplitude of a typical breath. Expressed in this manner, for shallow tissues, OCM-derived waveforms on average differed from MRI and optical tracking results by 13.1% and 15.5%, respectively. CONCLUSION The present results suggest that the proposed sensors provide measurements that properly characterize breathing states. While OCM-based waveforms from shallow tissues proved similar in terms of information content to those derived from MRI or optical tracking, OCM further captured depth-dependent and position-dependent (i.e., chest and abdomen) information. In time, the richer information content of OCM-based waveforms may enable better respiratory gating to be performed, to allow diagnostic and therapeutic equipment to perform at their best.
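The waveform comparison described in the RESULTS, scaling two breathing traces so they are directly comparable and expressing their difference as a percentage of a typical breath amplitude, might look like this minimal sketch (function names and the min-max normalization choice are assumptions, not the paper's exact procedure):

```python
def percent_difference(a, b):
    """Mean absolute difference between two breathing waveforms,
    expressed as a percentage of the peak-to-peak breath amplitude,
    after rescaling both waveforms to the [0, 1] range."""
    def rescale(w):
        lo, hi = min(w), max(w)
        return [(x - lo) / (hi - lo) for x in w]
    a, b = rescale(a), rescale(b)
    return 100.0 * sum(abs(x - y) for x, y in zip(a, b)) / len(a)

# Toy OCM-derived vs MRI-derived breathing traces over one breath
ocm = [0.0, 0.5, 1.0, 0.5, 0.0]
mri = [0.0, 0.45, 1.0, 0.55, 0.0]
print(round(percent_difference(ocm, mri), 1))  # 2.0
```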
Affiliation(s)
- Bruno Madore
- Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, Boston, MA, USA
- Frank Preiswerk
- Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, Boston, MA, USA; Amazon Robotics, North Reading, MA, USA
- Jeremy S Bredfeldt
- Department of Radiation Oncology, Brigham and Women's Hospital, Harvard Medical School, Boston, MA, USA
- Shenyan Zong
- Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, Boston, MA, USA
- Cheng-Chieh Cheng
- Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, Boston, MA, USA; Department of Computer Science and Engineering, National Sun Yat-sen University, Kaohsiung, Taiwan
5
Giger A, Stadelmann M, Preiswerk F, Jud C, De Luca V, Celicanin Z, Bieri O, Salomir R, Cattin PC. Ultrasound-driven 4D MRI. Phys Med Biol 2018; 63:145015. PMID: 29864021. DOI: 10.1088/1361-6560/aaca1d.
Abstract
We present an ultrasound-driven 4D magnetic resonance imaging (US-4DMRI) method for respiratory motion imaging in the thorax and abdomen. The proposed US-4DMRI offers high temporal resolution and allows organ motion imaging beyond a single respiratory cycle. With the US surrogate available both inside and outside the MR bore, 4D MR images can be reconstructed for 4D treatment planning and online respiratory motion prediction during radiotherapy. US-4DMRI relies on simultaneously acquired 2D liver US images and abdominal 2D MR multi-slice scans under free respiration. MR volumes are retrospectively composed by grouping the MR slices corresponding to the most similar US images. We present two different US similarity metrics: an intensity-based approach, and a similarity measure relying on predefined fiducials that are tracked over time. The proposed method is demonstrated on MR liver scans of eight volunteers, each acquired over a duration of 5.5 min at a temporal resolution of 2.6 Hz with synchronous US imaging at 14-17 Hz. Visual inspection of the reconstructed MR volumes revealed satisfactory results in terms of continuity in organ boundaries and blood vessels. In quantitative leave-one-out experiments, both US similarity metrics reach the performance level of state-of-the-art navigator-based approaches.
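The retrospective slice-grouping step, selecting the MR slice whose concurrent US frame best matches a reference US image, can be sketched with an intensity-based metric. Sum of squared differences is one plausible choice; the paper's exact metric and data structures may differ:

```python
def most_similar(reference, candidates):
    """Index of the candidate US frame most similar to the reference,
    using sum-of-squared-differences over pixel intensities."""
    def ssd(u, v):
        return sum((x - y) ** 2 for x, y in zip(u, v))
    return min(range(len(candidates)), key=lambda i: ssd(reference, candidates[i]))

# Toy 1-D "US frames": the MR slice acquired together with the winning
# US frame would be assigned to the reference breathing state.
ref = [0.2, 0.8, 0.5]
frames = [[0.9, 0.1, 0.4], [0.25, 0.75, 0.5], [0.0, 0.0, 0.0]]
print(most_similar(ref, frames))  # 1
```

Repeating this per slice position, and stacking the winners, yields one MR volume per breathing state, which is the essence of the retrospective 4D reconstruction described above.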
Affiliation(s)
- Alina Giger
- Department of Biomedical Engineering, University of Basel, Allschwil, Switzerland; Center for Medical Image Analysis & Navigation, University of Basel, Allschwil, Switzerland
6
Preiswerk F, Toews M, Cheng CC, Chiou JYG, Mei CS, Schaefer LF, Hoge WS, Schwartz BM, Panych LP, Madore B. Hybrid MRI-Ultrasound acquisitions, and scannerless real-time imaging. Magn Reson Med 2017; 78:897-908. PMID: 27739101. DOI: 10.1002/mrm.26467.
Abstract
PURPOSE To combine MRI, ultrasound, and computer science methodologies toward generating MRI contrast at the high frame rates of ultrasound, inside and even outside the MRI bore. METHODS A small transducer, held onto the abdomen with an adhesive bandage, collected ultrasound signals during MRI. Based on these ultrasound signals and their correlations with MRI, a machine-learning algorithm created synthetic MR images at frame rates up to 100 per second. In one particular implementation, volunteers were taken out of the MRI bore with the ultrasound sensor still in place, and MR images were generated on the basis of ultrasound signal and learned correlations alone in a "scannerless" manner. RESULTS Hybrid ultrasound-MRI data were acquired in eight separate imaging sessions. Locations of liver features, in synthetic images, were compared with those from acquired images: The mean error was 1.0 pixel (2.1 mm), with best case 0.4 and worst case 4.1 pixels (in the presence of heavy coughing). For results from outside the bore, qualitative validation involved optically tracked ultrasound imaging with/without coughing. CONCLUSION The proposed setup can generate an accurate stream of high-speed MR images, up to 100 frames per second, inside or even outside the MR bore. Magn Reson Med 78:897-908, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
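A drastically simplified stand-in for the learned ultrasound-to-MRI mapping is nearest-neighbour regression over hybrid training pairs; the paper's machine-learning algorithm is more sophisticated, and every name below is illustrative:

```python
def synthesize(us_now, us_train, mr_train, k=2):
    """Predict an MR image for a newly arrived US signal by averaging
    the MR images paired with the k nearest training US signals.

    us_now   : current US signal (list of floats)
    us_train : US signals recorded during the hybrid MRI-US session
    mr_train : MR images (flattened pixel lists) paired with us_train
    """
    def dist(u, v):
        return sum((x - y) ** 2 for x, y in zip(u, v))
    nearest = sorted(range(len(us_train)), key=lambda i: dist(us_now, us_train[i]))[:k]
    n_pix = len(mr_train[0])
    return [sum(mr_train[i][p] for i in nearest) / k for p in range(n_pix)]

# Toy 1-sample US signals paired with 2-pixel "MR images"
us_train = [[0.0], [0.5], [1.0]]
mr_train = [[10.0, 0.0], [5.0, 5.0], [0.0, 10.0]]
print(synthesize([0.4], us_train, mr_train))  # [7.5, 2.5]
```

Because the lookup uses only the US signal once training is done, the same mechanism works with the subject outside the bore, which is the "scannerless" idea in the abstract.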
Affiliation(s)
- Frank Preiswerk
- Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts, USA
- Matthew Toews
- The Laboratory for Imagery, Vision and Artificial Intelligence, École de Technologie Supérieure, Montréal, QC, Canada
- Cheng-Chieh Cheng
- Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts, USA
- Jr-Yuan George Chiou
- Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts, USA
- Lena F Schaefer
- Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts, USA
- W Scott Hoge
- Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts, USA
- Lawrence P Panych
- Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts, USA
- Bruno Madore
- Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts, USA