1. Gefter WB, Prokop M, Seo JB, Raoof S, Langlotz CP, Hatabu H. Human-AI Symbiosis: A Path Forward to Improve Chest Radiography and the Role of Radiologists in Patient Care. Radiology 2024;310:e232778. PMID: 38259206; PMCID: PMC10831473; DOI: 10.1148/radiol.232778.
Affiliation(s)
- From the Department of Radiology, Penn Medicine, University of Pennsylvania, Philadelphia, Pa (W.B.G.); Department of Radiology and Nuclear Medicine, Radboud University Medical Center, Nijmegen, the Netherlands (M.P.); Department of Radiology, Research Institute of Radiology, University of Ulsan College of Medicine, Asan Medical Center, Seoul, South Korea (J.B.S.); Department of Medicine and Radiology, Zucker School of Medicine, Hofstra/Northwell and Lung Institute, Lenox Hill Hospital, New York, NY (S.R.); Department of Radiology and Biomedical Informatics and Center for Artificial Intelligence in Medicine and Imaging, Stanford University, Palo Alto, Calif (C.P.L.); and Center for Pulmonary Functional Imaging, Department of Radiology, Brigham and Women’s Hospital and Harvard Medical School, 75 Francis St, Boston, MA 02215 (H.H.)
2. Yu HQ, O’Neill S, Kermanizadeh A. AIMS: An Automatic Semantic Machine Learning Microservice Framework to Support Biomedical and Bioengineering Research. Bioengineering (Basel) 2023;10:1134. PMID: 37892864; PMCID: PMC10603862; DOI: 10.3390/bioengineering10101134.
Abstract
The fusion of machine learning and biomedical research offers novel ways to understand, diagnose, and treat various health conditions. However, the complexities of biomedical data, coupled with the intricate process of developing and deploying machine learning solutions, often pose significant challenges to researchers in these fields. Our pivotal achievement in this research is the introduction of the Automatic Semantic Machine Learning Microservice (AIMS) framework. AIMS addresses these challenges by automating various stages of the machine learning pipeline, with a particular emphasis on an ontology of machine learning services tailored to the biomedical domain. This ontology encompasses everything from task representation, service modeling, and knowledge acquisition to knowledge reasoning and the establishment of a self-supervised learning policy. Our framework has been crafted to prioritize model interpretability, integrate domain knowledge effortlessly, and handle biomedical data with efficiency. Additionally, AIMS has a distinctive feature: it leverages self-supervised knowledge learning through reinforcement learning techniques, paired with an ontology-based policy recording schema. This enables it to autonomously generate, fine-tune, and continually adapt machine learning models, especially when faced with new tasks and data. Our work makes two standout contributions: it demonstrates that machine learning processes in the biomedical domain can be automated while integrating a rich domain knowledge base, and it gives machines a self-learning ability that ensures they handle new tasks effectively. To showcase AIMS in action, we have highlighted its capabilities in three case studies of biomedical tasks. These examples emphasize how our framework can simplify research routines, raise the caliber of scientific exploration, and set the stage for notable advances.
3. Chepelev LL, Kwan D, Kahn CE, Filice RW, Wang KC. Ontologies in the New Computational Age of Radiology: RadLex for Semantics and Interoperability in Imaging Workflows. Radiographics 2023;43:e220098. PMID: 36757882; DOI: 10.1148/rg.220098.
Abstract
From basic research to the bedside, precise terminology is key to advancing medicine and ensuring optimal and appropriate patient care. However, the wide spectrum of diseases and their manifestations superimposed on medical team-specific and discipline-specific communication patterns often impairs shared understanding and the shared use of common medical terminology. Common terms are currently used in medicine to ensure interoperability and facilitate integration of biomedical information for clinical practice and emerging scientific and educational applications alike, from database integration to supporting basic clinical operations such as billing. Such common terminologies can be provided in ontologies, which are formalized representations of knowledge in a particular domain. Ontologies unambiguously specify common concepts and describe the relationships between those concepts by using a form that is mathematically precise and accessible to humans and machines alike. RadLex® is a key RSNA initiative that provides a shared domain model, or ontology, of radiology to facilitate integration of information in radiology education, clinical care, and research. As the contributions of the computational components of common radiologic workflows continue to increase with the ongoing development of big data, artificial intelligence, and novel image analysis and visualization tools, the use of common terminologies is becoming increasingly important for supporting seamless computational resource integration across medicine. This article introduces ontologies, outlines the fundamental semantic web technologies used to create and apply RadLex, and presents examples of RadLex applications in everyday radiology and research. It concludes with a discussion of emerging applications of RadLex, including artificial intelligence applications. © RSNA, 2023. Quiz questions for this article are available in the supplemental material.
Affiliation(s)
- From the Joint Department of Medical Imaging, University Health Network, University of Toronto, Toronto General Hospital, 585 University Ave, 1-PMB 286, Toronto, ON, Canada M5G 2N2 (L.L.C.); Insygnia Consulting, Toronto, ON, Canada (D.K.); Department of Radiology, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA (C.E.K.); Department of Radiology, MedStar Georgetown University Hospital, Washington, DC (R.W.F.); and Imaging Service, Baltimore VA Medical Center, Baltimore, MD, and Department of Diagnostic Radiology and Nuclear Medicine, University of Maryland School of Medicine, Baltimore, MD (K.C.W.)