1
Economou-Zavlanos NJ, Bessias S, Cary MP, Bedoya AD, Goldstein BA, Jelovsek JE, O’Brien CL, Walden N, Elmore M, Parrish AB, Elengold S, Lytle KS, Balu S, Lipkin ME, Shariff AI, Gao M, Leverenz D, Henao R, Ming DY, Gallagher DM, Pencina MJ, Poon EG. Translating ethical and quality principles for the effective, safe and fair development, deployment and use of artificial intelligence technologies in healthcare. J Am Med Inform Assoc 2024;31:705-713. PMID: 38031481; PMCID: PMC10873841; DOI: 10.1093/jamia/ocad221.
Abstract
OBJECTIVE The complexity and rapid pace of development of algorithmic technologies pose challenges for their regulation and oversight in healthcare settings. We sought to improve our institution's approach to evaluation and governance of algorithmic technologies used in clinical care and operations by creating an Implementation Guide that standardizes evaluation criteria so that local oversight is performed in an objective fashion. MATERIALS AND METHODS Building on a framework that applies key ethical and quality principles (clinical value and safety, fairness and equity, usability and adoption, transparency and accountability, and regulatory compliance), we created concrete guidelines for evaluating algorithmic technologies at our institution. RESULTS An Implementation Guide articulates evaluation criteria used during review of algorithmic technologies and details what evidence supports the implementation of ethical and quality principles for trustworthy health AI. Application of the processes described in the Implementation Guide can lead to algorithms that are safer as well as more effective, fair, and equitable upon implementation, as illustrated through 4 examples of technologies at different phases of the algorithmic lifecycle that underwent evaluation at our academic medical center. DISCUSSION By providing clear descriptions/definitions of evaluation criteria and embedding them within standardized processes, we streamlined oversight processes and educated communities using and developing algorithmic technologies within our institution. CONCLUSIONS We developed a scalable, adaptable framework for translating principles into evaluation criteria and specific requirements that support trustworthy implementation of algorithmic technologies in patient care and healthcare operations.
Affiliation(s)
- Sophia Bessias
  - Duke AI Health, Duke University School of Medicine, Durham, NC 27705, United States
- Michael P Cary
  - Duke AI Health, Duke University School of Medicine, Durham, NC 27705, United States
  - Duke University School of Nursing, Durham, NC 27710, United States
- Armando D Bedoya
  - Duke Health Technology Solutions, Duke University Health System, Durham, NC 27705, United States
  - Department of Medicine, Duke University School of Medicine, Durham, NC 27710, United States
- Benjamin A Goldstein
  - Duke AI Health, Duke University School of Medicine, Durham, NC 27705, United States
  - Department of Biostatistics and Bioinformatics, Duke University School of Medicine, Durham, NC 27705, United States
- John E Jelovsek
  - Department of Obstetrics and Gynecology, Duke University School of Medicine, Durham, NC 27710, United States
- Cara L O’Brien
  - Duke Health Technology Solutions, Duke University Health System, Durham, NC 27705, United States
  - Department of Medicine, Duke University School of Medicine, Durham, NC 27710, United States
- Nancy Walden
  - Duke AI Health, Duke University School of Medicine, Durham, NC 27705, United States
- Matthew Elmore
  - Duke AI Health, Duke University School of Medicine, Durham, NC 27705, United States
- Amanda B Parrish
  - Office of Regulatory Affairs and Quality, Duke University School of Medicine, Durham, NC 27705, United States
- Scott Elengold
  - Office of Counsel, Duke University, Durham, NC 27701, United States
- Kay S Lytle
  - Duke University School of Nursing, Durham, NC 27710, United States
  - Duke Health Technology Solutions, Duke University Health System, Durham, NC 27705, United States
- Suresh Balu
  - Duke Institute for Health Innovation, Duke University, Durham, NC 27701, United States
- Michael E Lipkin
  - Department of Urology, Duke University School of Medicine, Durham, NC 27710, United States
- Afreen Idris Shariff
  - Department of Medicine, Duke University School of Medicine, Durham, NC 27710, United States
  - Duke Endocrine-Oncology Program, Duke University Health System, Durham, NC 27710, United States
- Michael Gao
  - Duke Institute for Health Innovation, Duke University, Durham, NC 27701, United States
- David Leverenz
  - Department of Medicine, Duke University School of Medicine, Durham, NC 27710, United States
- Ricardo Henao
  - Department of Biostatistics and Bioinformatics, Duke University School of Medicine, Durham, NC 27705, United States
  - Department of Bioengineering, King Abdullah University of Science and Technology, Thuwal 23955, Saudi Arabia
- David Y Ming
  - Department of Medicine, Duke University School of Medicine, Durham, NC 27710, United States
  - Duke Department of Pediatrics, Duke University Health System, Durham, NC 27705, United States
  - Department of Population Health Sciences, Duke University School of Medicine, Durham, NC 27701, United States
- David M Gallagher
  - Department of Medicine, Duke University School of Medicine, Durham, NC 27710, United States
- Michael J Pencina
  - Duke AI Health, Duke University School of Medicine, Durham, NC 27705, United States
  - Department of Biostatistics and Bioinformatics, Duke University School of Medicine, Durham, NC 27705, United States
- Eric G Poon
  - Duke Health Technology Solutions, Duke University Health System, Durham, NC 27705, United States
  - Department of Medicine, Duke University School of Medicine, Durham, NC 27710, United States
  - Department of Biostatistics and Bioinformatics, Duke University School of Medicine, Durham, NC 27705, United States
2
Westra BL, Whittenburg L, Lytle KS, Tokareva I, Umberfield EE, Leverette M, Buchleiter R, Johnson S, Jobman L. Clinical Knowledge Model for the Prevention of Healthcare-Associated Venous Thromboembolism. Comput Inform Nurs 2024;42:144-150. PMID: 38241731; DOI: 10.1097/cin.0000000000001088.
Abstract
Knowledge models inform organizational behavior through the logical association of documentation processes, definitions, data elements, and value sets. The development of a well-designed knowledge model allows for the reuse of electronic health record data to promote efficiency in practice, data interoperability, and the extensibility of data to new capabilities or functionality such as clinical decision support, quality improvement, and research. The purpose of this article is to describe the development and validation of a knowledge model for healthcare-associated venous thromboembolism prevention. The team used FloMap, an Internet-based survey resource, to compare metadata from six healthcare organizations to an initial draft model. The team used consensus decision-making over time to compare survey results. The resulting model included seven panels, 41 questions, and 231 values. A second validation step included completion of an Internet-based survey with 26 staff nurse respondents representing 15 healthcare organizations, two electronic health record vendors, and one academic institution. The final knowledge model contained nine Logical Observation Identifiers Names and Codes panels, 32 concepts, and 195 values representing an additional six panels (groupings), 15 concepts (questions), and the specification of 195 values (answers). The final model is useful for consistent documentation to demonstrate the contribution of nursing practice to the prevention of venous thromboembolism.
Affiliation(s)
- Bonnie L Westra
- Author Affiliations: School of Nursing, University of Minnesota, Minneapolis (Dr Westra); Health Informatics, Washington, DC (Dr Whittenburg); Health System Nursing and Duke Health Technology Systems, Duke University Health System, Durham, NC (Dr Lytle); Center for Healthcare Policy and Research, Sacramento, CA (Ms Tokareva); Division of Nursing Research, Department of Artificial Intelligence and Informatics, Mayo Clinic, Rochester, MN (Dr Umberfield); University of Minnesota, Minneapolis, MN (Ms Leverette); Clinical Data and Analytics, HCA Healthcare, Nashville, TN (Ms Buchleiter); University of Minnesota, Institute for Health Informatics, Minneapolis (Dr Johnson); and Association of Perioperative Registered Nurses, Denver, CO (Mr Jobman)
3
Lytle KS, Westra BL, Whittenburg L, Adams M, Akre M, Ali S, Furukawa M, Hartleben S, Hook M, Johnson SG, Settergren TT, Thibodeaux M. Information Models Offer Value to Standardize Electronic Health Record Flowsheet Data: A Fall Prevention Exemplar. J Nurs Scholarsh 2021;53:306-314. PMID: 33720514; DOI: 10.1111/jnu.12646.
Abstract
PURPOSE The rapid implementation of electronic health records (EHRs) resulted in a lack of data standardization and created considerable difficulty for secondary use of EHR documentation data within and between organizations. While EHRs contain documentation data (input), nurses and healthcare organizations rarely have usable documentation data (output). The purpose of this article is to describe a method of standardizing EHR flowsheet documentation data using information models (IMs) to support exchange, quality improvement, and big data research. As an exemplar, EHR flowsheet metadata (input) from multiple organizations was used to validate a fall prevention IM. DESIGN A consensus-based, qualitative, descriptive approach was used to identify a minimum set of essential fall prevention data concepts documented by staff nurses in acute care. The goal was to increase generalizable and comparable nurse-sensitive data on the prevention of falls across organizations for big data research. METHODS The research team conducted a retrospective, observational study using an iterative, consensus-based approach to map, analyze, and evaluate nursing flowsheet metadata contributed by eight health systems. The team used FloMap software to aggregate flowsheet data across organizations for mapping and comparison of data to a reference IM. The FloMap analysis was refined with input from staff nurse subject matter experts, review of published evidence, current documentation standards, Magnet Recognition nursing standards, and informal fall prevention nursing use cases. FINDINGS Flowsheet metadata analyzed from the EHR systems represented 6.6 million patients, 27 million encounters, and 683 million observations. Compared to the original reference IM, five new IM classes were added, concepts were reduced by 14 (from 57 to 43), and 157 value set items were added. The final fall prevention IM incorporated 11 condition- or age-specific fall risk screening tools and a fall event details class with 14 concepts. CONCLUSION The iterative, consensus-based refinement and validation of the fall prevention IM from actual EHR fall prevention flowsheet documentation contributes to the ability to semantically exchange and compare fall prevention data across multiple health systems and organizations. This method and approach provide a process for standardizing flowsheet data as coded data for information exchange and use in big data research. CLINICAL RELEVANCE Opportunities exist to work with EHR vendors and the Office of the National Coordinator for Health Information Technology to implement standardized IMs within EHRs to expand interoperability of nurse-sensitive data.
Affiliation(s)
- Kay S Lytle
  - Alpha Alpha & Beta Nu, Chief Nursing Information Officer, Duke University Health System, Durham, NC, USA
- Bonnie L Westra
  - Associate Professor Emerita, School of Nursing, University of Minnesota, Minneapolis, MN, USA
- Mischa Adams
  - Clinical Outcomes Improvement Director, Health Catalyst, Minneapolis, MN, USA
- Mari Akre
  - Sr. Population Health Strategy Executive, Cerner Corporation, Kansas City, MO, USA
- Samira Ali
  - Adjunct Faculty, School of Nursing, Grand Canyon University, Phoenix, AZ, USA
- Meg Furukawa
  - Nurse Informaticist, University of California Los Angeles Health, Los Angeles, CA, USA
- Stephanie Hartleben
  - Kappa Upsilon, Senior Principal Clinical Informatics, Elsevier Clinical Solutions, Bismarck, ND, USA
- Mary Hook
  - Eta Nu, Nursing Research Manager, Advocate Aurora Health, Milwaukee, WI, USA
- Steven G Johnson
  - Director, CTSI Clinical Informatics Services, Institute for Health Informatics, University of Minnesota, Minneapolis, MN, USA
4
Douthit BJ, Musser RC, Lytle KS, Richesson RL. A Closer Look at the "Right" Format for Clinical Decision Support: Methods for Evaluating a Storyboard BestPractice Advisory. J Pers Med 2020;10:142. PMID: 32977564; PMCID: PMC7712422; DOI: 10.3390/jpm10040142.
Abstract
(1) Background: The five rights of clinical decision support (CDS) are a well-known framework for planning the nuances of CDS, but recent advancements have given us more options to modify the format of the alert. One-size-fits-all assessments fail to capture the nuance of different BestPractice Advisory (BPA) formats. To demonstrate a tailored evaluation methodology, we assessed a BPA after implementation of Storyboard for changes in alert fatigue, behavior influence, and task completion; (2) Methods: Data from 19 weeks before and after implementation were used to evaluate differences in each domain. Individual clinics were evaluated for task completion and compared for changes pre- and post-redesign; (3) Results: The change in format was correlated with an increase in alert fatigue, a decrease in erroneous free text answers, and worsened task completion at a system level. At a local level, however, 14% of clinics had improved task completion; (4) Conclusions: While the change in BPA format was correlated with decreased performance, the changes may have been driven primarily by the COVID-19 pandemic. The framework and metrics proposed can be used in future studies to assess the impact of new CDS formats. Although the changes in this study seemed undesirable in aggregate, some positive changes were observed at the level of individual clinics. Personalized implementations of CDS tools based on local need should be considered.
Affiliation(s)
- Brian J. Douthit
  - School of Nursing, Duke University, Durham, NC 27710, USA
- R. Clayton Musser
  - School of Medicine, Duke University, Durham, NC 27710, USA
  - Duke Health, Duke School of Medicine, Durham, NC 27710, USA
- Kay S. Lytle
  - Duke Health, Duke School of Medicine, Durham, NC 27710, USA
- Rachel L. Richesson
  - School of Nursing, Duke University, Durham, NC 27710, USA
  - Department of Biostatistics and Bioinformatics, Duke University, Durham, NC 27710, USA
5
Woo M, Alhanti B, Lusk S, Dunston F, Blackwelder S, Lytle KS, Goldstein BA, Bedoya A. Evaluation of ML-Based Clinical Decision Support Tool to Replace an Existing Tool in an Academic Health System: Lessons Learned. J Pers Med 2020;10:E104. PMID: 32867023; PMCID: PMC7565401; DOI: 10.3390/jpm10030104.
Abstract
There is increasing application of machine learning tools to problems in healthcare, with an ultimate goal to improve patient safety and health outcomes. When applied appropriately, machine learning tools can augment clinical care provided to patients. However, even if a model has impressive performance characteristics, prospectively evaluating and effectively implementing models into clinical care remains difficult. The primary objective of this paper is to recount our experiences and challenges in comparing a novel machine learning-based clinical decision support tool to legacy, non-machine learning tools addressing potential safety events in the hospitals and to summarize the obstacles which prevented evaluation of clinical efficacy of tools prior to widespread institutional use. We collected and compared safety events data, specifically patient falls and pressure injuries, between the standard of care approach and machine learning (ML)-based clinical decision support (CDS). Our assessment was limited to performance of the model rather than the workflow due to challenges in directly comparing both approaches. We did note a modest improvement in falls with ML-based CDS; however, it was not possible to determine that overall improvement was due to model characteristics.
Affiliation(s)
- Myung Woo
  - Department of Medicine, Duke University School of Medicine, Durham, NC 27708, USA
- Brooke Alhanti
  - Duke Clinical Research Institute, Duke University School of Medicine, Durham, NC 27701, USA
- Sam Lusk
  - Duke Clinical Research Institute, Duke University School of Medicine, Durham, NC 27701, USA
- Felicia Dunston
  - Duke Health Technology Solutions, Duke University Health System, Durham, NC 27703, USA
- Stephen Blackwelder
  - Duke Health Technology Solutions, Duke University Health System, Durham, NC 27703, USA
  - Health Sector Management Program, Duke Fuqua School of Business, Durham, NC 27708, USA
- Kay S. Lytle
  - Health System Nursing and Duke Health Technology Solutions, Duke University Health System, Durham, NC 27710, USA
- Benjamin A. Goldstein
  - Duke Clinical Research Institute, Duke University School of Medicine, Durham, NC 27701, USA
  - Department of Biostatistics and Bioinformatics, Duke University School of Medicine, Durham, NC 27708, USA
- Armando Bedoya
  - Department of Medicine, Duke University School of Medicine, Durham, NC 27708, USA
6
Affiliation(s)
- Kay S Lytle
  - Kay S. Lytle is the chief nursing information officer at Duke University Health System in Durham, N.C.
7
Lytle KS, Bailey DW, Dorman KF, Moos MK. Just a beta.... Proc AMIA Symp 1999:580-4. PMID: 10566425; PMCID: PMC2232862.
Abstract
Traditional implementation of clinical information systems follows a predictable project management process. The selection, development, implementation, and evaluation of the system, and the project management aspects of those phases, require considerable time and effort. The purpose of this paper is to describe the beta-site implementation of a knowledge-based clinical information system in a specialty area of a southeastern hospital that followed a less-than-traditional approach to implementation. Highlighted are brief descriptions of the hospital's traditional process, the nontraditional process, and key findings from the experience. Preliminary analysis suggests that selection of an implementation process is contextual; selecting elements from each of these methods may provide a more useful process. The nontraditional process approached the elements of communication, areas of responsibility, training, follow-up, and leadership differently. These elements are common to both processes and provide a focal point for future research.
Affiliation(s)
- K S Lytle
  - University of North Carolina Hospitals, Chapel Hill, USA
8
Peter MA, Lytle KS, Swearengen P. Feedback to nurse managers about staff nurses' perceptions of their jobs. Semin Nurse Manag 1997;5:209-16. PMID: 9460480.
Abstract
This article describes a survey feedback intervention in which staff nurses were surveyed about various job characteristics, job satisfaction, and intent to remain in the organization. Nurse managers received the feedback through graphs and a workshop. A year later the same survey was conducted, and the results were compared with preintervention data. Of the 13 units surveyed, six showed significant improvement in one area, and one showed significant improvement in 11 areas. Nurse managers considered the survey feedback helpful, but the feedback alone was not sufficient to achieve broad changes within 1 year. However, such feedback is a useful component of continuous quality improvement efforts.
Affiliation(s)
- M A Peter
  - School of Nursing, University of North Carolina, Chapel Hill 27599-7460, USA