1
Hoyo V, Nehl E, Dozier A, Harvey J, Kane C, Perry A, Samuels E, Schmidt S, Hunt J. A landscape assessment of CTSA evaluators and their work in the CTSA consortium, 2021 survey findings. J Clin Transl Sci 2024; 8:e79. PMID: 38745877; PMCID: PMC11091919; DOI: 10.1017/cts.2024.526.
Abstract
This article presents a landscape assessment of the findings from the 2021 Clinical and Translational Science Award (CTSA) Evaluators Survey. This survey was the most recent iteration of a well-established, national, peer-led, systematic snapshot of CTSA evaluators, their skillsets, listed evaluation resources, preferred methods, and identified best practices. Three questions guided our study: who are the CTSA evaluators, what competencies do they share, and how is their work used within hubs? We describe our survey process (logistics of development, deployment, and differences in historical context with prior instruments) and present its main findings. We provide specific recommendations for evaluation practice in two main categories (national vs. group-level), including, among others, the need for a national strategic plan for evaluation as well as enhanced mentoring and training of the next generation of evaluators. Although based on the challenges and opportunities currently within the CTSA Consortium, takeaways from this study constitute important lessons with potential for application in other large evaluation consortia. To our knowledge, this is the first time the 2021 survey findings are disseminated widely, to increase transparency of the CTSA evaluators' work and to motivate conversations within hubs and beyond as to how best to leverage existing evaluative capacity.
Affiliation(s)
- Verónica Hoyo
- Northwestern University Clinical and Translational Sciences Institute (NUCATS), Northwestern University, Chicago, IL, USA
- Eric Nehl
- Georgia CTSA, Emory University, Atlanta, GA, USA
- Ann Dozier
- University of Rochester Clinical and Translational Science Institute, Rochester, NY, USA
- Jillian Harvey
- MUSC South Carolina Clinical and Translational Science Institute, Charleston, SC, USA
- Cathleen Kane
- NYU Langone Health Clinical and Translational Science Institute, New York, NY, USA
- Anna Perry
- Wake Forest Clinical and Translational Science Institute, Winston-Salem, NC, USA
- Elias Samuels
- Michigan Institute for Clinical and Health Research (MICHR), University of Michigan, Ann Arbor, MI, USA
- Susanne Schmidt
- UT Health San Antonio, Institute for Integration of Medicine and Science, San Antonio, TX, USA
- Joe Hunt
- Indiana Clinical and Translational Science Institute, Indianapolis, IN, USA
2
Hall ES, Melton GB, Payne PRO, Dorr DA, Vawdrey DK. How are leading research institutions engaging with data sharing tools and programs? AMIA Annu Symp Proc 2024; 2023:397-406. PMID: 38222386; PMCID: PMC10785902.
Abstract
With widespread electronic health record (EHR) adoption and improvements in health information interoperability in the United States, troves of data are available for knowledge discovery. Several data sharing programs and tools have been developed to support research activities, including efforts funded by the National Institutes of Health (NIH), EHR vendors, and other public- and private-sector entities. We surveyed 65 leading research institutions (77% response rate) about their use of and value derived from ten programs/tools, including NIH's Accrual to Clinical Trials, Epic Corporation's Cosmos, and the Observational Health Data Sciences and Informatics consortium. Most institutions participated in multiple programs/tools but reported relatively low usage (even when they participated, they frequently indicated that fewer than one individual/month benefitted from the platform to support research activities). Our findings suggest that investments in research data sharing have not yet achieved desired results.
3
Carter-Edwards L, Leverty R, Bilheimer A, Bailey L, Adeshina B, Shrestha P, Yu Z, Dave G, Cohen-Wolkowiez M, Kibbe W, Corbie G. Adapting the Evidence Academy model for virtual stakeholder engagement in a national setting during the COVID-19 pandemic. J Clin Transl Sci 2023; 7:e98. PMID: 37250998; PMCID: PMC10225263; DOI: 10.1017/cts.2023.37.
Abstract
The COVID-19 pandemic underscored the importance of adaptive capacity and preparedness when engaging historically marginalized populations in research and practice. The Rapid Acceleration of Diagnostics in Underserved Populations' COVID-19 Equity Evidence Academy Series (RADx-UP EA) is a virtual, national, interactive conference model designed to support and engage community-academic partnerships in a collaborative effort to improve practices that overcome disparities in SARS-CoV-2 testing and testing technologies. The RADx-UP EA promotes information sharing, critical reflection and discussion, and the creation of translatable strategies for health equity. Staff and faculty from the RADx-UP Coordination and Data Collection Center developed three EA events with diverse geographic, racial, and ethnic representation of attendees from RADx-UP community-academic project teams: February 2021 (n = 319), November 2021 (n = 242), and September 2022 (n = 254). Each EA event included a data profile; a two-day, virtual event; an event summary report; a community dissemination product; and an evaluation strategy. Operational and translational delivery processes were iteratively adapted for each EA across one or more of five adaptive capacity domains: assets, knowledge and learning, social organization, flexibility, and innovation. The RADx-UP EA model can be generalized beyond RADx-UP and tailored with community and academic input to respond to local or national health emergencies.
Affiliation(s)
- Renee Leverty
- Duke Clinical Research Institute, Duke University, Durham, NC, USA
- Alicia Bilheimer
- University of North Carolina at Chapel Hill, Chapel Hill, NC, USA
- Lindsay Bailey
- University of North Carolina at Chapel Hill, Chapel Hill, NC, USA
- Bukola Adeshina
- University of North Carolina at Chapel Hill, Chapel Hill, NC, USA
- Joan C. Edwards School of Medicine, Marshall University, Huntington, WV, USA
- Zhitong Yu
- University of North Carolina at Chapel Hill, Chapel Hill, NC, USA
- Gaurav Dave
- University of North Carolina at Chapel Hill, Chapel Hill, NC, USA
- Warren Kibbe
- Duke Clinical Research Institute, Duke University, Durham, NC, USA
- Giselle Corbie
- University of North Carolina at Chapel Hill, Chapel Hill, NC, USA