1
Fareed N, Olvera RG, Wang Y, Hayes M, Larimore EL, Balvanz P, Langley R, Noel CA, Rock P, Redmond D, Neufeld J, Kosakowski S, Harris D, LaRochelle M, Huerta TR, Glasgow L, Oga E, Villani J, Wu E. Lessons Learned From Developing Dashboards to Support Decision-Making for Community Opioid Response by Community Stakeholders: Mixed Methods and Multisite Study. JMIR Hum Factors 2024;11:e51525. PMID: 39250216; PMCID: PMC11420584; DOI: 10.2196/51525.
Abstract
BACKGROUND: Data dashboards are published tools that present visualizations; they are increasingly used to display data about behavioral health, social determinants of health, and chronic and infectious disease risks to inform or support public health endeavors. Dashboards can be an evidence-based approach used by communities to influence decision-making in health care for specific populations. Despite widespread use, evidence on how best to design and use dashboards in the public health realm is limited. There is also a notable dearth of studies that examine and document the complexity and heterogeneity of dashboards in community settings.

OBJECTIVE: Community stakeholders engaged in the community response to the opioid overdose crisis could benefit from the use of data dashboards for decision-making. As part of the Communities That HEAL (CTH) intervention, community data dashboards were created to support stakeholders' decision-making. We assessed stakeholders' perceptions of the usability and use of the CTH dashboards for decision-making.

METHODS: We conducted a mixed methods assessment of the use of CTH dashboards between June and July 2021. We administered the System Usability Scale (SUS) and conducted semistructured group interviews with users in 33 communities across 4 states of the United States. The SUS comprises 10 five-point Likert-scale questions measuring usability, each scored from 0 to 4. The interview guides were informed by the technology acceptance model (TAM) and focused on perceived usefulness, perceived ease of use, intention to use, and contextual factors.

RESULTS: Overall, 62 users of the CTH dashboards completed the SUS and interviews. SUS scores (grand mean 73, SD 4.6) indicated that the CTH dashboards were within the acceptable range for usability. From the qualitative interview data, we inductively created subthemes within the 4 dimensions of the TAM to contextualize stakeholders' perceptions of the dashboards' usefulness and ease of use, their intention to use, and contextual factors. These data also highlighted gaps in knowledge, design, and use, which could help focus efforts to improve the use and comprehension of dashboards by stakeholders.

CONCLUSIONS: We present a set of prioritized gaps identified by our national group and a list of lessons learned for improved data dashboard design and use for community stakeholders. Findings from our novel application of both the SUS and the TAM provide insights and highlight important gaps and lessons learned to inform the design of data dashboards for use by decision-making community stakeholders.

TRIAL REGISTRATION: ClinicalTrials.gov NCT04111939; https://clinicaltrials.gov/study/NCT04111939
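For context on the SUS scores reported above: each of the 10 items is answered on a 1-5 scale and converted to a 0-4 contribution (in the standard scheme, positively worded odd items score response minus 1 and negatively worded even items score 5 minus response), and the summed contributions are multiplied by 2.5 to give a 0-100 score. A minimal illustrative sketch of this standard computation (not code from the study):

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from ten 1-5 Likert responses.

    Odd-numbered items are positively worded (contribution = response - 1);
    even-numbered items are negatively worded (contribution = 5 - response).
    The sum of the ten 0-4 contributions is scaled by 2.5 to the 0-100 range.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# A neutral respondent (all 3s) lands at the midpoint score of 50.
print(sus_score([3] * 10))  # 50.0
```

On this scale, the grand mean of 73 reported above sits above the commonly cited acceptability threshold of 68.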
Affiliation(s)
- Naleef Fareed: Department of Biomedical Informatics, College of Medicine, The Ohio State University, Columbus, OH, United States
- Ramona G Olvera: Center for the Advancement of Team Science, Analytics, and Systems Thinking, College of Medicine, The Ohio State University, Columbus, OH, United States
- Yiting Wang: Department of Research Information Technology, College of Medicine, The Ohio State University, Columbus, OH, United States
- Michael Hayes: Research Triangle Institute, Research Triangle Park, NC, United States
- Elizabeth Liz Larimore: Center for Drug and Alcohol Research, University of Kentucky, Lexington, KY, United States
- Peter Balvanz: Clinical Addiction Research and Evaluation Unit, Section of General Internal Medicine, Boston Medical Center, Boston, MA, United States
- Ronald Langley: Center for Drug and Alcohol Research, University of Kentucky, Lexington, KY, United States
- Corinna A Noel: Department of Public and Ecosystem Health, Cornell University, Ithaca, NY, United States
- Peter Rock: Center for Drug and Alcohol Research, University of Kentucky, Lexington, KY, United States
- Daniel Redmond: Institute for Biomedical Informatics, University of Kentucky, Lexington, KY, United States
- Jessica Neufeld: Social Intervention Group, School of Social Work, Columbia University, New York, NY, United States
- Sarah Kosakowski: Clinical Addiction Research and Evaluation Unit, Section of General Internal Medicine, Boston Medical Center, Boston, MA, United States
- Daniel Harris: Institute for Pharmaceutical Outcomes and Policy, University of Kentucky, Lexington, KY, United States
- Marc LaRochelle: Clinical Addiction Research and Evaluation Unit, Section of General Internal Medicine, Boston Medical Center, Boston, MA, United States
- Timothy R Huerta: Center for the Advancement of Team Science, Analytics, and Systems Thinking, College of Medicine, The Ohio State University, Columbus, OH, United States; Department of Research Information Technology, College of Medicine, The Ohio State University, Columbus, OH, United States
- LaShawn Glasgow: Research Triangle Institute, Research Triangle Park, NC, United States
- Emmanuel Oga: Department of Research Information Technology, College of Medicine, The Ohio State University, Columbus, OH, United States
- Elwin Wu: Social Intervention Group, School of Social Work, Columbia University, New York, NY, United States
2
Fakhoury H, Trochez R, Kripalani S, Choma N, Blessinger E, Nelson LA. Patient engagement with an automated postdischarge text messaging program for improving care transitions. J Hosp Med 2024;19:513-517. PMID: 38497416; DOI: 10.1002/jhm.13334.
Abstract
Automated text messaging is a promising approach to monitor patients after hospital discharge and avert readmissions; however, it is not known to what extent patients would engage with this type of program and whether engagement may vary based on patients' characteristics. Using data from a 30-day postdischarge texting program at a large university hospital, we examined engagement over time (operationalized as response rate to text messages) and patient characteristics associated with engagement. Of the 1324 patients in the study sample, 838 (63%) stayed in the program for the full duration. Among those retained, the median response rate was 33% (interquartile range: 11%-77%) and decreased over time. Patients who were male (p < .05), were Black/African American (p < .001), had lower health literacy (p < .01), or had not recently logged into the patient portal (p < .001), all had lower response rates. Results support closer examinations of patient engagement in hospital-based texting programs and who is positioned to benefit.
Affiliation(s)
- Hassan Fakhoury: Department of Medicine, Vanderbilt University Medical Center, Nashville, Tennessee, USA
- Ricardo Trochez: Department of Medicine, Vanderbilt University Medical Center, Nashville, Tennessee, USA; Center for Health Services Research, Vanderbilt University Medical Center, Nashville, Tennessee, USA
- Sunil Kripalani: Department of Medicine, Vanderbilt University Medical Center, Nashville, Tennessee, USA; Center for Health Services Research, Vanderbilt University Medical Center, Nashville, Tennessee, USA
- Neesha Choma: Department of Quality, Safety, and Risk Prevention, Vanderbilt University Medical Center, Nashville, Tennessee, USA
- Emily Blessinger: Vanderbilt Discharge Care Center, Vanderbilt University Medical Center, Nashville, Tennessee, USA
- Lyndsay A Nelson: Department of Medicine, Vanderbilt University Medical Center, Nashville, Tennessee, USA; Center for Health Services Research, Vanderbilt University Medical Center, Nashville, Tennessee, USA
3
Nelson LA, Spieker AJ, LeStourgeon LM, Greevy Jr RA, Molli S, Roddy MK, Mayberry LS. The Goldilocks Dilemma on Balancing User Response and Reflection in mHealth Interventions: Observational Study. JMIR Mhealth Uhealth 2024;12:e47632. PMID: 38297891; PMCID: PMC10850735; DOI: 10.2196/47632.
Abstract
Background: Mobile health (mHealth) has the potential to radically improve health behaviors and quality of life; however, there are still key gaps in understanding how to optimize mHealth engagement. Most engagement research reports only on system use without consideration of whether the user is reflecting on the content cognitively. Although interactions with mHealth are critical, cognitive investment may also be important for meaningful behavior change. Notably, content that is designed to request too much reflection could result in users' disengagement. Understanding how to strike the balance between response burden and reflection burden has critical implications for achieving effective engagement to impact intended outcomes.

Objective: In this observational study, we sought to understand the interplay between response burden and reflection burden and how they impact mHealth engagement. Specifically, we explored how varying the response and reflection burdens of mHealth content would impact users' text message response rates in an mHealth intervention.

Methods: We recruited support persons of people with diabetes for a randomized controlled trial that evaluated an mHealth intervention for diabetes management. Support person participants assigned to the intervention (n=148) completed a survey and received text messages for 9 months. During the 2-year randomized controlled trial, we sent 4 versions of a weekly two-way text message that varied in both reflection burden (level of cognitive reflection requested relative to that of other messages) and response burden (level of information requested for the response relative to that of other messages). We quantified engagement by using participant-level response rates. We compared the odds of responding to each text and used Poisson regression to estimate associations between participant characteristics and response rates.

Results: The texts requesting the most reflection had the lowest response rates regardless of response burden (high reflection and low response burdens: median 10%, IQR 0%-40%; high reflection and high response burdens: median 23%, IQR 0%-51%). The response rate was highest for the text requesting the least reflection (low reflection and low response burdens: median 90%, IQR 61%-100%) yet still relatively high for the text requesting medium reflection (medium reflection and low response burdens: median 75%, IQR 38%-96%). Lower odds of responding were associated with higher reflection burden (P<.001). Younger participants and participants who had a lower socioeconomic status had lower response rates to texts with more reflection burden, relative to those of their counterparts (all P values were <.05).

Conclusions: As reflection burden increased, engagement decreased, and we found more disparities in engagement across participants' characteristics. Content encouraging moderate levels of reflection may be ideal for achieving both cognitive investment and system use. Our findings provide insights into mHealth design and the optimization of both engagement and effectiveness.
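The engagement metric used here, a participant-level response rate summarized by its median and IQR across participants, can be illustrated with a small sketch. The data below are hypothetical, not the study's:

```python
import statistics

def response_rate(n_responses, n_prompts):
    """Fraction of two-way text prompts a participant answered."""
    return n_responses / n_prompts if n_prompts else 0.0

def summarize_rates(rates):
    """Median and interquartile range (Q1, Q3) of participant-level response rates.

    statistics.quantiles(n=4) returns the three quartile cut points.
    """
    q1, median, q3 = statistics.quantiles(rates, n=4)
    return median, (q1, q3)

# Hypothetical: 9 participants, each sent 10 prompts of one message variant.
responses = [1, 2, 3, 4, 7, 8, 9, 9, 10]
rates = [response_rate(r, 10) for r in responses]
median, (q1, q3) = summarize_rates(rates)
```

Reporting the median with IQR, as the abstract does, is robust to the heavily skewed rates that arise when many participants respond rarely or always.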
Affiliation(s)
- Lyndsay A Nelson: Department of Medicine, Vanderbilt University Medical Center, Nashville, TN, United States; Center for Health Behavior and Health Education, Vanderbilt University Medical Center, Nashville, TN, United States
- Andrew J Spieker: Department of Biostatistics, Vanderbilt University Medical Center, Nashville, TN, United States
- Lauren M LeStourgeon: Department of Medicine, Vanderbilt University Medical Center, Nashville, TN, United States; Center for Health Behavior and Health Education, Vanderbilt University Medical Center, Nashville, TN, United States
- Robert A Greevy Jr: Department of Biostatistics, Vanderbilt University Medical Center, Nashville, TN, United States
- Samuel Molli: Department of Medicine, Vanderbilt University Medical Center, Nashville, TN, United States; Center for Health Behavior and Health Education, Vanderbilt University Medical Center, Nashville, TN, United States
- McKenzie K Roddy: Department of Medicine, Vanderbilt University Medical Center, Nashville, TN, United States; Center for Health Behavior and Health Education, Vanderbilt University Medical Center, Nashville, TN, United States
- Lindsay S Mayberry: Department of Medicine, Vanderbilt University Medical Center, Nashville, TN, United States; Center for Health Behavior and Health Education, Vanderbilt University Medical Center, Nashville, TN, United States; Department of Biomedical Informatics, Vanderbilt University Medical Center, Nashville, TN, United States