1
Edwards P, Perkins C. Response is increased using postal rather than electronic questionnaires - new results from an updated Cochrane Systematic Review. BMC Med Res Methodol 2024;24:209. PMID: 39285263; PMCID: PMC11403848; DOI: 10.1186/s12874-024-02332-0.
Abstract
BACKGROUND
A decade ago, paper questionnaires were more common in epidemiology than those administered online, but increasing Internet access may have changed this. Researchers planning to use a self-administered questionnaire should know whether response rates to questionnaires administered electronically differ from those of questionnaires administered by post. We analysed trials included in a recently updated Cochrane Review to answer this question.
METHODS
We exported data from randomised controlled trials included in three comparisons in the Cochrane Review that had evaluated hypotheses relevant to our research objective and imported them into Stata for a series of meta-analyses not conducted in the Cochrane Review. We pooled odds ratios for response using random-effects meta-analyses. We explored causes of heterogeneity among study results using subgroups. We assessed evidence for reporting bias using Harbord's modified test for small-study effects.
RESULTS
Twenty-seven trials (66,118 participants) evaluated the effect on response of an electronic questionnaire compared with a postal questionnaire. Results were heterogeneous (I² = 98%). There was evidence for biased (greater) effect estimates in studies at high risk of bias. A synthesis of studies at low risk of bias indicates that response was increased (OR = 1.43; 95% CI 1.08-1.89) using postal questionnaires. Ten trials (39,523 participants) evaluated the effect of providing a choice of mode (postal or electronic) compared to an electronic questionnaire only. Response was increased with a choice of mode (OR = 1.63; 95% CI 1.18-2.26). Eight trials (20,909 participants) evaluated the effect of a choice of mode (electronic or postal) compared to a postal questionnaire only. There was no evidence for an effect on response of a choice of mode compared with postal only (OR = 0.94; 95% CI 0.86-1.02).
CONCLUSIONS
Postal questionnaires should be used in preference to, or offered in addition to, electronic modes.
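The pooling described in the METHODS (random-effects meta-analyses of response odds ratios, with I² quantifying heterogeneity) can be illustrated with a standard DerSimonian-Laird calculation. A minimal sketch follows; the 2x2 trial counts are hypothetical stand-ins for per-trial response data, and the choice of the DerSimonian-Laird estimator is an assumption rather than a detail reported in the abstract.

```python
# DerSimonian-Laird random-effects meta-analysis of odds ratios (sketch).
# The trial counts below are hypothetical placeholders.
import numpy as np
from scipy import stats

# columns: responders_postal, nonresponders_postal, responders_electronic, nonresponders_electronic
trials = np.array([
    [120,  80, 100, 100],
    [300, 200, 260, 240],
    [ 45,  55,  40,  60],
], dtype=float)

a, b, c, d = trials.T
log_or = np.log((a * d) / (b * c))      # per-trial log odds ratio (postal vs electronic)
var = 1/a + 1/b + 1/c + 1/d             # approximate variance of the log OR

w = 1 / var                             # inverse-variance (fixed-effect) weights
fixed = np.sum(w * log_or) / np.sum(w)
Q = np.sum(w * (log_or - fixed) ** 2)   # Cochran's Q
k = len(log_or)
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
i2 = max(0.0, (Q - (k - 1)) / Q) * 100  # I-squared heterogeneity statistic

w_re = 1 / (var + tau2)                 # random-effects weights
pooled = np.sum(w_re * log_or) / np.sum(w_re)
se = np.sqrt(1 / np.sum(w_re))
lo, hi = np.exp(pooled + np.array([-1, 1]) * stats.norm.ppf(0.975) * se)

print(f"OR = {np.exp(pooled):.2f}, 95% CI {lo:.2f}-{hi:.2f}, I2 = {i2:.0f}%")
```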
Affiliation(s)
- Phil Edwards
- Data Collection Methodology Group, Department of Population Health, Faculty of Epidemiology and Population Health, London School of Hygiene & Tropical Medicine, Keppel Street, London, WC1E 7HT, UK.
- Chloe Perkins
- Data Collection Methodology Group, Department of Population Health, Faculty of Epidemiology and Population Health, London School of Hygiene & Tropical Medicine, Keppel Street, London, WC1E 7HT, UK
2
Edwards PJ, Roberts I, Clarke MJ, DiGuiseppi C, Woolf B, Perkins C. Methods to increase response to postal and electronic questionnaires. Cochrane Database Syst Rev 2023;11:MR000008. PMID: 38032037; PMCID: PMC10687884; DOI: 10.1002/14651858.mr000008.pub5.
Abstract
BACKGROUND
Self-administered questionnaires are widely used to collect data in epidemiological research, but non-response reduces the effective sample size and can introduce bias. Finding ways to increase response to postal and electronic questionnaires would improve the quality of epidemiological research.
OBJECTIVES
To identify effective strategies to increase response to postal and electronic questionnaires.
SEARCH METHODS
We searched 14 electronic databases up to December 2021 and manually searched the reference lists of relevant trials and reviews. We contacted the authors of all trials or reviews to ask about unpublished trials; where necessary, we also contacted authors to confirm the methods of allocation used and to clarify results presented.
SELECTION CRITERIA
Randomised trials of methods to increase response to postal or electronic questionnaires. We assessed the eligibility of each trial using pre-defined criteria.
DATA COLLECTION AND ANALYSIS
We extracted data on the trial participants, the intervention, the number randomised to intervention and comparison groups and allocation concealment. For each strategy, we estimated pooled odds ratios (OR) and 95% confidence intervals (CI) in a random-effects model. We assessed evidence for selection bias using Egger's weighted regression method and Begg's rank correlation test and funnel plot. We assessed heterogeneity amongst trial odds ratios using a Chi² test and quantified the degree of inconsistency between trial results using the I² statistic.
MAIN RESULTS
Postal: We found 670 eligible trials that evaluated over 100 different strategies of increasing response to postal questionnaires. We found substantial heterogeneity amongst trial results in half of the strategies. The odds of response almost doubled when: using monetary incentives (odds ratio (OR) 1.86; 95% confidence interval (CI) 1.73 to 1.99; heterogeneity I² = 85%); using a telephone reminder (OR 1.96; 95% CI 1.03 to 3.74); and when clinical outcome questions were placed last (OR 2.05; 95% CI 1.00 to 4.24). The odds of response increased by about half when: using a shorter questionnaire (OR 1.58; 95% CI 1.40 to 1.78); contacting participants before sending questionnaires (OR 1.36; 95% CI 1.23 to 1.51; I² = 87%); incentives were given with questionnaires (i.e. unconditional) rather than when given only after participants had returned their questionnaire (i.e. conditional on response) (OR 1.53; 95% CI 1.35 to 1.74); using personalised SMS reminders (OR 1.53; 95% CI 0.97 to 2.42); using a special (recorded) delivery service (OR 1.68; 95% CI 1.36 to 2.08; I² = 87%); using electronic reminders (OR 1.60; 95% CI 1.10 to 2.33); using intensive follow-up (OR 1.69; 95% CI 0.93 to 3.06); using a more interesting/salient questionnaire (OR 1.73; 95% CI 1.12 to 2.66); and when mentioning an obligation to respond (OR 1.61; 95% CI 1.16 to 2.22). The odds of response also increased with: non-monetary incentives (OR 1.16; 95% CI 1.11 to 1.21; I² = 80%); a larger monetary incentive (OR 1.24; 95% CI 1.15 to 1.33); a larger non-monetary incentive (OR 1.15; 95% CI 1.00 to 1.33); when a pen was included (OR 1.44; 95% CI 1.38 to 1.50); using personalised materials (OR 1.15; 95% CI 1.09 to 1.21; I² = 57%); using a single-sided rather than a double-sided questionnaire (OR 1.13; 95% CI 1.02 to 1.25); using stamped return envelopes rather than franked return envelopes (OR 1.23; 95% CI 1.13 to 1.33; I² = 69%); assuring confidentiality (OR 1.33; 95% CI 1.24 to 1.42); using first-class outward mailing (OR 1.11; 95% CI 1.02 to 1.21); and when questionnaires originated from a university (OR 1.32; 95% CI 1.13 to 1.54). The odds of response were reduced when the questionnaire included questions of a sensitive nature (OR 0.94; 95% CI 0.88 to 1.00).
Electronic: We found 88 eligible trials that evaluated over 30 different ways of increasing response to electronic questionnaires. We found substantial heterogeneity amongst trial results in half of the strategies. The odds of response tripled when: using a brief letter rather than a detailed letter (OR 3.26; 95% CI 1.79 to 5.94); and when a picture was included in an email (OR 3.05; 95% CI 1.84 to 5.06; I² = 19%). The odds of response almost doubled when: using monetary incentives (OR 1.88; 95% CI 1.31 to 2.71; I² = 79%); and using a more interesting topic (OR 1.85; 95% CI 1.52 to 2.26). The odds of response increased by half when: using non-monetary incentives (OR 1.60; 95% CI 1.25 to 2.05); using shorter e-questionnaires (OR 1.51; 95% CI 1.06 to 2.16; I² = 94%); and using a more interesting e-questionnaire (OR 1.85; 95% CI 1.52 to 2.26). The odds of response increased by a third when: offering survey results as an incentive (OR 1.36; 95% CI 1.16 to 1.59); using a white background (OR 1.31; 95% CI 1.10 to 1.56); and when stressing the benefits to society of response (OR 1.38; 95% CI 1.07 to 1.78; I² = 41%). The odds of response also increased with: personalised e-questionnaires (OR 1.24; 95% CI 1.17 to 1.32; I² = 41%); using a simple header (OR 1.23; 95% CI 1.03 to 1.48); giving a deadline (OR 1.18; 95% CI 1.03 to 1.34); and by giving a longer time estimate for completion (OR 1.25; 95% CI 0.96 to 1.64). The odds of response were reduced when: "Survey" was mentioned in the e-mail subject (OR 0.81; 95% CI 0.67 to 0.97); when the email or the e-questionnaire was from a male investigator, or it included a male signature (OR 0.55; 95% CI 0.38 to 0.80); and by using university sponsorship (OR 0.84; 95% CI 0.69 to 1.01). The odds of response using a postal questionnaire were over twice those using an e-questionnaire (OR 2.33; 95% CI 2.25 to 2.42; I² = 98%). Response also increased when: providing a choice of response mode (electronic or postal) rather than electronic only (OR 1.76; 95% CI 1.67 to 1.85; I² = 97%); and when administering the e-questionnaire by computer rather than by smartphone (OR 1.62; 95% CI 1.36 to 1.94).
AUTHORS' CONCLUSIONS
Researchers using postal and electronic questionnaires can increase response using the strategies shown to be effective in this Cochrane review.
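Among the reporting-bias checks named in DATA COLLECTION AND ANALYSIS is Egger's weighted regression test for small-study effects. A minimal sketch of that test is shown below; the per-trial log odds ratios and standard errors are hypothetical placeholders.

```python
# Egger's regression test for small-study effects (sketch).
# Regress the standardized effect on precision; a non-zero intercept
# suggests funnel-plot asymmetry. Inputs are hypothetical.
import numpy as np
import statsmodels.api as sm

log_or = np.array([0.62, 0.35, 0.90, 0.15, 0.48])  # per-trial log odds ratios
se = np.array([0.30, 0.18, 0.45, 0.10, 0.25])      # their standard errors

z = log_or / se                  # standardized effects
precision = 1 / se
X = sm.add_constant(precision)   # intercept + precision
fit = sm.OLS(z, X).fit()

print("intercept:", fit.params[0], "p-value:", fit.pvalues[0])
```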
Affiliation(s)
- Philip James Edwards
- Faculty of Epidemiology and Population Health, London School of Hygiene & Tropical Medicine, London, UK
- Ian Roberts
- Faculty of Epidemiology and Population Health, London School of Hygiene & Tropical Medicine, London, UK
- Mike J Clarke
- Centre for Public Health, Queen's University Belfast, Belfast, UK
- Carolyn DiGuiseppi
- Colorado School of Public Health, University of Colorado Anschutz Medical Campus, Aurora, CO, USA
- Benjamin Woolf
- School of Psychological Science, University of Bristol, Bristol, UK
3
Williams JA, Vriniotis MG, Gundersen DA, Boden LI, Collins JE, Katz JN, Wagner GR, Sorensen G. How to ask: Surveying nursing directors of nursing homes. Health Sci Rep 2021;4:e304. PMID: 34136659; PMCID: PMC8177897; DOI: 10.1002/hsr2.304.
Abstract
BACKGROUND AND AIMS
Nursing home research may involve eliciting information from managers, yet response rates for Directors of Nursing have not been recently studied. As part of a more extensive study, we surveyed all nursing homes in three states in 2018 and 2019, updating the evidence on how to survey these leaders effectively. We focus on response rates as a measure of non-response error, and on comparison of nursing homes' characteristics with their population values as a measure of representation error.
METHODS
We surveyed Directors of Nursing (DONs) or their designees in nursing homes serving adult residents with at least 30 beds in California, Massachusetts, and Ohio (N = 2389). We collected contact information for respondents and then emailed survey invitations and links, followed by three email reminders and a paper version. Nursing home associations in two of the states contacted their members on our behalf. We compared the response rates across waves and states. We also compared the characteristics of nursing homes based on whether the response was via email or paper. In a multivariable logit regression with adjustment for multiple comparisons, we used characteristics of the survey and of the nursing homes to predict whether a DON responded.
RESULTS
The response rate was higher for the first wave than for the second (30% vs 20.5%). The highest response rate was in Massachusetts (31.8%), followed by Ohio (25.8%) and California (19.5%). Nursing home characteristics did not vary by response mode. Additionally, we did not find any statistically significant predictors of whether a nursing home responded.
CONCLUSION
A single-mode survey may provide a reasonably representative sample at the cost of sample size; switching modes, however, can increase the sample size without necessarily biasing the sample.
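The multivariable logit described in METHODS (predicting whether a DON responded, with adjustment for multiple comparisons) might be sketched as follows. The variable names, simulated data, and the choice of a Holm correction are assumptions for illustration; the paper does not state which adjustment was used.

```python
# Sketch: logistic regression predicting survey response, with Holm-adjusted
# p-values across the predictors. All data and predictors are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "responded": rng.integers(0, 2, n),   # 1 = DON (or designee) returned the survey
    "beds": rng.integers(30, 300, n),
    "for_profit": rng.integers(0, 2, n),
    "state_MA": rng.integers(0, 2, n),
})

X = sm.add_constant(df[["beds", "for_profit", "state_MA"]])
fit = sm.Logit(df["responded"], X).fit(disp=0)

adj_p = multipletests(fit.pvalues[1:], method="holm")[1]  # exclude the intercept
print(fit.params)
print("Holm-adjusted p-values:", adj_p)
```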
Affiliation(s)
- Mary G. Vriniotis
- Center for Community-Based Research, Dana-Farber Cancer Institute, Boston, Massachusetts
- Social and Behavioral Sciences, Harvard T.H. Chan School of Public Health, Boston, Massachusetts
- Daniel A. Gundersen
- Survey and Qualitative Methods Core, Dana-Farber Cancer Institute, Boston, Massachusetts
- Leslie I. Boden
- Environmental Health, Boston University School of Public Health, Boston, Massachusetts
- Jamie E. Collins
- Orthopedic Surgery, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts
- Jeffrey N. Katz
- Orthopedic Surgery, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts
- Medicine, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts
- Epidemiology, Harvard T.H. Chan School of Public Health, Boston, Massachusetts
- Gregory R. Wagner
- Environmental Health, Harvard T.H. Chan School of Public Health, Boston, Massachusetts
- Glorian Sorensen
- Center for Community-Based Research, Dana-Farber Cancer Institute, Boston, Massachusetts
- Social and Behavioral Sciences, Harvard T.H. Chan School of Public Health, Boston, Massachusetts
4
Schwartz ML, Lima JC, Clark MA, Miller SC. End-of-Life Culture Change Practices in U.S. Nursing Homes in 2016/2017. J Pain Symptom Manage 2019;57:525-534. PMID: 30578935; PMCID: PMC6668722; DOI: 10.1016/j.jpainsymman.2018.12.330.
Abstract
CONTEXT
The nursing home (NH) culture change (CC) movement, which emphasizes person-centered care, is particularly relevant to meeting the unique needs of residents near the end of life.
OBJECTIVES
We aimed to evaluate the NH-reported adoption of person-centered end-of-life culture change (EOL-CC) practices and identify NH characteristics associated with greater adoption.
METHODS
We used NH and state policy data for 1358 NHs completing a nationally representative 2016/17 NH Culture Change Survey. An 18-point EOL-CC score was created by summarizing responses from six survey items related to practices for residents who were dying/had died. NHs were divided into quartiles reflecting their EOL-CC score, and multivariable ordered logistic regression was used to identify NH characteristics associated with having higher (quartile) scores.
RESULTS
The mean EOL-CC score was 13.7 (SD = 3.0). Correlates of higher scores differed from those previously found for non-EOL-CC practices. Higher NH leadership scores and nonprofit status were consistently associated with higher EOL-CC scores. For example, a three-point leadership score increase was associated with higher odds of an NH performing in the top EOL-CC quartile (odds ratio [OR] = 2.0, 95% CI: 1.82-2.30), whereas for-profit status was associated with lower odds (OR = 0.7, 95% CI: 0.49-0.90). The availability of palliative care consults was associated with a greater likelihood of EOL-CC scores above the median (OR = 1.5, 95% CI: 1.10-1.93), but not in the top or bottom quartile.
CONCLUSION
NH-reported adoption of EOL-CC practices varies, and the presence of palliative care consults in NHs explains only some of this variation. Findings support the importance of evaluating EOL-CC practices separately from other culture change practices.
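The ordered logistic regression of EOL-CC quartiles on NH characteristics can be sketched with statsmodels' proportional-odds OrderedModel (available in statsmodels 0.13+). The simulated data, variable names, and the proportional-odds form are assumptions for illustration.

```python
# Sketch: ordered (proportional-odds) logit of EOL-CC score quartile on NH
# characteristics. Data are simulated; variable names are hypothetical.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "leadership": rng.uniform(0, 12, n),          # leadership score
    "for_profit": rng.integers(0, 2, n),
    "palliative_consults": rng.integers(0, 2, n),
})
score = df["leadership"] + rng.normal(0, 3, n)    # stand-in for the 18-point EOL-CC score
df["eolcc_quartile"] = pd.qcut(score, 4, labels=False).astype(
    pd.CategoricalDtype([0, 1, 2, 3], ordered=True))

model = OrderedModel(df["eolcc_quartile"],
                     df[["leadership", "for_profit", "palliative_consults"]],
                     distr="logit")
res = model.fit(method="bfgs", disp=0)
print(np.exp(res.params[["leadership", "for_profit", "palliative_consults"]]))  # odds ratios
```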
Affiliation(s)
- Margot L Schwartz
- Brown University School of Public Health, Providence, Rhode Island, USA.
- Julie C Lima
- Brown University School of Public Health, Providence, Rhode Island, USA
- Melissa A Clark
- University of Massachusetts Medical School, Worcester, Massachusetts, USA
- Susan C Miller
- Brown University School of Public Health, Providence, Rhode Island, USA
5
Watkins V, Nagle C, Kent B, Hutchinson AM. Labouring Together: collaborative alliances in maternity care in Victoria, Australia - protocol of a mixed-methods study. BMJ Open 2017;7:e014262. PMID: 28270390; PMCID: PMC5353350; DOI: 10.1136/bmjopen-2016-014262.
Abstract
INTRODUCTION
For over a decade, enquiries into adverse perinatal outcomes have led to reports that poor collaboration has been detrimental to the safety and experience of maternity care. Despite efforts to improve collaboration, investigations into maternity care at Morecambe Bay (UK) and Djerriwarrh Health Services (Australia) have revealed that poor collaboration and decision-making remain a threat to perinatal safety. The Labouring Together study will investigate how elements hypothesised to influence the effectiveness of collaboration are reflected in perceptions and experiences of clinicians and childbearing women in Victoria, Australia. The study will explore conditions that assist clinicians and women to work collaboratively to support positive maternity outcomes. Results of the study will provide a platform for consumers, clinician groups, organisations and policymakers to work together to improve the quality, safety and experience of maternity care.
METHODS AND ANALYSIS
Four case study sites have been selected to represent a range of models of maternity care in metropolitan and regional Victoria, Australia. A mixed-methods approach including cross-sectional surveys and interviews will be used in each case study site, involving both clinicians and consumers. Quantitative data analysis will include descriptive statistics, two-way multivariate analysis of variance for the dependent and independent variables, and χ² analysis to identify the degree of congruence between consumer preferences and experiences. Interview data will be analysed for emerging themes and concepts. Data will then be analysed for convergent lines of enquiry supported by triangulation of data to draw conclusions.
ETHICS AND DISSEMINATION
Organisational ethics approval has been received from the case study sites and Deakin University Human Research Ethics Committee (2014-238). Dissemination of the results of the Labouring Together study will be via peer-reviewed publications and conference presentations, and in written reports for each case study site to support organisational change.
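The planned χ² analysis of congruence between consumer preferences and experiences amounts to a test on a preference-by-experience cross-tabulation. A minimal sketch follows; the three-category coding of care models and the table counts are hypothetical.

```python
# Sketch: chi-squared test of congruence between preferred and experienced
# models of maternity care. The cross-tabulated counts are hypothetical.
import numpy as np
from scipy.stats import chi2_contingency

# rows: preferred model of care; columns: experienced model of care
table = np.array([
    [40, 10,  5],
    [ 8, 35, 12],
    [ 6,  9, 30],
])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, df = {dof}, p = {p:.3f}")
```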
Affiliation(s)
- Vanessa Watkins
- School of Nursing and Midwifery, Deakin University, Burwood, Victoria, Australia
- Eastern Health, Women and Children Program, Victoria, Australia
- Cate Nagle
- School of Nursing and Midwifery, Deakin University, Burwood, Victoria, Australia
- Women's and Children's Division, Western Health, Sunshine Hospital, Victoria, Australia
- Centre for Quality and Patient Safety Research, Deakin University, Geelong, Victoria, Australia
- Alison M Hutchinson
- School of Nursing and Midwifery, Deakin University, Burwood, Victoria, Australia
- Centre for Quality and Patient Safety Research, Deakin University, Geelong, Victoria, Australia
- Centre for Nursing Research, Deakin University and Monash Health Partnership, Monash Health, Victoria, Australia
6
Clark MA, Roman A, Rogers ML, Tyler DA, Mor V. Surveying multiple health professional team members within institutional settings: an example from the nursing home industry. Eval Health Prof 2014;37:287-313. PMID: 24500999; PMCID: PMC4380513; DOI: 10.1177/0163278714521633.
Abstract
Quality improvement and cost containment initiatives in health care increasingly involve interdisciplinary teams of providers. To understand organizational functioning, information is often needed from multiple members of a leadership team, since no one person may have sufficient knowledge of all aspects of the organization. To minimize survey burden, it is ideal to ask unique questions of each member of the leadership team in areas of their expertise. However, this risks substantial missing data if all eligible members of the organization do not respond to the survey. Nursing home administrators (NHAs) and directors of nursing (DoNs) play important roles in the leadership of long-term care facilities. Surveys were administered to NHAs and DoNs from a random, nationally representative sample of U.S. nursing homes about state policies, market forces, and organizational factors that affect provider performance and residents' outcomes. Responses were obtained from a total of 2,686 facilities (response rate [RR] = 66.6%) in which at least one individual completed the questionnaire and 1,693 facilities (RR = 42.0%) in which both providers participated. No evidence of nonresponse bias was detected. A high-quality representative sample of two providers in a long-term care facility can be obtained. It is possible to optimize data collection by obtaining unique information about the organization from each provider while minimizing the number of items asked of each individual. However, sufficient resources must be available for follow-up to nonresponders, with particular attention paid to lower-resourced, lower-quality facilities caring for higher-acuity residents in highly competitive nursing home markets.
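The two facility-level response rates reported above, and a simple check for differential response on a facility characteristic, can be computed as in this sketch. The sampled-facility denominator and the for-profit counts are hypothetical (the denominator is chosen only to be roughly consistent with the reported 66.6% and 42.0% rates).

```python
# Sketch: facility-level response rates and a two-proportion test for
# differential response by a (hypothetical) for-profit indicator.
from statsmodels.stats.proportion import proportions_ztest

n_sampled = 4033        # hypothetical number of eligible facilities
any_response = 2686     # at least one provider (NHA or DoN) responded
both_response = 1693    # both providers responded

print(f"any-provider response rate:  {any_response / n_sampled:.1%}")
print(f"both-provider response rate: {both_response / n_sampled:.1%}")

# For-profit facilities among responders vs nonresponders (hypothetical counts)
for_profit_counts = [1800, 950]
group_sizes = [any_response, n_sampled - any_response]
stat, p = proportions_ztest(for_profit_counts, group_sizes)
print(f"z = {stat:.2f}, p = {p:.3f}")
```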
Affiliation(s)
- Melissa A Clark
- School of Public Health, Brown University, Providence, RI, USA
- Anthony Roman
- Center for Survey Research, University of Massachusetts-Boston, Boston, MA, USA
- Denise A Tyler
- School of Public Health, Brown University, Providence, RI, USA
- Vincent Mor
- School of Public Health, Brown University, Providence, RI, USA
7
Lewis EF, Hardy M, Snaith B. Estimating the Effect of Nonresponse Bias in a Survey of Hospital Organizations. Eval Health Prof 2013;36:330-51. DOI: 10.1177/0163278713496565.
Abstract
Nonresponse bias in survey research can result in misleading or inaccurate findings, and assessment of nonresponse bias is advocated to determine response sample representativeness. Four methods of assessing nonresponse bias (analysis of known characteristics of a population, subsampling of nonresponders, wave analysis, and linear extrapolation) were applied to the results of a postal survey of U.K. hospital organizations. The purpose was to establish whether validated methods for assessing nonresponse bias at the individual level can be successfully applied to an organizational-level survey. The aim of the initial survey was to investigate trends in the implementation of radiographer abnormality detection schemes, and a response rate of 63.7% (325/510) was achieved. This study identified conflicting trends in the outcomes of nonresponse bias analysis across the different methods applied, and we were unable to validate the continuum of resistance theory as applied to organizational survey data. Further work is required to ensure established nonresponse bias analysis approaches can be successfully applied to organizational survey data. Until then, it is suggested that a combination of methods should be used to enhance the rigor of survey analysis.
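Two of the methods applied here, wave analysis and linear extrapolation, rest on the continuum-of-resistance idea: later response waves are treated as progressively more like nonresponders, and the trend is projected one wave beyond the last. A minimal sketch, with hypothetical per-wave means of a key survey outcome:

```python
# Sketch: wave analysis with linear extrapolation to estimate a value for
# nonresponders. Wave means are hypothetical.
import numpy as np

waves = np.array([1, 2, 3, 4])                           # initial mailing + three reminders
scheme_prevalence = np.array([0.62, 0.55, 0.50, 0.46])   # outcome mean among each wave's responders

slope, intercept = np.polyfit(waves, scheme_prevalence, 1)
nonresponder_estimate = slope * (waves[-1] + 1) + intercept
print(f"extrapolated nonresponder value: {nonresponder_estimate:.2f}")
```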
Affiliation(s)
- Emily F. Lewis
- University of Bradford, Bradford, UK
- Mid Yorkshire Hospitals NHS Trust, Wakefield, UK
8
Miller SC, Looze J, Shield R, Clark MA, Lepore M, Tyler D, Sterns S, Mor V. Culture change practice in U.S. nursing homes: prevalence and variation by state Medicaid reimbursement policies. Gerontologist 2013;54:434-45. PMID: 23514674; DOI: 10.1093/geront/gnt020.
Abstract
PURPOSE OF THE STUDY
To estimate the prevalence of culture change practice in U.S. nursing homes (NHs) and examine how state Medicaid policies may be associated with this prevalence.
DESIGN AND METHODS
In 2009/2010, we conducted a survey of a stratified proportionate random sample of NH directors of nursing (DONs) and administrators (NHAs) at 4,149 U.S. NHs; contact was achieved with 3,695. Cooperation rates were 62.6% for NHAs and 61.5% for DONs. Questions focused on NH (physical) environment, resident-centered care, and staff empowerment domains. Domain scores were created and validated, in part, using qualitative interviews from 64 NHAs. Other NH covariate data were from Medicare/Medicaid surveys (Online Survey, Certification and Reporting), aggregated resident assessments (Minimum Data Set), and Medicare claims. Medicaid policies studied were a state's average NH reimbursement rate and pay-for-performance (P4P) reimbursement (including and not including culture change performance measures). Multivariate generalized ordered logit regressions were used.
RESULTS
Eighty-five percent of DONs reported some culture change implementation. Controlling for NH attributes, a $10 higher Medicaid rate was associated with higher NH environment scores. Compared with NHs in non-P4P states, NHs in states with P4P including culture change performance measures had twice the likelihood of superior culture change scores across all domains, and NHs in other P4P states had superior physical environment and staff empowerment scores. Qualitative interviews supported the validity of survey results.
IMPLICATIONS
Changes in Medicaid reimbursement policies may be a promising strategy for increasing culture change practice implementation. Future research examining NH culture change practice implementation pre-post P4P policy changes is recommended.
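The generalized ordered logit used in DESIGN AND METHODS relaxes the proportional-odds assumption; in its unconstrained form it is equivalent to fitting a separate binary logit at each cumulative cut-point of the culture change score. The sketch below shows that formulation; the simulated data and variable names are hypothetical.

```python
# Sketch: unconstrained generalized ordered logit, fitted as one binary logit
# per cumulative cut-point of an ordinal culture change category (0-3).
# Data and variable names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({
    "medicaid_rate": rng.normal(170, 20, n),   # average Medicaid rate, $ per resident day
    "p4p_culture": rng.integers(0, 2, n),      # P4P includes culture change measures
})
df["cc_cat"] = pd.cut(df["medicaid_rate"] + rng.normal(0, 25, n), 4, labels=False)

X = sm.add_constant(df[["medicaid_rate", "p4p_culture"]])
for cut in (1, 2, 3):                          # model P(category >= cut) at each threshold
    y = (df["cc_cat"] >= cut).astype(int)
    fit = sm.Logit(y, X).fit(disp=0)
    ors = np.exp(fit.params[["medicaid_rate", "p4p_culture"]]).round(2)
    print(f"cut >= {cut}:", ors.to_dict())
```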
Affiliation(s)
- Susan C Miller
- Address correspondence to Susan C. Miller, Center for Gerontology & Health Care Research, Department of Health Services, Policy and Practice, Warren Alpert Medical School, Brown University, 121 South Main Street, G-S121-6, Providence, RI 02912.