51.
Abstract
The general linear mixed model provides a useful approach for analysing a wide variety of data structures which practising statisticians often encounter. Two such data structures which can be problematic to analyse are unbalanced repeated measures data and longitudinal data. Owing to recent advances in methods and software, the mixed model analysis is now readily available to data analysts. The model is similar in many respects to ordinary multiple regression, but because it allows correlation between the observations, it requires additional work to specify models and to assess goodness-of-fit. The extra complexity involved is compensated for by the additional flexibility it provides in model fitting. The purpose of this tutorial is to provide readers with a sufficient introduction to the theory to understand the method and a more extensive discussion of model fitting and checking in order to provide guidelines for its use. We provide two detailed case studies, one a clinical trial with repeated measures and dropouts, and one an epidemiological survey with longitudinal follow-up.
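The marginal form of the mixed-model analysis this tutorial covers can be conveyed in a few lines. The sketch below is illustrative only (data, parameter values, and variable names are invented, and the variance components are treated as known rather than estimated by ML or REML as the tutorial would): it fits the fixed effects by generalized least squares under the compound-symmetry covariance that a random intercept induces.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated repeated-measures data: m subjects, n_i = 4 visits each.
# A random intercept per subject induces within-subject correlation.
m, n_i = 50, 4
sigma2_b, sigma2_e = 2.0, 1.0          # between- and within-subject variances
time = np.tile(np.arange(n_i, dtype=float), m)
b = rng.normal(0.0, np.sqrt(sigma2_b), m)            # random intercepts
y = 1.0 + 0.5 * time + np.repeat(b, n_i) + rng.normal(0.0, np.sqrt(sigma2_e), m * n_i)
X = np.column_stack([np.ones(m * n_i), time])

# GLS estimate of the fixed effects under the implied marginal covariance
# V_i = sigma2_e * I + sigma2_b * J for each subject.
V_i = sigma2_e * np.eye(n_i) + sigma2_b * np.ones((n_i, n_i))
Vinv_i = np.linalg.inv(V_i)

XtVX = np.zeros((2, 2))
XtVy = np.zeros(2)
for i in range(m):
    idx = slice(i * n_i, (i + 1) * n_i)
    XtVX += X[idx].T @ Vinv_i @ X[idx]
    XtVy += X[idx].T @ Vinv_i @ y[idx]
beta_gls = np.linalg.solve(XtVX, XtVy)   # recovers roughly (1.0, 0.5)
```

In a real analysis the covariance structure would itself be chosen and checked, which is the extra work (and flexibility) the abstract refers to.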
52. Fitzmaurice GM, Laird NM. Regression models for mixed discrete and continuous responses with potentially missing values. Biometrics 1997; 53:110-22. [PMID: 9147588]
Abstract
In this paper a likelihood-based method for analyzing mixed discrete and continuous regression models is proposed. We focus on marginal regression models, that is, models in which the marginal expectation of the response vector is related to covariates by known link functions. The proposed model is based on an extension of the general location model of Olkin and Tate (1961, Annals of Mathematical Statistics 32, 448-465), and can accommodate missing responses. When there are no missing data, our particular choice of parameterization yields maximum likelihood estimates of the marginal mean parameters that are robust to misspecification of the association between the responses. This robustness property does not, in general, hold for the case of incomplete data. There are a number of potential benefits of a multivariate approach over separate analyses of the distinct responses. First, a multivariate analysis can exploit the correlation structure of the response vector to address intrinsically multivariate questions. Second, multivariate test statistics allow for control over the inflation of the type I error that results when separate analyses of the distinct responses are performed without accounting for multiple comparisons. Third, it is generally possible to obtain more precise parameter estimates by accounting for the association between the responses. Finally, separate analyses of the distinct responses may be difficult to interpret when there is nonresponse because different sets of individuals contribute to each analysis. Furthermore, separate analyses can introduce bias when the missing responses are missing at random (MAR). A multivariate analysis can circumvent both of these problems. The proposed methods are applied to two biomedical datasets.
53. Bates DW, Spell N, Cullen DJ, Burdick E, Laird N, Petersen LA, Small SD, Sweitzer BJ, Leape LL. The costs of adverse drug events in hospitalized patients. Adverse Drug Events Prevention Study Group. JAMA 1997; 277:307-11. [PMID: 9002493]
Abstract
OBJECTIVE To assess the additional resource utilization associated with an adverse drug event (ADE). DESIGN Nested case-control study within a prospective cohort study. PARTICIPANTS The cohort included 4108 admissions to a stratified random sample of 11 medical and surgical units in 2 tertiary-care hospitals over a 6-month period. Cases were patients with an ADE, and the control for each case was the patient on the same unit as the case with the most similar pre-event length of stay. MAIN OUTCOME MEASURES Postevent length of stay and total costs. METHODS Incidents were detected by self-report stimulated by nurses and pharmacists and by daily chart review, and were classified as to whether they represented ADEs. Information on length of stay and charges was obtained from billing data, and costs were estimated by multiplying components of charges times hospital-specific ratios of costs to charges. RESULTS During the study period, there were 247 ADEs among 207 admissions. After outliers and multiple episodes were excluded, there were 190 ADEs, of which 60 were preventable. In paired regression analyses adjusting for multiple factors, including severity, comorbidity, and case mix, the additional length of stay associated with an ADE was 2.2 days (P=.04), and the increase in cost associated with an ADE was $3244 (P=.04). For preventable ADEs, the increases were 4.6 days in length of stay (P=.03) and $5857 in total cost (P=.07). After adjusting for our sampling strategy, the estimated postevent costs attributable to an ADE were $2595 for all ADEs and $4685 for preventable ADEs. Based on these costs and data about the incidence of ADEs, we estimate that the annual costs attributable to all ADEs and preventable ADEs for a 700-bed teaching hospital are $5.6 million and $2.8 million, respectively. CONCLUSIONS The substantial costs of ADEs to hospitals justify investment in efforts to prevent these events. Moreover, these estimates are conservative because they do not include the costs of injuries to patients or malpractice costs.
54.
Abstract
Since Wu and Carroll (Biometrics 44, 175-188) proposed a model for longitudinal progression in the presence of informative dropout, several researchers have developed and studied models for situations where both a vector of repeated outcomes and an event time is available for each subject. These models have been developed for either longitudinal studies with dropout or for survival studies in which a random, time-varying covariate is measured repeatedly across time. When inference about the longitudinal variable is of interest, event times are treated as covariates and are often incomplete due to censoring. If survival or event time is the primary endpoint, repeated outcomes observed prior to the event are viewed as covariates; this covariate process is often incomplete, measured with error, or observed at unscheduled times during the study. We review several models which are used to handle incomplete response and covariate data in both survival and longitudinal studies.
55.
Abstract
Many long-term clinical trials collect both a vector of repeated measurements and an event time on each subject; often, the two outcomes are dependent. One example is the use of surrogate markers to predict disease onset or survival. Another is longitudinal trials which have outcome-related dropout. We describe a mixture model for the joint distribution which accommodates incomplete repeated measures and right-censored event times, and provide methods for full maximum likelihood estimation. The methods are illustrated through analysis of data from a clinical trial for a new schizophrenia therapy; in the trial, dropout time is closely related to outcome, and the dropout process differs between treatments. The parameter estimates from the model are used to make a treatment comparison after adjusting for the effects of dropout. An added benefit of the analysis is that it permits using the repeated measures to increase efficiency of estimates of the event time distribution.
56. Hogan JW, Laird NM. Intention-to-treat analyses for incomplete repeated measures data. Biometrics 1996; 52:1002-17. [PMID: 8805765]
Abstract
In a randomized longitudinal clinical trial designed to evaluate two or more rival treatments, an intent-to-treat analysis requires inclusion of all randomized patients, regardless of whether they remain on protocol for the duration of the study. We propose a piecewise linear random effects model for analyzing longitudinal data where the multivariate outcome can depend upon time spent on treatment. The model assumes that data are available on a random sample of subjects after treatment is terminated, and allows either a pragmatic or explanatory analysis (as defined by Schwartz and Lellouch, 1967, Journal of Chronic Diseases 20, 637-648). Full maximum likelihood estimation of the model parameters is carried out using widely available statistical software for repeated measures with missing data and for nonparametric survival curve estimation. Data from a national, multicenter pediatric AIDS clinical trial are analyzed to illustrate implementation and interpretation of the model.
57. Fitzmaurice GM, Laird NM, Zahner GE, Daskalakis C. Bivariate logistic regression analysis of childhood psychopathology ratings using multiple informants. Am J Epidemiol 1995; 142:1194-203. [PMID: 7485066] [DOI: 10.1093/oxfordjournals.aje.a117578]
Abstract
A central issue in studies of risk factors for childhood psychopathology is utilization of the information obtained about the child's mental health status from multiple informants. In this paper, the authors propose a new approach to the analysis of risk factor data when the outcomes are binary ratings (presence/absence of symptoms). This new approach has several attractive features in this setting. The strategy taken is to perform a single analysis using multivariate modeling, in which simultaneous logistic regressions are conducted for the outcomes given by each of several informants. The advantages of this approach include the following: 1) it retains the complete information about case status for each informant; 2) it permits assessment of informant-risk factor interactions as well as "overall" risk factor effects; 3) it provides measures of association between the multiple informants and adjusts for the association between responses in the analysis; and 4) missing data on a subset of respondents can be incorporated in a straightforward way, permitting all subjects with at least one informant to be used in the analysis. To illustrate the methods, the authors present findings on risk factors for measures of "Internalizing" and "Externalizing" behaviors from two surveys using parent and teacher ratings of 6- to 11-year-old children in Connecticut between 1986 and 1989.
58. Leape LL, Bates DW, Cullen DJ, Cooper J, Demonaco HJ, Gallivan T, Hallisey R, Ives J, Laird N, Laffel G. Systems analysis of adverse drug events. ADE Prevention Study Group. JAMA 1995; 274:35-43. [PMID: 7791256]
Abstract
OBJECTIVE To identify and evaluate the systems failures that underlie errors causing adverse drug events (ADEs) and potential ADEs. DESIGN Systems analysis of events from a prospective cohort study. PARTICIPANTS All admissions to 11 medical and surgical units in two tertiary care hospitals over a 6-month period. MAIN OUTCOME MEASURES Errors, proximal causes, and systems failures. METHODS Errors were detected by interviews of those involved. Errors were classified according to proximal cause and underlying systems failure by multidisciplinary teams of physicians, nurses, pharmacists, and systems analysts. RESULTS During this period, 334 errors were detected as the causes of 264 preventable ADEs and potential ADEs. Sixteen major systems failures were identified as the underlying causes of the errors. The most common systems failure was in the dissemination of drug knowledge, particularly to physicians, accounting for 29% of the 334 errors. Inadequate availability of patient information, such as the results of laboratory tests, was associated with 18% of errors. Seven systems failures accounted for 78% of the errors; all could be improved by better information systems. CONCLUSIONS Hospital personnel willingly participated in the detection and investigation of drug use errors and were able to identify underlying systems failures. The most common defects were in systems to disseminate knowledge about drugs and to make drug and patient information readily accessible at the time it is needed. Systems changes to improve dissemination and display of drug and patient data should make errors in the use of drugs less likely.
59. Bates DW, Cullen DJ, Laird N, Petersen LA, Small SD, Servi D, Laffel G, Sweitzer BJ, Shea BF, Hallisey R. Incidence of adverse drug events and potential adverse drug events. JAMA 1995. [PMID: 7791255] [DOI: 10.1001/jama.1995.03530010043033]
60. Xu X, Laird N, Dockery DW, Schouten JP, Rijcken B, Weiss ST. Age, period, and cohort effects on pulmonary function in a 24-year longitudinal study. Am J Epidemiol 1995; 141:554-66. [PMID: 7900723] [DOI: 10.1093/oxfordjournals.aje.a117471]
Abstract
This paper proposes the use of two-factor models (age-period and age-cohort models) to estimate age, period, and cohort effects on pulmonary function by using the data collected in a 24-year longitudinal study in the Netherlands from 1965 to 1990. The analysis included 18,363 pulmonary function measurements on 6,148 subjects aged 20-54 years at the initial visit. The subjects were grouped into four birth cohorts (before 1923, 1923-1934, 1935-1946, and after 1946) and four survey periods (1965-1972, 1973-1978, 1979-1984, and 1985-1990). In the age-cohort model, the decrement in forced expiratory volume in 1 second (FEV1) associated with a yearly increase in age was 28.3 +/- 3.7 ml/year for a man 176 cm tall and 16.0 +/- 1.9 ml/year for a woman 163 cm tall. The estimated acceleration of decline with aging was significant for both men (beta = -0.212; standard error = 0.079 ml) and women (beta = -0.346; standard error = 0.058 ml). Compared with that of the cohort born before 1923, the average level of FEV1 was estimated to increase by 156, 277, and 379 ml, respectively, for the three younger cohorts in men (p = 0.01) and by 133, 213, and 328 ml for the three younger cohorts in women (p < 0.01). In the age-period model, the estimated linear age effect on FEV1 was 36.2 +/- 4.2 ml/year for a man and 30.5 +/- 2.3 ml/year for a woman. The age quadratic term was significant for women, but not for men. Average FEV1 was estimated to be increased by 141, 169, and 250 ml, respectively, for the periods 1973-1978, 1979-1984, and 1985-1990 in men and by 131, 138, and 219 ml in women. These period effects were significant for both men and women. In summary, this study applied the two-factor models to estimate cross-sectional and longitudinal effects of aging on FEV1 and demonstrated significant period and cohort effects, which could be attributed in part to changes in air pollutants, respiratory infections, vaccinations, types of cigarettes, diet, and lifestyles over time.
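The age-cohort model the abstract describes reduces to an ordinary regression with linear and quadratic age terms plus birth-cohort indicators. The sketch below uses simulated data; the coefficient magnitudes loosely echo those reported, but the data-generating process and all names are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical age-cohort model for an FEV1-like outcome (ml):
# linear + quadratic age effects plus indicators for 4 birth cohorts.
n = 2000
age = rng.uniform(20, 55, n)
cohort = rng.integers(0, 4, n)                        # 4 birth-cohort groups
cohort_shift = np.array([0.0, 150.0, 280.0, 380.0])   # ml, vs. oldest cohort
a = age - 40.0                                        # center age at 40 years
fev1 = (3500.0 - 28.0 * a - 0.2 * a**2
        + cohort_shift[cohort] + rng.normal(0, 300, n))

# Design matrix: intercept, age, age^2, and cohort dummies (oldest = reference).
D = np.column_stack([np.ones(n), a, a**2,
                     (cohort[:, None] == np.arange(1, 4)).astype(float)])
beta, *_ = np.linalg.lstsq(D, fev1, rcond=None)
# beta[1] estimates the linear age effect (ml/year); beta[3:] the cohort effects.
```

The age-period model is fit the same way with period indicators in place of cohort indicators; the two cannot be fit jointly with a linear age term because age, period, and cohort are linearly dependent.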
61. Wang-Clow F, Lange M, Laird NM, Ware JH. A simulation study of estimators for rates of change in longitudinal studies with attrition. Stat Med 1995; 14:283-97. [PMID: 7724914] [DOI: 10.1002/sim.4780140307]
Abstract
Many longitudinal studies and clinical trials are designed to compare rates of change over time in one or more outcome variables in several groups. Most such studies have incomplete data because some patients drop out before completing the study. The missing data may induce bias and inefficiency in naive estimates of important parameters. This paper uses Monte Carlo methods to compare the bias and efficiency of several two-stage estimators of the effect of treatment on the mean rate of change when the missing data arise from one of four processes. We also study the validity of confidence intervals and the power of hypothesis tests based on these estimates and their standard errors. In general, the weighted least squares estimator does relatively well, as does an analysis of covariance type estimator proposed by Wu et al. The best estimates of variance components are based on complete cases or maximum likelihood.
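A minimal version of the two-stage approach studied here can be sketched as follows. The dropout mechanism below is completely at random and the weights are simply each subject's design information, both simplifying assumptions; the paper's dropout processes and estimator variants are more elaborate.

```python
import numpy as np

rng = np.random.default_rng(2)

# Two-stage estimation of a group's mean rate of change:
# stage 1 fits a least-squares slope to each subject's available visits,
# stage 2 combines the slopes by weighted least squares.
m, visits = 100, np.arange(5.0)
true_slope = -0.8
slopes, weights = [], []
for i in range(m):
    n_i = rng.integers(2, 6)              # attrition: subjects have 2-5 visits
    t = visits[:n_i]
    y = 10.0 + (true_slope + rng.normal(0, 0.3)) * t + rng.normal(0, 0.5, n_i)
    sxx = np.sum((t - t.mean()) ** 2)
    slopes.append(np.sum((t - t.mean()) * (y - y.mean())) / sxx)
    weights.append(sxx)                   # more spread-out visits -> more weight
slopes, weights = np.array(slopes), np.array(weights)
slope_wls = np.sum(weights * slopes) / np.sum(weights)
```

When dropout depends on the outcome, an estimator like this can be biased, which is precisely what the Monte Carlo comparison in the paper quantifies.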
62. Fitzmaurice GM, Laird NM, Lipsitz SR. Analysing incomplete longitudinal binary responses: a likelihood-based approach. Biometrics 1994; 50:601-12. [PMID: 7981387]
Abstract
In this paper, we describe a likelihood-based method for analysing balanced but incomplete longitudinal binary responses that are assumed to be missing at random. Following the approach outlined in Zhao and Prentice (1990, Biometrika 77, 642-648), we focus on "marginal models" in which the marginal expectation of the response variable is related to a set of covariates. The association between binary responses is modelled in terms of conditional log odds-ratios. We describe a set of scoring equations for jointly estimating both the marginal parameters and the conditional association parameters. An outline of the EM algorithm used to obtain the maximum likelihood estimates is presented. This approach yields valid and efficient estimates when the responses are missing at random, but not necessarily missing completely at random. An example, using data from the Muscatine Coronary Risk Factor Study, is presented to illustrate this methodology.
63.
Abstract
BACKGROUND Common treatments used for severe psoriasis include psoralen and ultraviolet A radiation (PUVA), methotrexate, ultraviolet B (UVB), and tar. These therapies are often used for prolonged periods and may be carcinogenic. METHODS For more than 13 years, the authors have prospectively determined the incidence of skin cancer and use of treatments for psoriasis in a 1380-patient cohort originally enrolled in a therapeutic trial of PUVA at 16 university centers. RESULTS Squamous cell carcinoma (SCC) developed in more than one fourth of patients exposed to high doses of PUVA. In this group, the standard morbidity ratio for these tumors was 83 (95% confidence interval [CI], 72-96) compared with the expected number of these tumors in the general population. High-level exposure to methotrexate is a significant independent risk factor for developing SCC (relative risk, 2.1 for high versus low or no exposure; 95% CI, 1.4-2.8). Metastatic disease developed in seven patients with SCC. No significant increase in the risk of SCC was associated with long term exposure to UVB or topical tar, and no substantial increase in the risk of basal cell carcinoma was noted in association with prolonged use of any of these treatments. CONCLUSIONS Long term exposure to PUVA and methotrexate significantly increases the risk of SCC in patients with psoriasis. This risk should be considered in selection of treatment. The ultimate morbidity of these tumors is undetermined.
64. Lipsitz SR, Fitzmaurice GM, Orav EJ, Laird NM. Performance of generalized estimating equations in practical situations. Biometrics 1994; 50:270-8. [PMID: 8086610]
Abstract
Moment methods for analyzing repeated binary responses have been proposed by Liang and Zeger (1986, Biometrika 73, 13-22), and extended by Prentice (1988, Biometrics 44, 1033-1048). In their generalized estimating equations (GEE), both Liang and Zeger (1986) and Prentice (1988) estimate the parameters associated with the expected value of an individual's vector of binary responses as well as the correlations between pairs of binary responses. In this paper, we discuss one-step estimators, i.e., estimators obtained from one step of the generalized estimating equations, and compare their performance to that of the fully iterated estimators in small samples. In simulations, we find the performance of the one-step estimator to be qualitatively similar to that of the fully iterated estimator. When the sample size is small and the association between binary responses is high, we recommend using the one-step estimator to circumvent convergence problems associated with the fully iterated GEE algorithm. Furthermore, we find the GEE methods to be more efficient than ordinary logistic regression with variance correction for estimating the effect of a time-varying covariate.
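The one-step idea can be written down directly: fit ordinary (independence) logistic regression by Newton-Raphson, estimate an exchangeable working correlation from Pearson residuals, and take a single GEE step from the independence estimate. The following is a simplified sketch of that scheme with simulated clustered binary data, not the authors' estimators in full.

```python
import numpy as np

rng = np.random.default_rng(3)

# Clustered binary data: a shared cluster effect induces within-cluster
# association (all parameter values here are illustrative).
m, n_i = 150, 3                                  # clusters, obs per cluster
x = rng.normal(size=(m, n_i))
u = rng.normal(0, 1.0, m)[:, None]               # shared cluster effect
p_true = 1.0 / (1.0 + np.exp(-(-0.5 + 1.0 * x + u)))
y = (rng.random((m, n_i)) < p_true).astype(float)
X = np.stack([np.ones((m, n_i)), x], axis=-1)    # (m, n_i, 2)

def mu(beta):
    return 1.0 / (1.0 + np.exp(-(X @ beta)))

# Step 0: independence (ordinary logistic) fit by Newton-Raphson.
beta = np.zeros(2)
for _ in range(25):
    p = mu(beta)
    grad = np.einsum('ijk,ij->k', X, y - p)
    hess = np.einsum('ijk,ij,ijl->kl', X, p * (1 - p), X)
    beta = beta + np.linalg.solve(hess, grad)

# Exchangeable working correlation from Pearson residuals.
p = mu(beta)
r = (y - p) / np.sqrt(p * (1 - p))
pairs = [(j, k) for j in range(n_i) for k in range(j + 1, n_i)]
alpha = np.mean([np.mean(r[:, j] * r[:, k]) for j, k in pairs])
R = np.full((n_i, n_i), alpha) + (1 - alpha) * np.eye(n_i)

# One GEE step from the independence estimate.
info = np.zeros((2, 2))
score = np.zeros(2)
for i in range(m):
    v = p[i] * (1 - p[i])
    D = np.diag(v) @ X[i]                        # d mu / d beta for cluster i
    Vinv = np.linalg.inv(np.diag(np.sqrt(v)) @ R @ np.diag(np.sqrt(v)))
    info += D.T @ Vinv @ D
    score += D.T @ Vinv @ (y[i] - p[i])
beta_onestep = beta + np.linalg.solve(info, score)
```

As the abstract suggests, the single step avoids iterating the working-correlation update, which is where convergence problems arise in small, highly associated samples.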
65. Lipsitz SR, Laird NM, Harrington DP. Weighted least squares analysis of repeated categorical measurements with outcomes subject to nonresponse. Biometrics 1994; 50:11-24. [PMID: 8086595]
Abstract
In this paper, we describe a two-step weighted least squares method for analyzing repeated categorical outcomes when some individuals are not observed at all times of follow-up. Other weighted least squares methods for analyzing repeated measures data with missing responses have previously been proposed by Koch, Imrey, and Reinfurt (1972, Biometrics 28, 663-692) and Woolson and Clarke (1984, Journal of the Royal Statistical Society, Series A 147, 87-99). These methods give consistent estimators if the responses are missing completely at random, as discussed in Rubin (1976, Biometrika 63, 581-592). We propose a two-step method that will give consistent results under the weaker condition of missing at random, and compare it with the other two methods.
66.
Abstract
The analysis of serial measurements obtained in longitudinal studies plays an increasingly prominent role in applied research. The last few years have seen the development of many new techniques for carrying out analyses, including computer software. These methods can be used in a variety of standard problems, including repeated measures and cross-over designs, as well as growth curve analyses. We review these new methods, their application, and available computer packages. Data from a longitudinal study of lung function are used to illustrate the methods.
67. Laird NM, Skinner J, Kenward M. An analysis of two-period crossover designs with carry-over effects. Stat Med 1992; 11:1967-79. [PMID: 1480883] [DOI: 10.1002/sim.4780111415]
Abstract
The crossover design is a type of longitudinal study with subjects receiving different treatments in different time periods. When carry-over effects are absent, the usual crossover design is structured so that all the information about treatment effects is contained in the within-subject contrasts; standard analyses are based on these within-subject contrasts and ignore any between-subject information. With carry-over effects present these standard analyses can be very inefficient, especially for suboptimal designs. We describe alternative approaches based on methods for the analysis of longitudinal data.
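For the standard 2x2 (AB/BA) crossover without carry-over, the within-subject analysis the abstract refers to amounts to contrasting period differences between the two sequence groups, so that subject effects cancel. A sketch with hypothetical data:

```python
import numpy as np

rng = np.random.default_rng(4)

# 2x2 crossover, no carry-over assumed: n subjects per sequence,
# treatment effect tau and period effect (all values illustrative).
n = 30
subj_ab = rng.normal(0, 2.0, n)          # subject effects, A-then-B sequence
subj_ba = rng.normal(0, 2.0, n)          # subject effects, B-then-A sequence
tau, period = 1.5, 0.4
e = lambda: rng.normal(0, 0.5, n)        # within-subject error
# sequence AB: period 1 on treatment A, period 2 on control B
y_ab = np.stack([subj_ab + tau + e(), subj_ab + period + e()])
# sequence BA: period 1 on B, period 2 on A
y_ba = np.stack([subj_ba + e(), subj_ba + period + tau + e()])

d_ab = y_ab[0] - y_ab[1]                 # period-1 minus period-2, per subject
d_ba = y_ba[0] - y_ba[1]
tau_hat = (d_ab.mean() - d_ba.mean()) / 2.0   # within-subject estimate of tau
```

With carry-over effects present, the period differences no longer contain all the information about tau, which is why the paper turns to longitudinal-data methods that also recover between-subject information.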
68. Berkey CS, Laird NM, Valadian I, Gardner J. Modelling adolescent blood pressure patterns and their prediction of adult pressures. Biometrics 1991; 47:1005-18. [PMID: 1742427]
Abstract
Tracking of blood pressure in adolescent boys is investigated using a mathematical model that corresponds to progression along a constant percentile. A more general analysis, based on the method of principal components, is also proposed that determines various alternative tracks or patterns that are most prevalent in the longitudinal blood pressure data. The degree of tracking along a constant percentile curve for systolic pressure was moderately high, as evidenced by a tracking index of .78 explaining 81% of the variance, but less strong for diastolic (tracking index of .60) where tracking along a percentile explained 66% of the variance. The value of the more general analysis of blood pressure patterns may lie in the assessment of adolescent risk factors for elevated adult blood pressure. Using adolescent patterns determined by either statistical model, adult systolic at age 38 was predicted (R2 = .22) by the concept of a systolic fixed percentile curve in adolescence, and similarly for diastolic (R2 = .21). However, the more general analysis based on longitudinal principal components further suggests that boys who have a larger than usual systolic peak at age 14 years, which is near the time of the adolescent physical growth spurt in these boys, may be more likely to have higher systolic pressures at age 38. Because the adult data were incomplete and highly unbalanced, these findings were obtained using random-effects models for longitudinal data.
69. Localio AR, Lawthers AG, Brennan TA, Laird NM, Hebert LE, Peterson LM, Newhouse JP, Weiler PC, Hiatt HH. Relation between malpractice claims and adverse events due to negligence. Results of the Harvard Medical Practice Study III. N Engl J Med 1991; 325:245-51. [PMID: 2057025] [DOI: 10.1056/nejm199107253250405]
Abstract
BACKGROUND AND METHODS By matching the medical records of a random sample of 31,429 patients hospitalized in New York State in 1984 with statewide data on medical-malpractice claims, we identified patients who had filed claims against physicians and hospitals. These results were then compared with our findings, based on a review of the same medical records, regarding the incidence of injuries to patients caused by medical management (adverse events). RESULTS We identified 47 malpractice claims among 30,195 patients' records located on our initial visits to the hospitals, and 4 claims among 580 additional records located during follow-up visits. The overall rate of claims per discharge (weighted) was 0.13 percent (95 percent confidence interval, 0.076 to 0.18 percent). Of the 280 patients who had adverse events caused by medical negligence as defined by the study protocol, 8 filed malpractice claims (weighted rate, 1.53 percent; 95 percent confidence interval, 0 to 3.2 percent). By contrast, our estimate of the statewide ratio of adverse events caused by negligence (27,179) to malpractice claims (3570) is 7.6 to 1. This relative frequency overstates the chances that a negligent adverse event will produce a claim, however, because most of the events for which claims were made in the sample did not meet our definition of adverse events due to negligence. CONCLUSIONS Medical-malpractice litigation infrequently compensates patients injured by medical negligence and rarely identifies, and holds providers accountable for, substandard care.
70.
Abstract
An approach is illustrated for the analysis of longitudinal variables collected during adolescence. Since the method requires complete data, three techniques are compared for application when some individuals have missing values. These methods are implemented in a study of systolic blood pressures and dietary fat intakes collected longitudinally in adolescent girls. The value of adolescent systolic and fat variables in predicting systolic pressure in adult women is investigated.
71. Leape LL, Brennan TA, Laird N, Lawthers AG, Localio AR, Barnes BA, Hebert L, Newhouse JP, Weiler PC, Hiatt H. The nature of adverse events in hospitalized patients. Results of the Harvard Medical Practice Study II. N Engl J Med 1991; 324:377-84. [PMID: 1824793] [DOI: 10.1056/nejm199102073240605]
Abstract
BACKGROUND In a sample of 30,195 randomly selected hospital records, we identified 1133 patients (3.7 percent) with disabling injuries caused by medical treatment. We report here an analysis of these adverse events and their relation to error, negligence, and disability. METHODS Two physician-reviewers independently identified the adverse events and evaluated them with respect to negligence, errors in management, and extent of disability. One of the authors classified each event according to type of injury. We tested the significance of differences in rates of negligence and disability among categories with at least 30 adverse events. RESULTS Drug complications were the most common type of adverse event (19 percent), followed by wound infections (14 percent) and technical complications (13 percent). Nearly half the adverse events (48 percent) were associated with an operation. Adverse events during surgery were less likely to be caused by negligence (17 percent) than nonsurgical ones (37 percent). The proportion of adverse events due to negligence was highest for diagnostic mishaps (75 percent), noninvasive therapeutic mishaps ("errors of omission") (77 percent), and events occurring in the emergency room (70 percent). Errors in management were identified for 58 percent of the adverse events, among which nearly half were attributed to negligence. CONCLUSIONS Although the prevention of many adverse events must await improvements in medical knowledge, the high proportion that are due to management errors suggests that many others are potentially preventable now. Reducing the incidence of these events will require identifying their causes and developing methods to prevent error or reduce its effects.
72. Brennan TA, Leape LL, Laird NM, Hebert L, Localio AR, Lawthers AG, Newhouse JP, Weiler PC, Hiatt HH. Incidence of adverse events and negligence in hospitalized patients. Results of the Harvard Medical Practice Study I. N Engl J Med 1991; 324:370-6. [PMID: 1987460] [DOI: 10.1056/nejm199102073240604]
Abstract
BACKGROUND As part of an interdisciplinary study of medical injury and malpractice litigation, we estimated the incidence of adverse events, defined as injuries caused by medical management, and of the subgroup of such injuries that resulted from negligent or substandard care. METHODS We reviewed 30,121 randomly selected records from 51 randomly selected acute care, nonpsychiatric hospitals in New York State in 1984. We then developed population estimates of injuries and computed rates according to the age and sex of the patients as well as the specialties of the physicians. RESULTS Adverse events occurred in 3.7 percent of the hospitalizations (95 percent confidence interval, 3.2 to 4.2), and 27.6 percent of the adverse events were due to negligence (95 percent confidence interval, 22.5 to 32.6). Although 70.5 percent of the adverse events gave rise to disability lasting less than six months, 2.6 percent caused permanently disabling injuries and 13.6 percent led to death. The percentage of adverse events attributable to negligence increased in the categories of more severe injuries (Wald test chi 2 = 21.04, P less than 0.0001). Using weighted totals, we estimated that among the 2,671,863 patients discharged from New York hospitals in 1984 there were 98,609 adverse events and 27,179 adverse events involving negligence. Rates of adverse events rose with age (P less than 0.0001). The percentage of adverse events due to negligence was markedly higher among the elderly (P less than 0.01). There were significant differences in rates of adverse events among categories of clinical specialties (P less than 0.0001), but no differences in the percentage due to negligence. CONCLUSIONS There is a substantial amount of injury to patients from medical management, and many injuries are the result of substandard care.
|
73
|
Abstract
We discuss maximum likelihood methods for analysing binary responses measured at two times, such as in a cross-over design. We construct a 2 × 2 table for each individual with cell probabilities corresponding to the cross-classification of the responses at the two times; the underlying likelihood for each individual is multinomial with four cells. The three-dimensional parameter space of the multinomial distribution is completely specified by the two marginal probabilities of success of the 2 × 2 table and an association parameter between the binary responses at the two times. We examine a logistic model for the marginal probabilities of the 2 × 2 table for individual i; the association parameters we consider are the correlation coefficient, the odds ratio, or the relative risk. Simulations show that the parameter estimates for the logistic regression model for the marginal probabilities are not very sensitive to the parameter used to describe the association between the binary responses at the two times. Thus, we suggest choosing the measure of association for ease of interpretation.
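The parameterization described above (two marginal success probabilities plus an association parameter) determines all four multinomial cell probabilities. A minimal sketch of the odds-ratio version, with function names of my own choosing: given marginals p1, p2 and odds ratio psi, the success-success cell p11 solves a quadratic (the Plackett relation), and the other cells follow by subtraction.

```python
import math

def cells_from_marginals(p1, p2, psi):
    """2x2 cell probabilities from marginal success probs and an odds ratio."""
    if psi == 1.0:                 # independence: the table factorizes
        p11 = p1 * p2
    else:
        # (psi-1)*p11^2 - b*p11 + psi*p1*p2 = 0; take the root in [0, min(p1, p2)]
        b = 1.0 + (psi - 1.0) * (p1 + p2)
        p11 = (b - math.sqrt(b * b - 4.0 * (psi - 1.0) * psi * p1 * p2)) / (2.0 * (psi - 1.0))
    p10, p01 = p1 - p11, p2 - p11  # success at exactly one of the two times
    p00 = 1.0 - p11 - p10 - p01    # failure at both times
    return p11, p10, p01, p00
```

For example, `cells_from_marginals(0.5, 0.5, 4.0)` gives p11 = 1/3, and recomputing (p11·p00)/(p10·p01) from the returned cells recovers the odds ratio of 4.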
|
74
|
Laird NM, Wang F. Estimating rates of change in randomized clinical trials. Controlled Clinical Trials 1990; 11:405-19. [PMID: 1963133] [DOI: 10.1016/0197-2456(90)90018-w]
Abstract
This article deals with the extension of the pretest-posttest clinical trial to the longitudinal data setting. We assume that a baseline (or pretest) measurement is taken on all individuals, who are then randomized, without regard to baseline values, to a treatment group. Repeated measurements are taken postrandomization at specified times. Our objective is to estimate the average rate of change (or slope) in the experimental groups and the differences in the slopes. Our focus is on the optimal use of the baseline measurements in the analysis. We contrast two different approaches: a multivariate one that regards the entire vector of responses (including the baseline) as random outcomes, and a univariate one that uses each individual's least-squares slope as an outcome. Our multivariate approach is essentially a generalization of Stanek's seemingly unrelated regression (SUR) estimator for the pretest-posttest design. The multivariate approach is natural to apply in this setting, and optimal if the assumed model is correct. However, the most efficient estimator requires assuming that the baseline mean parameters are the same for all experimental groups. Although this assumption is reasonable in the randomized setting, the resulting multivariate estimator uses postrandomization data as a covariate; if the assumed linear model is not correct, this can lead to distortions in the estimated treatment effect. We propose instead a reduced-form multivariate estimator that may be somewhat less efficient but protects against model misspecification.
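The univariate approach described above reduces each individual's repeated measures to a single summary statistic, the least-squares slope, before comparing groups. A minimal sketch (function names and the synthetic data are mine, not the paper's):

```python
def ls_slope(times, values):
    """Ordinary least-squares slope of values regressed on measurement times."""
    n = len(times)
    tbar = sum(times) / n
    ybar = sum(values) / n
    num = sum((t - tbar) * (y - ybar) for t, y in zip(times, values))
    den = sum((t - tbar) ** 2 for t in times)
    return num / den

def mean_group_slope(subjects):
    """Average per-subject slope; subjects is a list of (times, values) pairs."""
    slopes = [ls_slope(t, y) for t, y in subjects]
    return sum(slopes) / len(slopes)
```

With two subjects measured at times 0-3 whose responses rise by exactly 1 and 2 units per visit, `mean_group_slope` returns 1.5; the group difference in such averages is the univariate treatment-effect estimate.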
|
75
|
Brennan TA, Localio AR, Leape LL, Laird NM, Peterson L, Hiatt HH, Barnes BA. Identification of adverse events occurring during hospitalization. A cross-sectional study of litigation, quality assurance, and medical records at two teaching hospitals. Ann Intern Med 1990; 112:221-6. [PMID: 2404447] [DOI: 10.7326/0003-4819-112-3-221]
Abstract
STUDY OBJECTIVES: To estimate the efficacy of medical record review for identifying adverse events and negligent care suffered by hospitalized patients.
DESIGN: Cross-sectional study comparing an objective medical record review with information available from hospital quality assurance records as well as risk management and litigation records.
SETTING: Two metropolitan teaching hospitals in the northeastern United States.
MEASUREMENTS AND MAIN RESULTS: Using the litigation and risk management records as a criterion standard, we found that the medical record review had a sensitivity of 80% (93 of 116; 95% CI, 73% to 88%) for discovering adverse events and a sensitivity of 76% (51 of 67; 95% CI, 66% to 86%) for discovering negligent care. We estimated that record review of a random sample of hospitalizations across a geographic region would have even higher sensitivity (adverse-event sensitivity, 84%; negligence sensitivity, 80%). Moreover, we found that the adverse events we failed to discover led to less costly malpractice claims. A substantial number of adverse events (20 of 172) never gave rise to litigation or risk management investigation; six of the twenty were due to negligent care. Quality assurance efforts at the level of the clinical departments in one hospital led to review of only 12 of 82 risk management records.
CONCLUSIONS: The overwhelming majority of adverse events and episodes of negligent care are discoverable with the methods we used to evaluate medical records. Quality assurance efforts using similar record review methods should be further evaluated.
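The sensitivities above are simple binomial proportions, and a plain Wald interval approximately reproduces the reported confidence limits (a sketch; the paper does not state which interval method it used):

```python
import math

def wald_ci(successes, n, z=1.96):
    """Point estimate and approximate Wald 95% CI for a binomial proportion."""
    p = successes / n
    half = z * math.sqrt(p * (1.0 - p) / n)
    return p, p - half, p + half

# 93 of 116 adverse events found by record review:
p, lo, hi = wald_ci(93, 116)    # ≈ 0.80, CI ≈ (0.73, 0.87) vs reported 73%-88%
# 51 of 67 episodes of negligent care found:
p2, lo2, hi2 = wald_ci(51, 67)  # ≈ 0.76, CI ≈ (0.66, 0.86) vs reported 66%-86%
```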
|