Guérin G, Jamali S, Soto CA, Guilbert F, Raymond J. Interobserver agreement in the interpretation of outpatient head CT scans in an academic neuroradiology practice. AJNR Am J Neuroradiol 2014;36:24-9. [PMID: 25059693] [DOI: 10.3174/ajnr.a4058]
Abstract
BACKGROUND AND PURPOSE
The repeatability of head CT interpretations may be studied in different contexts: in peer-review quality assurance interventions or in interobserver agreement studies. We assessed the agreement between double-blind reports of outpatient CT scans in a routine academic practice.
MATERIALS AND METHODS
Outpatient head CT scans (119 patients) were randomly selected to be read twice in a blinded fashion by 8 neuroradiologists practicing in an academic institution during a 1-year period. Nonstandardized reports were analyzed according to a standardized data-extraction form to extract 4 items from each report (answer to the clinical question, major findings, incidental findings, recommendations for further investigations) and to identify agreement or discrepancies, classified as class 2 (finding mentioned in only one of the two reports, or contradiction between reports), class 1 (mentioned in both reports but diverging in location or severity), class 0 (concordant), or not applicable. Agreement regarding the presence or absence of clinically significant or incidental findings was assessed with κ statistics.
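As an illustration of the κ statistic used for presence/absence agreement, the sketch below computes Cohen's κ and raw concordance for two blinded readings of the same scans; the labels and the scikit-learn call are assumptions for demonstration only, not the authors' analysis code or data.

```python
# Illustrative sketch only: Cohen's kappa for two blinded reads of the same scans.
# Labels are hypothetical; 1 = clinically pertinent finding reported, 0 = none.
from sklearn.metrics import cohen_kappa_score

first_read  = [1, 0, 0, 1, 1, 0, 1, 0, 0, 1]   # report 1 per scan (hypothetical)
second_read = [1, 0, 1, 1, 1, 0, 1, 0, 0, 0]   # blinded report 2 per scan (hypothetical)

# Chance-corrected agreement between the two readings
kappa = cohen_kappa_score(first_read, second_read)

# Raw concordance: proportion of scans with identical positive/negative calls
concordance = sum(a == b for a, b in zip(first_read, second_read)) / len(first_read)

print(f"kappa = {kappa:.2f}, raw concordance = {concordance:.1%}")
```

Because κ corrects for agreement expected by chance, the two numbers can differ noticeably even on the same set of readings, which is why the abstract reports both.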
RESULTS
The interobserver κ for head CT studies with positive and negative results for clinically pertinent findings was 0.86 (0.77-0.95), but overall concordance was only 75.6% (67.2%-82.5%). A class 2 discrepancy was found in 15.1% of cases and a class 1 discrepancy in 9.2%. The κ value for reporting incidental findings was 0.59 (0.45-0.74), with a class 2 discrepancy in 29.4% of cases. Most discrepancies did not affect the clinical management of patients.
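As an illustrative consistency check (an assumption, not a figure given in the abstract), if the concordance and discrepancy percentages are taken over the 119 scans they correspond to roughly 90 concordant, 18 class 2, and 11 class 1 interpretations:

```python
# Back-of-the-envelope conversion of reported percentages to case counts
# (illustrative only; the paper does not report these counts in the abstract).
n_scans = 119
for label, pct in [("concordant", 75.6),
                   ("class 2 discrepancy", 15.1),
                   ("class 1 discrepancy", 9.2)]:
    print(f"{label}: ~{round(n_scans * pct / 100)} of {n_scans} scans")
```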
CONCLUSIONS
Discrepancies in double-blind interpretations of head CT examinations were more common than reported in peer-review quality assurance programs.