Document Detail

The Complementary Nature of Peer Review and Quality Assurance Data Collection.
MedLine Citation:
PMID:  25188432     Owner:  NLM     Status:  Publisher    
Purpose: To assess the complementary natures of (a) a peer review (PR)-mandated database for physician review and discrepancy reporting and (b) a voluntary quality assurance (QA) system for anecdotal reporting.

Materials and Methods: This study was institutional review board approved and HIPAA compliant; informed consent was waived. Submissions to the voluntary QA and mandatory PR databases were searched for obstetrics and gynecology-related keywords. Cases were graded independently by two radiologists, with final grades resolved via consensus. Errors were categorized as perceptual, interpretive, communication related, or procedural. The effect of errors was assessed in terms of clinical and radiologic follow-up.

Results: There were 185 and 64 cases with issues, attributed to 32 and 27 radiologists, in the QA and PR databases, respectively; 23 and nine radiologists, respectively, had cases attributed only to them. Procedure-related entries were submitted almost exclusively through the QA database (62 of 64 [97%]). In the QA and PR databases, respectively, perceptual (47 of 185 [25%] and 27 of 64 [42%]) and interpretive (64 of 185 [34%] and 30 of 64 [47%]) issues constituted most errors. Most entries in both databases (104 of 185 [56%] in QA and 49 of 64 [76%] in PR) were considered minor events: wording in the report; findings already known from patient history, prior imaging, or concurrent follow-up imaging; or delay in diagnosing a benign finding. The databases had similar percentages of moderate events (28 of 185 [15%] in QA and nine of 64 [14%] in PR), such as recommending unnecessary follow-up imaging or radiation exposure in pregnancy without knowing the patient was pregnant. The PR database had fewer major events (one of 64 [1.6%]) than the QA database (32 of 185 [17%]).
Conclusion: The two quality improvement systems are complementary, with the QA database yielding less frequent but more clinically important errors, while the PR database serves to establish benchmarks for error rate in radiologists' performance. © RSNA, 2014. Online supplemental material is available for this article.
Olga R Brook; Janneth Romero; Alexander Brook; Jonathan B Kruskal; Chun S Yam; Deborah Levine
Publication Detail:
Type:  JOURNAL ARTICLE     Date:  2014-9-1
Journal Detail:
Title:  Radiology     Volume:  -     ISSN:  1527-1315     ISO Abbreviation:  Radiology     Publication Date:  2014 Sep 
Date Detail:
Created Date:  2014-9-4     Completed Date:  -     Revised Date:  -    
Medline Journal Info:
Nlm Unique ID:  0401260     Medline TA:  Radiology     Country:  -    
Other Details:
Languages:  ENG     Pagination:  132931     Citation Subset:  -    
From MEDLINE®/PubMed®, a database of the U.S. National Library of Medicine
