
A workstation-integrated peer review quality assurance program: pilot study.
BMC Med Imaging. 2013 Jul 04; 13:19.

Abstract

BACKGROUND

The surrogate indicator of radiological excellence that has become accepted is consistency of assessments between radiologists, and the technique that has become the standard for evaluating concordance is peer review. This study describes the results of a workstation-integrated peer review program in a busy outpatient radiology practice.

METHODS

Workstation-based peer review was performed using the software program Intelerad Peer Review. Cases for review were randomly chosen from those being actively reported. If an appropriate prior study was available, and if the reviewing radiologist and the original interpreting radiologist had not exceeded review targets, the case was scored using the modified RADPEER system.

RESULTS

There were 2,241 cases randomly assigned for peer review. Of selected cases, 1,705 (76%) were interpreted. Reviewing radiologists agreed with prior reports in 99.1% of assessments. Positive feedback (score 0) was given in three cases (0.2%) and concordance (scores of 0 to 2) was assigned in 99.4%, similar to reported rates of 97.0% to 99.8%. Clinically significant discrepancies (scores of 3 or 4) were identified in 10 cases (0.6%). Eighty-eight percent of reviewed radiologists found the reviews worthwhile, 79% found scores appropriate, and 65% felt feedback was appropriate. Two-thirds of radiologists found case rounds discussing significant discrepancies to be valuable.

CONCLUSIONS

The workstation-based computerized peer review process used in this pilot project was seamlessly incorporated into the normal workday and met most criteria for an ideal peer review system. Clinically significant discrepancies were identified in 0.6% of cases, similar to published outcomes using the RADPEER system. Reviewed radiologists felt the process was worthwhile.

Authors

Department of Radiology and Diagnostic Imaging, University of Alberta, and Medical Imaging Consultants, 11010-101 Street, Edmonton, AB T5H 4B9, Canada.

Pub Type(s)

Journal Article

Language

eng

PubMed ID

23822583

Citation

O'Keeffe, Margaret M., et al. "A Workstation-integrated Peer Review Quality Assurance Program: Pilot Study." BMC Medical Imaging, vol. 13, 2013, p. 19.
O'Keeffe MM, Davis TM, Siminoski K. A workstation-integrated peer review quality assurance program: pilot study. BMC Med Imaging. 2013;13:19.
O'Keeffe, M. M., Davis, T. M., & Siminoski, K. (2013). A workstation-integrated peer review quality assurance program: pilot study. BMC Medical Imaging, 13, 19. https://doi.org/10.1186/1471-2342-13-19
O'Keeffe MM, Davis TM, Siminoski K. A Workstation-integrated Peer Review Quality Assurance Program: Pilot Study. BMC Med Imaging. 2013 Jul 4;13:19. PubMed PMID: 23822583.
* Article titles in AMA citation format should be in sentence-case