Performance results for a workstation-integrated radiology peer review quality assurance program.
Int J Qual Health Care. 2016 Jun;28(3):294-8.

Abstract

OBJECTIVE

To assess review completion rates, RADPEER score distribution, and sources of disagreement when using a workstation-integrated radiology peer review program, and to evaluate radiologist perceptions of the program.

DESIGN

Retrospective review of prospectively collected data.

SETTING

Large private outpatient radiology practice.

PARTICIPANTS

Radiologists (n = 66) with a mean of 16.0 (standard deviation, 9.2) years of experience.

INTERVENTIONS

Prior studies and reports of cases being actively reported were randomly selected for peer review using the RADPEER scoring system (a 4-point scale, with a score of 1 indicating agreement and scores of 2-4 indicating increasing levels of disagreement).

MAIN OUTCOME MEASURES

Assigned peer review completion rates, review scores, sources of disagreement and radiologist survey responses.

RESULTS

Of 31 293 assigned cases, 29 044 (92.8%; 95% CI 92.5-93.1%) were reviewed. Discrepant scores (score = 2, 3 or 4) were given in 0.69% (95% CI 0.60-0.79%) of cases and clinically significant discrepancy (score = 3 or 4) was assigned in 0.42% (95% CI 0.35-0.50%). The most common cause of disagreement was missed diagnosis (75.2%; 95% CI 66.8-82.1%). By anonymous survey, 94% of radiologists felt that peer review was worthwhile, 90% reported that the scores they received were appropriate and 78% felt that the received feedback was valuable.
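
As a quick arithmetic check on these figures, the short Python sketch below recomputes the completion-rate estimate and its 95% confidence interval from the counts reported above. It assumes a normal-approximation (Wald) interval; the abstract does not state which interval method the authors used, and the helper proportion_ci is illustrative rather than the study's own code.

    import math

    def proportion_ci(k, n, z=1.96):
        # Point estimate and normal-approximation (Wald) 95% CI for a proportion.
        # Assumed method: the abstract does not say how its CIs were computed.
        p = k / n
        se = math.sqrt(p * (1 - p) / n)
        return p, p - z * se, p + z * se

    # Completion rate: 29 044 of 31 293 assigned cases were reviewed.
    p, lo, hi = proportion_ci(29044, 31293)
    print(f"completion: {p:.1%} (95% CI {lo:.1%}-{hi:.1%})")  # ~92.8% (92.5%-93.1%)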

CONCLUSION

Workstation-based peer review can increase completion rates and levels of radiologist acceptance while producing RADPEER scores similar to those previously reported. This approach may be one way to increase radiologist engagement in peer review quality assurance.

Authors and Affiliations

Margaret M. O'Keeffe: Department of Radiology and Diagnostic Imaging, University of Alberta, Edmonton, Alberta, Canada; Medical Imaging Consultants, 11010-101 Street, Edmonton, Alberta, Canada T5H 4B9.
Todd M. Davis: Intelerad, Montreal, Quebec, Canada. Present address: 295 Midpark Way SE, Suite 380, Calgary, Alberta, Canada T2X 2A8.
Kerry Siminoski: Department of Radiology and Diagnostic Imaging, University of Alberta, Edmonton, Alberta, Canada; Medical Imaging Consultants, 11010-101 Street, Edmonton, Alberta, Canada T5H 4B9.

Pub Type(s)

Journal Article

Language

eng

PubMed ID

26892609

Citation

O'Keeffe, Margaret M., et al. "Performance Results for a Workstation-integrated Radiology Peer Review Quality Assurance Program." International Journal for Quality in Health Care: Journal of the International Society for Quality in Health Care, vol. 28, no. 3, 2016, pp. 294-8.
O'Keeffe MM, Davis TM, Siminoski K. Performance results for a workstation-integrated radiology peer review quality assurance program. Int J Qual Health Care. 2016;28(3):294-8.
O'Keeffe, M. M., Davis, T. M., & Siminoski, K. (2016). Performance results for a workstation-integrated radiology peer review quality assurance program. International Journal for Quality in Health Care: Journal of the International Society for Quality in Health Care, 28(3), 294-8. https://doi.org/10.1093/intqhc/mzw017
O'Keeffe MM, Davis TM, Siminoski K. Performance results for a workstation-integrated radiology peer review quality assurance program. Int J Qual Health Care. 2016;28(3):294-8. PubMed PMID: 26892609.
Keywords

consensus methods; human factors; medical errors; peer assessment; quality improvement; quality indicators