Peer review has become an essential component of a comprehensive radiology department quality assurance program. Multiple commercial programs, such as RADPEER, are available to fill this need but may be limited by low radiologist compliance and by delayed or limited feedback. Consequently, these peer review programs may not achieve the larger goal of improving diagnostic quality. This article presents data from a peer review system implemented in an academic radiology group at a large urban multidisciplinary children's hospital; the system offered instantaneous feedback with an enhanced comment feature for peer radiologists.
Peer review data were collected on 5278 radiologic studies over a 12-month period involving 15 radiologists. The data were analyzed for compliance rate, discrepancy rate, and comment usage.
The compliance rate for peer review averaged 52% for the 12-month period. The compliance rate trended upward over the course of the year, with a final month's compliance rate of 76%. The discrepancy rate between original interpretation and peer review was 3.6%. Comments were voluntarily included in 7.3% of nondiscrepant peer review scores.
Our peer review process was enhanced by real-time, comment-enriched feedback on both discrepant and nondiscrepant peer reviews. We demonstrate improved radiologist compliance over the course of a year in a peer review program with no incentives or penalties for performing reviews. To our knowledge, no compliance rates for comparable programs have been published in the current literature.