RSNA 2016

Abstract Archives of the RSNA, 2016


SSG06-04

How Satisfied are Patients with their Radiologists? Assessment Using a Patient Ratings Website

Tuesday, Nov. 29 11:00AM - 11:10AM Room: S102D



Luke A. Ginocchio, MD, New York, NY (Presenter) Nothing to Disclose
Richard Duszak Jr., MD, Atlanta, GA (Abstract Co-Author) Nothing to Disclose
Andrew B. Rosenkrantz, MD, New York, NY (Abstract Co-Author) Nothing to Disclose
PURPOSE

To assess radiologists' performance in patient satisfaction using a public patient ratings website.

METHOD AND MATERIALS

Patient reviews were retrieved from www.RateMDs.com for all radiologists in the 297 U.S. cities with population ≥100,000. Each review included ratings of 1-5 in four categories (staff, punctuality, knowledge, helpfulness). For Medicare-participating radiologists, group practice size and years in practice were obtained from the Physician Compare database. Common words in reviews' free-text comments were assessed. Statistical analysis included Spearman's rank correlation coefficients, coefficients of variation (CV), ANOVA, and Kruskal-Wallis tests.
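As an illustration of the reproducibility analysis described above, the sketch below computes a per-radiologist coefficient of variation (CV = standard deviation / mean) across that radiologist's review scores, restricted to radiologists with ≥3 reviews as in the abstract. The radiologist identifiers and scores are hypothetical placeholders, not data from the study.

```python
from statistics import mean, stdev

# Hypothetical per-review average scores (1-5 scale) keyed by radiologist;
# the actual study data came from RateMDs.com reviews.
reviews_by_radiologist = {
    "rad_A": [5.0, 4.5, 5.0, 4.5],
    "rad_B": [5.0, 1.0, 4.0],
    "rad_C": [3.0, 2.0],  # fewer than 3 reviews: excluded from CV analysis
}

def coefficient_of_variation(scores):
    """CV as a percent: sample standard deviation divided by the mean."""
    return 100.0 * stdev(scores) / mean(scores)

# Restrict to radiologists with >=3 reviews, as in the abstract.
cvs = {rid: coefficient_of_variation(s)
       for rid, s in reviews_by_radiologist.items()
       if len(s) >= 3}

# Average CV across included radiologists (the abstract reports 27%).
average_cv = mean(cvs.values())
```

A lower average CV indicates that different patients rate the same radiologist more consistently; a CV near zero means near-identical scores across that radiologist's reviews.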

RESULTS

1,891 patient reviews for 1,259 unique radiologists were identified. For all four categories, the most common score was 5 (63-74%) and the second most common score was 1 (14-20%); scores of 2-4 were less frequent (2-12%). Scores for the four categories were all highly correlated with one another (r = 0.781 to 0.951). Only 3% of reviews had a substantial discrepancy between average scores for radiologist factors (knowledge/helpfulness) and office factors (staff/punctuality). For 106 radiologists receiving ≥3 patient reviews (572 reviews in total), the average CV was 27%, indicating moderate reproducibility among different patients' reviews for a given radiologist; in addition, ANOVA demonstrated that the radiologist accounted for 30% of total variation in the average score across patient reviews. The Northeast scored significantly lower than other U.S. regions in average scores for staff (p<0.001) and punctuality (p<0.001); scores for helpfulness and knowledge were similar across regions. Radiologists' group practice size and years since graduation showed no correlation with satisfaction scores (r = -0.140 to 0.021). Common words in free-text comments included "caring", "knowledgeable", and "professional" for positive reviews, and "rude", "pain", and "unprofessional" for negative reviews.

CONCLUSION

Radiologists overall performed well, though patients posting online reviews tended to have strongly positive or negative views. Scores across categories were highly correlated, suggesting a halo effect influencing patients’ global perceptions of radiologists.

CLINICAL RELEVANCE/APPLICATION

Given the observed halo effect, radiologists must recognize the importance of office/staff-related as well as radiologist-related factors in shaping patients' impressions and public ratings.