Don't Yelp Your Doctor. Study Finds Ratings Are All Wrong.
Online ratings don’t indicate medical skill, analysis finds
Review sites say friendly staff and kind doctors matter too
If you’re looking for the best doctor, online ratings are unlikely to be much help.
That’s the determination of researchers at Cedars-Sinai Medical Center in Los Angeles, who compared reviews of 78 of the medical center’s specialists on five popular ratings sites with a set of internal quality measures and found there was essentially no correlation.
The results suggest that in a world awash in online feedback for seemingly every consumer choice, reliable, easy-to-interpret information on how good doctors are at their jobs remains scarce.
“Patients are using these online ratings probably more often than they should,” Timothy Daskivich, a professor of urology at the medical center and an author of the study, said in an interview.
Bradley Anhouse, a 28-year-old who works in marketing, said he looks at online reviews more to gauge how friendly doctors and their staffs are, and how quickly he'll be seen, than to identify the best clinician. He said he knows he has to read carefully what people write, not just the ratings, when he uses online reviews.
“I do try and stay away from five-star reviews and I try to stay away from one-star reviews,” he said. As he’s reading, he looks for things like “how much time the doctor spends with you, whether they’re on time, and overall friendliness,” he said.
Service and Style
Brennan Spiegel, a gastroenterologist and co-author of the study, said that may be the right way to think about reviews -- as gauges of things the patient can observe.
“It may be that these ratings are a good measure of the front-office service or the interpersonal style of the physician,” said Spiegel, a professor and director of health services research at Cedars-Sinai. “We’re not saying that there’s no value to these online ratings -- we’re saying don’t confuse those ratings in any way, shape or form with the actual technical skill.”
The study, published online on Friday in the Journal of the American Medical Informatics Association, compared measures developed by Cedars-Sinai with users’ ratings on five sites: Healthgrades, Yelp, Vitals, RateMDs and UCompareHealthCare. The internal performance metrics include reviews from doctors’ colleagues and administrators, how often patients are readmitted and how long they remain in the hospital, and adherence to practice guidelines.
According to the study, there was little correlation between the doctors’ performance scores and how their patients assessed them on the websites.
Martin Makary, a professor of surgery and health policy at Johns Hopkins who studies health-care quality and safety, said patient satisfaction is a key gauge of doctor quality. Metrics like readmission rates and mortality are useful but don’t fully capture a doctor’s skill, he said.
“Knowing the patient’s experience, the experience of past patients who’ve seen the doc, is one valuable component to selecting a good doctor,” Makary said. “The inferences that we can make from our traditional quality measures are very limited. We don’t have a lot to go on.”
Gina Larson, marketing director at Vitals, said the site is looking at ways to add outcome metrics alongside its patient ratings. She said star ratings are a good indicator of the doctor-patient relationship, which can be important to ensuring patients follow their doctors’ advice.
Healthgrades said patients typically use its ratings alongside other information and referrals to pick a doctor. The site incorporates data on the number of procedures a doctor performs into its “experience match” score, and says it’s working on ways to display the information more directly. Healthgrades also shows data on the quality of the hospitals that doctors are affiliated with.
“People don’t make a decision based on patient ratings,” said Andrea Pearson, the site’s chief marketing officer. “They make it based on patient ratings, experience and quality outcomes.”
Yelp Inc. is considering adding quality measures for doctors, according to spokeswoman Rachel Walker Youngblade. Its site currently lists quality indicators on health-care facilities as part of a partnership with ProPublica, and it recently added data on maternity care at California hospitals. She said research has shown that for hospitals, Yelp’s ratings correlate with other quality measures.
Nick Fiore, a 23-year-old health-policy researcher, said he turned to online doctor-finding sites because his insurer’s physician list was inaccurate. He figured most of the doctors in the specialty he was searching, nephrology, were well-qualified, so he was more interested in how he’d be treated, he said.
“I not only want to have a doctor that I feel like is warm, or is going to be good at their job,” he said. “I want to know, how long did you wait to get an appointment? What sort of experience did you have with billing?”
David Vivero said a big question typically not answered by ratings is cost. He co-founded a site called Amino, which helps patients find doctors based on how experienced the physicians are and provides information on insurance coverage and costs. It also shows some outcome information.
“We launched Amino principally because there was so little actionable information about physicians,” Vivero said. “Quality is really, really hard to get right.”
Spiegel and Daskivich say their study provides few answers about how to pick a doctor.
“Not to be coy about it,” Spiegel said. “We’re definitely saying, if you’re interested in quality, technical quality and clinical skills, do not rely on the star ratings. That much is clear.”