Health

When Hospitals Post Doctor Ratings, Who Benefits?

Doctors at Scripps Clinic Torrey Pines in La Jolla are among those whose ratings are posted online, July 20, 2016. (Megan Wood)

Scripps Health is so far the only San Diego County health system that uses patient reviews to provide online ratings of some of its doctors. They’re supposed to help patients pick physicians, but some suggest the ratings are a marketing ploy.

When Scripps Health started using patients’ reviews to give online ratings to hundreds of its doctors last August, the San Diego health system became the second in California to do so and one of a dozen nationally.

Today, about 50 systems score their doctors with one to five stars, with 16 launches since Jan. 1. The transparency trend is taking off.

It appears to be a bold move, one that could expose mediocre providers, hurt their bottom line and send savvy patients looking elsewhere for a doctor who is not so rude.

But for nearly all star-rated doctors nationally, the ratings are barely distinguishable from one another, raising questions about whether the expensive and time-consuming effort provides information that improves care, or is just a slick marketing ploy that makes all doctors look pretty darn good.

How it works

After office visits, a contracted survey company asks patients whether the doctor’s friendliness and courtesy were very good, good, fair, poor or very poor, and whether they would recommend the provider. Patients indicate how confident they were in their doctors, whether their doctors explained things understandably and whether the doctors spent enough time listening. Some systems even ask patients if they were kept waiting more than 15 minutes.

Responses dealing with patient experience — health systems choose which ones to include — are converted to a star rating for each doctor. Some organizations use their own conversion tools, while others use algorithms provided by the survey vendors.
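
How that conversion works varies by vendor and health system, and no single formula is spelled out here. As a purely illustrative sketch, assuming a simple linear mapping from the five answer choices to points and an average rounded to one decimal place (not Scripps' or any vendor's actual algorithm), the arithmetic might look like this:

```python
# Hypothetical illustration only: survey vendors use their own proprietary
# conversion algorithms, and health systems choose which questions count.

LIKERT_TO_POINTS = {
    "very good": 5, "good": 4, "fair": 3, "poor": 2, "very poor": 1,
}

def star_rating(responses):
    """Average pooled patient-experience answers into a 1-5 star score."""
    if not responses:
        return None  # no surveys returned, so no rating is posted
    points = [LIKERT_TO_POINTS[answer.lower()] for answer in responses]
    return round(sum(points) / len(points), 1)

# Example: 40 "very good", 8 "good" and 2 "fair" answers average out to 4.8 stars.
print(star_rating(["very good"] * 40 + ["good"] * 8 + ["fair"] * 2))
```

Under a simple mapping like this, even a doctor whose patients mostly answer "good" rather than "very good" still lands around four stars, which is consistent with how tightly the posted ratings cluster near the top.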

A look at the profiles of 406 physicians in the two participating San Diego County medical groups, Scripps Coastal and Scripps Clinic, revealed that 92.1 percent received between 4.7 and five stars, and even those scoring lower don't look all that bad.

Pediatrician Dr. David Herz got the lowest score, 4.1 stars, while dermatologist Dr. Catherine Chen got 4.2, gastroenterologist Dr. Warren Reidel got 4.4 and pediatrician Dr. Barry Goldberg got 4.5. An additional 22 doctors got 4.6.

These surveys don't measure a doctor's diagnostic skill or whether he or she prescribed effective treatment, which may matter more to some patients. The posted ratings also usually exclude how patients experienced other elements of their visits, such as staff courtesy, parking or time spent on hold. For most of these surveys, only the patient's direct interaction with the doctor is publicly scored, on the argument that other elements aren't under the doctor's control.

Scripps Coastal President Dr. Kevin Hirsch echoed the sentiment of officials with other health systems that have taken this step. He said the groups decided on the star ratings largely because they want their doctors to show up on Google search pages above listings from third-party review sites such as Healthgrades or Vitals.

These companies base their star scores on far fewer reviews — some of which may be from the physician's competitors or family members rather than real patients — and the reviewers tend to score doctors much more negatively.

Through their own survey vendors, health systems can collect feedback from thousands of actual patients a year, far more reviews than the third-party sites gather. Even with Herz's relatively poor score, he shows up with 43 reviews at the top of the Google search page, better than Healthgrades’ 3.4 stars with 10 reviews, Yelp's three-star rating with two reviews and Vitals' 3.5 stars from 17 reviews, Hirsch said.

"Horrible patient care"

The trend of health systems taking ratings into their own hands began with the University of Utah Health Care in 2012. Like most other systems since, Utah has taken the initiative much further than Scripps by posting patients’ anonymous written comments — from laudatory to scathing — which some say give far more meaningful information about what a visit with that doctor is like. Utah also scores aspects of the office visit beyond the time with the doctor in the exam room.

One patient said the doctor provided a "horrible experience." Another complained he waited “80 minutes … and he spent three minutes discussing my case. … It was a joke.”

Dr. Vivian Lee, the Utah system’s CEO, extols the benefits of the project. At a recent conference on health care data in Washington, D.C., she explained how doctors have improved, some with the help of "SWAT teams” that work not just with doctors but with nurses and receptionists.

“We had physician communication training and we even expanded our free valet parking for all our patients," Lee said.

Last year, "there were probably a half dozen health care systems implementing this type of star rating program, but today there are more than 50," said Andy Ibbotson, vice president and general manager of the National Research Corp., the second largest health care survey vendor in the nation. Nearly all of them, he said, publish patients' comments.

What's most valuable about these programs "are the comments,” Ibbotson said. “That's what gives you rich detail about the patient encounter and a feel for what it’s like to actually be a patient of that provider. … We’ve read some pretty scathing comments.”

Utah and many other systems also display physician star ratings for each question, while Scripps aggregates all responses into a single star rating. That can veil weaknesses patients may want to know about, such as how well the doctor listens.

Interpreting the scores

Dr. Greg Burke, chief patient experience officer for Pennsylvania-based Geisinger Health System, which went live with star ratings and comments for 1,300 clinicians in October, acknowledged the program is "like Lake Wobegon, where everybody is above average," except for the patients' comments that are posted underneath.

Burke insisted, however, that even though all doctors get at least four stars, the decimal points can reveal underperformers. "If I'm looking at a physician with a 4.2 star rating, I know that compared to others, that's probably at the bottom 10 percent in patient experience," he said. "They're performing not very well compared with their peers. You just look at the score differently: 4.0-4.3 stars is a D, and 4.4-4.5 is a C."
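
Read literally, Burke's rule of thumb amounts to a small lookup on the decimal portion of scores that all sit at four stars or above. A hypothetical sketch of that interpretation (the cutoffs above 4.5 are not spelled out in his description and are left open here):

```python
def burke_grade(stars):
    """Rough letter grade implied by Burke's description of Geisinger's scale."""
    if stars < 4.0:
        return "below the four-star floor"  # rare: nearly all doctors clear 4.0
    if stars <= 4.3:
        return "D"  # roughly the bottom 10 percent on patient experience
    if stars <= 4.5:
        return "C"
    return "B or better"  # Burke did not spell out cutoffs above 4.5

print(burke_grade(4.2))  # prints "D"
```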

Burke said the real value of Geisinger's program is how doctors absorb criticism during the run-up period, when they get to see how they fare compared with their peers. "They were aware this was coming, (and) that in and of itself made them more aware of their communication styles." Patients should feel they get their questions answered, "and they got feedback that wasn't the case."

Now, most physicians have improved their scores dramatically, he said.

"Obviously, the reason these hospitals are doing this is because of marketing. That goes without saying," said Dr. Ashish Jha, who has studied and written favorably about this trend. But star ratings alone, which show little variation, are not enough, said Jha, director of the Harvard Global Health Institute.

"You have to have that narrative, those comments," he said. "Health systems that do these star ratings but then shut off the comments, that's where I say no, no, no. You're not actually being transparent. Now it's become a pure marketing tool. What you're saying is you're not actually interested in sharing with the community what's going on."

Leah Binder, president and CEO of the Leapfrog Group, which rates hospitals based on quality and safety, thinks the trend "is a good sign of progress, albeit imperfect. It tells us that providers are putting a priority on patient perspectives. Ultimately patients will learn how to differentiate among the reviews that give an 'easy A' and those that tell it like it is."

An important element, however, is that these are real patients expressing their opinions, not competitors or someone's mother.

For doctors who say these ratings are worthless because "they don't capture how good I am at diagnosing disease, or how good I am in the operating room, I say, you're absolutely right,” said Dr. Ira Nash, senior vice president of Northwell Health of New York. “It's capturing something different. We're still in our infancy as an industry at being able to capture things like surgical skill or diagnostic acumen."

Doctors’ fears and concerns have mellowed because of all the advance work Northwell did before launching its star ratings, he said.

"We started providing feedback well in advance of the public posting because we wanted doctors to see what their scores were going to look like so they'd have opportunity to make changes," Nash said.

It helped him improve his interactions as well, because "doctors tend to listen to the first three words out of a patient's mouth and jump to a conclusion or a question. So I try to be more mindful of giving people more space to talk, to curb that impulse." Additionally, he "never stands in the presence of a patient. I'm always sitting to give them a sense I'm not in a rush to get out of the room."

Hirsch also said the scores have proved useful for raising physicians' awareness about how they're seen by their patients, especially for those who — prompted by lower scores — are forced to admit they have "a blind spot to how they were interacting with patients in the exam room."

"I will tell you the truth, you can't change the stripes on some tigers. And for some, scores go up and down in a continuous process," Hirsch said. Some have agreed to let higher scoring doctors shadow them to suggest ways to improve.

Of course, for some physicians, showing up with a higher number of stars seems trivial.

"I care so little about this I don't want to be found," said Dr. Paul Teirstein, a Scripps Clinic interventional cardiologist. "It's not discriminating much between providers … and I don't think it's very valuable."

Dr. Denise Brownlee of Scripps Coastal, who received 4.7 stars, said tersely, "I'm not interested. I'm retiring in two months." Several dozen other doctors did not respond to calls for their opinion on the rating system and some receptionists said they were not aware their physicians were being scored.

Dozens of other star-rated doctors, including those who received the lowest scores, did not respond to requests for comment.

Care about it or not, national experts say such public scores are not going away.

Even if the driver is marketing, Jha said, that’s “not necessarily a bad thing if it leads physicians to look at why they're performing badly and make changes, to communicate better and be more respectful. Maybe they get a coach or have shadowing. We want doctors to clean up their act, and if this is what's motivating them, great."