ProPublica

Journalism in the Public Interest


Hospital Ratings Fatigue Part II: Reporting on the Report Cards

As the top ratings groups get graded themselves, the nation’s foremost accrediting commission nearly doubles the number of hospitals named “top performers.” What does it all mean?


When I wrote last week about how journalists should use caution when covering hospital rankings, ratings and report cards, I didn’t appreciate how good my timing was.

Days later, the hospital accrediting group The Joint Commission labeled 1,099 hospitals as “top performers” — close to double the number from the year before.

“These results are more than numbers,” the group’s president and CEO Mark Chassin said, according to a story by HealthLeaders Media.

"They mean better care for the millions of Americans who require surgery, or who are hospitalized with heart attack, heart failure, stroke, or pneumonia. They mean better care for children coping with asthma, better inpatient psychiatric treatment, increased rates of immunization, and better prevention of dangerous blood clots known as venous thromboembolism."

And then Monday, the Healthcare Association of New York State issued its own report card on hospital ratings, measuring their utility and finding many of them deficient. Among those that didn’t make the grade were some of the most prominent ratings: U.S. News & World Report, the Leapfrog Group, HealthGrades and Consumer Reports. Among those that did: The Joint Commission.

“We had been hearing more and more from members their general frustration of all the different report cards,” said Kathleen Ciccone, director of HANYS’ Quality Institute, according to a Kaiser Health News story. “It’s so time consuming for them to be able to respond to the reports, to be able to see what’s useful about them. They’re really looking for some guidance on how to use the information.”

The Kaiser story notes that, “By at least one criterion, the HANYS report on report cards falls short of its own standards: HANYS did not give the ratings groups an opportunity to preview the report before publication.”

Longtime health reporter Trudy Lieberman wrote on her Center for Advancing Health blog that she remains skeptical the report cards amount to much:

The take-aways: people are not about to fish around for a hospital when they need a hospital procedure, especially one that is not elective, and they want to be near family and friends. Where does that leave hospital ratings? When it comes to usefulness, ratings are probably no more helpful than they were 20 years ago when marketplace consumerism attempted to gain a toehold in health care.

Lieberman wrote that the best advice she's heard about hospital care came from Don Berwick, founder of the Institute for Healthcare Improvement and past administrator of the Centers for Medicare and Medicaid Services: When you go to the hospital, take someone with you and get out quickly.

I continue to believe that reporters should tread with caution. As report cards proliferate, we may just throw up our hands and cry uncle.

Lisa McGiffert

Nov. 7, 2013, 5:10 p.m.

It is ironic that the only report card HANYS approved of was the one based on measures that almost all hospitals perform well on. The Joint Commission “grades” hospitals on process measures that tell how often hospitals follow best practices to prevent complications and infections and to respond to patients’ needs. Hospitals have been reporting these measures for many years, and for several years nearly all hospitals have followed these practices more than 90% of the time (you can check it out on hospitalcompare.gov). CMS is retiring some of these measures because most hospitals finally “get it” and are doing them as a matter of routine. It is a good thing that hospitals are following best practices, but they are not a proxy for best outcomes, and consumers need to be aware of that. For example, hospitals are complex environments: all of the right practices can be followed for a surgery patient, but one doctor’s contaminated hand in the patient’s wound can undo that and cause an infection. Further, why does the Joint Commission only show the TOP performers? Consumers want to know which hospitals they should avoid, too. The public should demand report cards that are based on outcome measures like infection rates and mortality rates, and on measures that show the real differences among hospitals. They are not all alike.

Kerry O'Connell

Nov. 7, 2013, 9:57 p.m.

I still don’t believe that the bar has been raised.

Healthcare of old had a bar but no numbers on the contestants’ jerseys, no starting line to jump from, no tape measure on the vertical post, and no scoreboard. Winners were determined by the roar of the crowd.

Today some of the contestants have numbers, but there is still no starting line, four different tape measures on the vertical posts, and Jumbotron scoreboards that the fans can’t understand. Winners are determined by the size of their advertising panels.

One great way to audit hospitals would be to set objective criteria for the quality of medical records and then review a random sample of the records from a given hospital. Objective criteria might include an evidence-based plan for diagnosis and treatment, a clear listing of all drugs being taken, and a high-quality history. A study from a few years ago (Dunley, 2008) showed that hospitals’ average score on such criteria is about 60%, and that those scoring below 50% had a 40% higher mortality rate than those scoring greater than 75%. Ratings from the JC are meaningless as far as I am concerned.

Follow the ever-increasing, lucrative fees that hospitals pay the JCAHO, and the scales fall from one’s eyes: the results take on a clarity that is nearly impossible to mistake.
