Kip Sullivan: Health care report card was flawed

  • Article by: KIP SULLIVAN
  • Updated: November 28, 2010 - 6:32 PM

Neither providers nor patients can derive useful information from it.

Minnesotans should be alarmed by a report recently issued by the Minnesota Department of Health (MDH). The Minnesota Health Care Quality Report claims to measure the quality of care delivered by Minnesota's clinics and hospitals, but its methodology is so crude that it is impossible to say what is being measured.

The most important methodological defect is that many of the quality measures the department uses are heavily influenced by factors outside of clinic and hospital control, and MDH failed to adjust scores to reflect differences in those factors.

The most important factors outside of provider control that can distort quality-of-care measures are patient health, income and lifestyle. The failure to control for these factors means that providers who treat sicker and poorer patients, or patients with unhealthy habits, will get lower grades even though they may provide better care than higher-scoring providers who treat healthier, wealthier patients.

For example, MDH requires clinics to report the percent of their patients whose blood pressure is under specified levels. Numerous factors outside the control of doctors (heredity, age, diet, exercise, access to medication, stress at home or work, etc.) affect blood pressure. Because the department didn't control for even the most obvious of these factors, it's impossible to know what to make of its blood pressure scores.

To take a real example from the report, consider the diabetes scores the department gave to the Allina clinic in Faribault and the Mayo Health System clinic in Cannon Falls. Thirty-seven percent of the Allina clinic's diabetics met all five standards selected by MDH, while only 16 percent of the Mayo patients did. The five standards are: reducing blood sugar, blood pressure and cholesterol to below specified levels; getting patients to quit smoking; and getting patients over 40 to take aspirin daily.

Why is there more than a twofold difference in this "quality of care" measure between two reputable clinics? It could be that Mayo's Cannon Falls clinic had more smokers who simply couldn't kick the habit, no matter how many times their doctors advised them to quit. If that were the reason, would it be useful or fair to say Mayo's diabetes care is inferior to Allina's?

MDH has offered conflicting statements about whether its scores are adjusted to reflect differences in patients. On the one hand, the report itself states repeatedly that the scores are "risk adjusted." It turns out the department did use a risk-adjustment method, but a very crude one with no support in the scientific literature: it categorized patients by type of insurance (private, Medicare, or state public programs/uninsured).

On the other hand, Health Commissioner Sanne Magnan told the Star Tribune she agrees that the report failed to adjust scores for factors outside of provider control. She then argued that the report card was at least reliable enough that patients should ask a clinic with low scores why its scores are low. But precisely because the department's report card is so inaccurate, clinics are not going to know why it gave them a low score or, for that matter, a high score.

I have a better idea. The public should ask Magnan: "Why did you release such an inaccurate report card?"

Ultimately, what's at stake here is patient health. An inaccurate report card can damage patients in at least three ways: (1) by steering patients to providers who score well on the report card but who in fact are inferior; (2) by giving providers incentives to avoid sicker patients so that their grades aren't dragged down unfairly; and (3) by giving providers an incentive to shift resources away from patients whose care is not being measured and toward patients whose care is being measured.

All three of these threats to patients would be eliminated if MDH would simply share its data privately with clinics and hospitals rather than publishing inaccurate scores. The scientific literature reports numerous examples of doctors using privately shared data (like the data the department has collected) to improve quality. Sharing the data privately, and giving clinics and hospitals grants to investigate why scores vary, would be a much safer way to improve the quality of medical care in Minnesota.

Kip Sullivan is a member of the steering committee of Physicians for a National Health Program, Minnesota chapter.
