Our “Better Health Care” newsletter focuses this month on how hospitals are rated for patient safety. Among the ways it suggests consumers can measure hospital performance is a new guide from Consumers Union, the organization that publishes Consumer Reports.
As reported last week by Reuters.com, many of the 2,463 hospitals included in the Consumers Union (CU) effort are nationally renowned facilities, and some of them earned less-than-stellar ratings. “The Cleveland Clinic,” reported Reuters, “some Mayo Clinic hospitals in Minnesota, and Johns Hopkins Hospital in Baltimore, for instance, rated no better than midway between ‘better’ and ‘worse’ on the CU scale, worse than many small hospitals.”
The problem, as Patrick Malone’s newsletter notes, is that any rating organization has access to only some of the data, and many such organizations have an agenda; that’s why it’s best to consult several different measures and to understand the biases and variables each brings to the task. Reuters confirms that CU had only limited access to data, so its ratings underscore the difficulty patients face in finding objective information about the quality of care at any given facility.
Last week’s report was the first of its kind for CU. It included hospitals in every state, and it measured only the quality of their surgical care. It analyzed the percentage of Medicare patients who died in the hospital during or after their surgery, and the percentage who stayed in the hospital longer than expected based on standards of care for their condition.
The ratings are based on Medicare claims and clinical records for 86 kinds of surgery, including back operations, knee and hip replacements and angioplasty. Adjustments were made to account for the fact that some hospitals treat older or sicker patients. Data on patients who were transferred from other hospitals were excluded because, according to CU, they are often difficult cases that shouldn’t be counted against the receiving hospital.
Specific complications such as infections, heart attacks, strokes, or other post-surgery problems aren’t teased out, but the researchers say the length-of-stay data serve as a proxy for those issues.
“Some of the findings are counterintuitive,” says Reuters. “Many teaching hospitals, widely regarded as pinnacles of excellence and usually found at the top of rankings like those of U.S. News & World Report, fell in the middle of the pack.” (See our blog about the limitations of the U.S. News & World Report annual rankings.)
As Dr. Marty Makary, a surgeon at Johns Hopkins Hospital and author of “Unaccountable: What Hospitals Won’t Tell You and How Transparency Can Revolutionize Health Care,” told Reuters, “For a complex procedure you’re probably better off at a well-known academic hospital, but for many common operations less-known, smaller hospitals have mastered the procedures and may do even better [with post-surgical care].”
Some representatives of hospitals with disappointing ratings questioned whether the methodology reflected the full picture of surgical quality, saying that outcome data (how well patients undergoing a certain procedure fare over the longer term) are a better metric.
In the CU rating, several urban hospitals did well despite serving many poorer, sicker patients, whose outcomes usually skew figures downward. Rural hospitals generally outperformed urban facilities, and some that are barely known outside their own regions were stars.
Reuters observed that “hospital choice matters more for some procedures than others. Length of stay for hip and knee replacements and back surgery varied widely, for instance, while hospitals’ scores for colon surgery and hysterectomy were more similar to one another.”
There’s a lot of hospital data out there you’re unable to see. According to Reuters, the American College of Surgeons collects data on things like the rate of infections at the surgical site and urinary tract infections through its National Surgical Quality Improvement Program. But it doesn’t release the data publicly because hospitals give the information to the program on a confidential basis. Still, roughly 1 in 5 of the program’s approximately 500 participating hospitals voluntarily report some data to the federal Centers for Medicare & Medicaid Services.
Some hospitals, including the Cleveland Clinic, make information about outcomes available on their websites. CU advocates for such greater “medical transparency,” and Dr. Peter Pronovost of Johns Hopkins supports requiring hospitals to report patient outcomes just as the Securities and Exchange Commission requires public companies to report financial data.