Surgical Quality Program Has No Impact on Outcomes


A study published in JAMA by a team from the University of Michigan Medical School shows no difference in surgical safety between 263 hospitals taking part in a major national quality effort and 526 similar hospitals that weren’t involved.

The study analyzed data from 1,226,000 seniors enrolled in Medicare who had one of 11 major operations at those hospitals over a decade.

The initiative, the American College of Surgeons National Surgical Quality Improvement Program, or ACS-NSQIP, has operated for roughly a decade. Trained nurses at participating hospitals carefully record data about every operation and send it to a secure central database. The ACS analyzes the data from all ACS-NSQIP hospitals and shares quality reports, allowing hospitals and doctors to see how their overall performance stacks up against others.

But this quality reporting alone, the authors find, is not enough to accelerate the pace of improvement in surgical safety, nor to generate cost savings.

No matter how the authors sliced the data, the result was the same: there was no improvement in any of the four measures at ACS-NSQIP hospitals relative to similar non-ACS-NSQIP hospitals. For instance, before ACS-NSQIP hospitals entered the program, 4.9 percent of their senior patients having these 11 operations died within 30 days of their operation, compared with 5 percent of those at non-participating hospitals. About one in 10 patients suffered a complication, about 13 percent returned to the hospital within 30 days, and 0.5 percent needed a second operation; rates were slightly higher at the non-ACS-NSQIP hospitals analyzed in the study.
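The four measures described above can be illustrated with a minimal sketch. All records, field names, and dates here are hypothetical, invented for illustration; they are not from the study’s Medicare data:

```python
from datetime import date

# Hypothetical patient records: an operation date plus flags/dates for
# the four outcomes tracked in the study (one event per patient here).
patients = [
    {"op": date(2010, 1, 5), "death": date(2010, 1, 20),
     "complication": True, "readmit": None, "reop": False},
    {"op": date(2010, 2, 1), "death": None,
     "complication": False, "readmit": date(2010, 2, 10), "reop": False},
    {"op": date(2010, 3, 3), "death": None,
     "complication": False, "readmit": None, "reop": True},
    {"op": date(2010, 4, 8), "death": None,
     "complication": False, "readmit": None, "reop": False},
]

def within_30_days(op, event):
    """True if the event date falls within 30 days of the operation."""
    return event is not None and 0 <= (event - op).days <= 30

n = len(patients)
mortality_30d = sum(within_30_days(p["op"], p["death"]) for p in patients) / n
readmission_30d = sum(within_30_days(p["op"], p["readmit"]) for p in patients) / n
complication_rate = sum(p["complication"] for p in patients) / n
reoperation_rate = sum(p["reop"] for p in patients) / n

# With one event of each kind among four patients, every rate is 0.25.
print(mortality_30d, complication_rate, readmission_30d, reoperation_rate)
```

In the actual study these rates would be computed per hospital, per year, over hundreds of thousands of claims records.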

After three years of participation, the rates of all four measures had dropped at ACS-NSQIP hospitals – but they had also dropped at the other hospitals. When the researchers adjusted for improvement across all hospitals over time, there was no statistical difference between patients treated at hospitals taking part in the ACS-NSQIP and those treated at comparison hospitals.
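The adjusted comparison described above is, in spirit, a difference-in-differences: the change at participating hospitals is measured against the change at controls over the same period. A minimal sketch, using illustrative mortality rates (not figures from the study):

```python
def difference_in_differences(pre_treated, post_treated,
                              pre_control, post_control):
    """Change at treated hospitals minus change at control hospitals.

    A result near zero means the treated group improved no faster
    than the comparison group.
    """
    return (post_treated - pre_treated) - (post_control - pre_control)

# Illustrative percentages only: both groups improve by the same
# 0.7 points, so the net participation effect is essentially zero.
effect = difference_in_differences(pre_treated=4.9, post_treated=4.2,
                                   pre_control=5.0, post_control=4.3)
print(round(effect, 2))
```

The study’s finding amounts to this quantity being statistically indistinguishable from zero for all four outcomes.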

The cost of the patients’ care, after adjustment, was also similar, including payments for the initial hospital stay and payments for additional stays and extraordinary “outlier” cases.

The analysis is the first to use a control group of hospitals to study the impact of ACS-NSQIP participation; the team matched each ACS-NSQIP hospital with not one but two control hospitals. Patients treated at the two types of hospitals were generally similar, though ACS-NSQIP hospitals were larger, performed more operations, and were more likely to be nonprofits or teaching hospitals.

The 11 types of operations analyzed were esophagectomy, pancreatic resection, colon resection, gastrectomy, liver resection, ventral hernia repair, cholecystectomy, appendectomy, abdominal aortic aneurysm repair, lower extremity bypass, and carotid endarterectomy.

The lack of an effect from ACS-NSQIP participation could have many explanations, the authors said. For example, hospitals may not have used the reports to improve care, or quality improvement efforts by hospitals using their data may have fallen short of affecting the four outcomes the study evaluated. Many hospitals may lack the infrastructure needed to develop effective strategies to improve care. In addition, outside factors, such as reimbursement-driven efforts to improve safety, improvements in care across all hospitals, or selective referral of patients to high-volume hospitals, could also have played a role in improving safety at all hospitals.

“Although ACS-NSQIP hospitals are improving over time, so are other non-participating hospitals,” said the study’s lead author, Nicholas Osborne, M.D. “Our study suggests that the ACS-NSQIP is a good start, but that reporting data back to hospitals is not enough. The ‘drilling down’ that is needed to improve quality using these reports is better suited for regional collaboratives.”

