Reducing Unneeded Clinical Variation in Sepsis and Heart Failure Care to Improve Outcomes and Reduce Cost: A Collaborative Engagement with Hospitalists in a Multistate System
OBJECTIVE: To (1) measure hospitalist care for sepsis and heart failure patients using online simulated patients, (2) improve quality and reduce cost through customized feedback, and (3) compare patient-level outcomes between project participants and nonparticipants.
METHODS: We conducted a prospective, quasi-controlled cohort study of hospitalists in eight hospitals matched with comparator hospitalists in six nonparticipating hospitals across the AdventHealth system. We provided serial measurement and feedback to participants using Clinical Performance and Value (CPV) vignettes to track quality improvement, and we then compared length of stay (LOS) and cost results between the two groups.
RESULTS: A total of 107 providers participated in the study. Over two years, participants improved CPV scores by nearly 8% (P < .001), with improvements in utilization of the three-hour sepsis bundle (46.0% vs 57.7%; P = .034) and in ordering essential medical treatment elements for heart failure (58.2% vs 72.1%; P = .038). In study year one, average LOS observed/expected (O/E) rates dropped by 8% for participants, compared with 2.5% in the comparator group, equating to an additional 570 hospital days saved among project participants. In study year two, cost O/E rates improved from 1.16 to 0.98 for participants versus 1.14 to 1.01 in the comparator group. Based on these improvements, we calculated total cost savings of $6.2 million among study participants, with $3.8 million linked to system-wide improvements and an additional $2.4 million attributable to this project.
CONCLUSIONS: CPV case simulation-based measurement and feedback helped drive improvements in evidence-based care that translated into lower costs and shorter LOS, above and beyond other improvements at AdventHealth.
© 2019 Society of Hospital Medicine
Provider Satisfaction
At the project conclusion, we administered a brief survey in which participants rated aspects of the project on a five-point Likert scale (with five being the highest); 24 responded. The mean ratings of the relevance of the project to their practice and of the overall quality of the material were 4.5 and 4.2, respectively. Providers found the individual feedback reports (3.9) slightly more helpful than the webcast group discussions (3.7; Appendix Table 2).
DISCUSSION
As health systems expand, standardizing clinical practice within a system has the potential to enhance patient care and lower costs. However, achieving these goals is challenging when providers are dispersed across geographically separated sites and clinical decision-making is difficult to measure in a standardized way.16,17 We brought together over 100 physicians and APPs from eight hospitals of varying sizes in five states to prospectively determine whether we could improve care using a standardized measurement and feedback system. At baseline, we found that care varied dramatically among providers in both diagnostic accuracy and treatment, domains that directly relate to care quality and outcomes.4 After serial measurement and feedback, we saw reductions in unnecessary testing, more guideline-based treatment decisions, and better discharge planning in the clinical vignettes.
We confirmed that changes in CPV-measured practice translated into lower costs and shorter LOS at the patient level. We further validated the improvements through a quasi-experimental design that compared these changes with those at nonparticipating AdventHealth facilities. We saw greater cost reductions and decreases in LOS in the simulation-based measurement and feedback cohort, with the largest impact occurring early in the project. The overall savings to the system attributable specifically to the AQQP approach are estimated at $2.4 million.
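The split of participant savings into a system-wide component and a project-attributable component can be sketched, in general terms, as a difference-in-differences calculation on the cost O/E rates. The formulation below is illustrative only and does not reproduce the study's exact actuarial method; all symbols are assumptions introduced here for clarity.

\[
\Delta_{\text{part}} = (O/E)_{\text{part}}^{\text{baseline}} - (O/E)_{\text{part}}^{\text{final}},
\qquad
\Delta_{\text{comp}} = (O/E)_{\text{comp}}^{\text{baseline}} - (O/E)_{\text{comp}}^{\text{final}}
\]
\[
\text{Savings}_{\text{system-wide}} \approx \Delta_{\text{comp}} \times \text{Expected cost}_{\text{part}},
\qquad
\text{Savings}_{\text{project}} \approx \left(\Delta_{\text{part}} - \Delta_{\text{comp}}\right) \times \text{Expected cost}_{\text{part}}
\]

Under this framing, the portion of participant improvement that matches the comparator group's improvement is treated as secular (system-wide), and only the excess improvement is credited to the project.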
One advantage of the online case simulation approach is the ability to bring geographically remote sites together in a shared quality-of-care discussion. The interventions specifically sought to remove barriers between facilities. For example, individual feedback reports allowed providers to see how they compared with providers at other AdventHealth facilities, and webcast results discussions enabled providers across facilities to discuss specific care decisions.
There were several limitations to the study. While the quasi-experimental design allowed us to make informative comparisons between AQQP-participating and nonparticipating facilities, the assignments were not random, and participants were generally from higher-performing hospital medicine groups. The determination of secular versus CPV-related improvement is confounded by other system improvement initiatives that may have affected cost and LOS results. This concern is mitigated by the observation that facilities that opted to participate performed better at baseline in risk-adjusted LOS but slightly worse in cost per case, indicating that baseline differences were not dramatic. While both groups improved over time, the QURE measurement and feedback approach led to larger and more rapid gains than those seen in the comparator group; nonetheless, we could not exclude the possibility that project participation at the site level was biased toward groups already disposed to performance improvement. In addition, our patient-level data analysis was limited to the metrics available and did not allow us to directly compare patient-level performance across the full range of clinically relevant CPV measures that showed improvement. Finally, our inpatient cost-per-case analysis showed significant savings for the system but did not include all potentially favorable economic impacts, such as lower follow-up care costs for patients, more accurate reimbursement through better coding, or fewer lost days of productivity.
With continued consolidation in healthcare and health systems spanning multiple geographies, new tools are needed to support standardized, evidence-based care across sites. This standardization is especially important, both clinically and financially, for high-volume, high-cost diseases such as sepsis and heart failure. However, changing practice cannot happen without collaborative engagement with providers. Standardized patient vignettes offer a systematic way to measure care and provide feedback that engages providers and is particularly well suited to large systems and common clinical conditions. This real-world analysis shows that an approach that standardizes care and lowers costs may be particularly helpful for large systems that need to bring disparate sites together as they move concurrently toward value-based payment.