Reducing Unneeded Clinical Variation in Sepsis and Heart Failure Care to Improve Outcomes and Reduce Cost: A Collaborative Engagement with Hospitalists in a Multistate System

Journal of Hospital Medicine. 2019 September;14(9):541-546. Published online first June 11, 2019. doi:10.12788/jhm.3220

OBJECTIVE: To (1) measure hospitalist care for sepsis and heart failure patients using online simulated patients, (2) improve quality and reduce cost through customized feedback, and (3) compare patient-level outcomes between project participants and nonparticipants.
METHODS: We conducted a prospective, quasi-controlled cohort study of hospitalists in eight hospitals matched with comparator hospitalists in six nonparticipating hospitals across the AdventHealth system. We provided measurement and feedback to participants using Clinical Performance and Value (CPV) vignettes to measure and track quality improvement. We then compared length of stay (LOS) and cost results between the two groups.
RESULTS: 107 providers participated in the study. Over two years, participants improved CPV scores by nearly 8% (P < .001), with improvements in utilization of the three-hour sepsis bundle (46.0% vs 57.7%; P = .034) and ordering essential medical treatment elements for heart failure (58.2% vs 72.1%; P = .038). In study year one, average LOS observed/expected (O/E) rates dropped by 8% for participants, compared to 2.5% in the comparator group, equating to an additional 570 hospital days saved among project participants. In study year two, cost O/E rates improved from 1.16 to 0.98 for participants versus 1.14 to 1.01 in the comparator group. Based on these improvements, we calculated total cost savings of $6.2 million among study participants, with $3.8 million linked to system-wide improvements and an additional $2.4 million in savings attributable to this project.
CONCLUSIONS: CPV case simulation-based measurement and feedback helped drive improvements in evidence-based care that translated into lower costs and LOS, above and beyond other concurrent improvements at AdventHealth.

© 2019 Society of Hospital Medicine

Quasi-experimental Design

We used AdventHealth hospitals not participating in AQQP as a quasi-experimental control group. This allowed us to account for concurrent secular effects, such as new order sets and other system-wide training, that could also have improved practice and outcomes during the study period.

Study Objectives and Approach

The explicit goals of AQQP were to (1) measure how sepsis and heart failure patients are cared for across AdventHealth using Clinical Performance and Value (CPV) case simulations, (2) provide a forum for hospitalists to discuss clinical variation, and (3) reduce unneeded variation to improve quality and reduce cost. QURE developed 12 CPV simulated patient cases (six sepsis and six heart failure cases) with case-specific, evidence-based scoring criteria tied to national and AdventHealth evidence-based guidelines. AdventHealth order sets were embedded in the cases and accessible to participants as they cared for their simulated patients.

CPV vignettes are simulated patient cases administered online, and have been validated as an accurate and responsive measure of clinical decision-making in both ambulatory11-13 and inpatient settings.14,15 Cases take 20-30 minutes each to complete and simulate a typical clinical encounter: taking the medical history, performing a physical examination, ordering tests, making the diagnosis, implementing initial treatment, and outlining a follow-up plan. Each case has predefined, evidence-based scoring criteria for each care domain. Cases and scoring criteria were reviewed by AdventHealth hospitalist program leaders and physician leaders in OCE. Provider responses were double-scored by trained physician abstractors. Scores range from 0%-100%, with higher scores reflecting greater alignment with best practice recommendations.
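The scoring logic described above can be sketched as a percent-of-criteria-met calculation across the encounter's care domains (history, examination, tests, diagnosis, treatment, follow-up). This is a simplified, hypothetical model for illustration only; the actual CPV criteria, their counts, and any domain weighting belong to the case developers and are not specified in the text:

```python
# Simplified sketch of CPV-style vignette scoring: percent of
# evidence-based criteria met, averaged across care domains.
# Domain names mirror the encounter steps described in the text;
# the criteria counts below are hypothetical.

def domain_score(met, total):
    """Percent of predefined criteria met within one care domain."""
    return 100.0 * met / total

def cpv_score(domains):
    """Unweighted mean of domain scores, on a 0-100 scale."""
    scores = [domain_score(met, total) for met, total in domains.values()]
    return sum(scores) / len(scores)

# Hypothetical result for one provider on one case:
case_results = {
    "history":   (4, 5),
    "exam":      (3, 4),
    "tests":     (5, 8),
    "diagnosis": (1, 1),
    "treatment": (6, 9),
    "follow-up": (2, 3),
}

print(round(cpv_score(case_results), 1))  # overall score for this case
```

Under this model, a higher score simply means a larger share of the predefined, evidence-based actions were taken, which matches the text's statement that higher scores reflect greater alignment with best-practice recommendations.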

In each round of the project, AQQP participants completed two CPV cases, received personalized online feedback reports on their care decisions, and met (at the various sites and via web conference) for a facilitated group discussion on areas of high group variation. The personal feedback reports included the participant’s case score compared to the group average, a list of high-priority personalized improvement opportunities, a summary of the cost of unneeded care items, and links to relevant references. The group discussions focused on six items of high variation. Six total rounds of CPV measurement and feedback were held, one every four months.

At the study’s conclusion, we administered a brief satisfaction survey, asking providers to rate various aspects of the project on a five-point Likert scale.

Data

The study used two primary data sources: (1) care decisions made in the CPV simulated cases and (2) patient-level utilization data from Premier Inc.'s QualityAdvisor™ (QA) data system. QA integrates quality, safety, and financial data from AdventHealth's electronic medical record, claims data, charge master, and other resources. QA also calculates expected performance for critical measures, including cost per case and length of stay (LOS), using a proprietary algorithm based on DRG classification, severity of illness, risk of mortality, and other patient risk factors. We pulled patient-level observed and expected data for AQQP-qualifying physicians, defined as physicians participating in a majority of CPV measurement rounds. Of the 107 hospitalists who participated, six did not complete enough CPV rounds and 22 left AdventHealth, so they could not be included in a patient-level impact analysis. Departing providers were replaced with 21 new hospitalists who were enrolled in the study and included in the CPV analysis but who lacked patient-level data from before AQQP enrollment. Overall, 58 providers met the qualifying criteria for inclusion in the impact analysis. We compared their performance to a group of 96 hospitalists at facilities that were not participating in the project. Comparator facilities were selected on quantitative measures of size and demographics matching those of the AQQP facilities, ensuring that both sets of hospitals (comparator and AQQP) exhibited similar levels of engagement with AdventHealth quality activities, such as quality dashboard performance and order set usage. Baseline patient-level cost and LOS data covered October 2015 to June 2016 and were re-measured annually throughout the project, from July 2016 to June 2018.
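The observed/expected (O/E) framing used in the impact analysis reduces to simple arithmetic: total observed utilization (LOS or cost) divided by the risk-adjusted expected total. A minimal sketch follows; the LOS values are hypothetical, since in practice the expected values come from QualityAdvisor's proprietary risk-adjustment model:

```python
# Illustrative O/E (observed/expected) ratio, as used for LOS and
# cost in the impact analysis. Expected values in the study come from
# Premier QualityAdvisor's proprietary algorithm; the numbers here
# are made up for demonstration.

def oe_ratio(observed, expected):
    """Ratio of total observed to total expected utilization.
    O/E > 1 means utilization exceeded the risk-adjusted expectation."""
    return sum(observed) / sum(expected)

# Hypothetical LOS in days for a handful of discharges
observed_los = [4.0, 6.5, 3.0, 8.0]
expected_los = [4.5, 5.0, 3.5, 7.0]

baseline = oe_ratio(observed_los, expected_los)  # > 1: stays longer than expected
# An 8% relative drop in LOS O/E, as reported for year-one participants:
year_one = baseline * (1 - 0.08)

print(round(baseline, 3), round(year_one, 3))
```

Because expected values are risk-adjusted per case, comparing changes in O/E between participants and comparators (rather than raw LOS or cost) is what lets the study attribute the residual improvement to the project rather than to differences in patient mix.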
