
Winning the Post-Go-Live Burnout Battle: Why Your EHR Strategy Needs a Human Component


A Stability Edge Executive Brief

Mark C. Medlin  |  Stability Edge Consulting  |  stabilityedge.com 

 

 

THE PROBLEM

73% of organizations land below average on clinician EHR satisfaction after go-live. Most stabilization efforts fail not because the technology breaks, but because the people using it do.

WHY NOW

Clinician burnout, moral injury, and intent-to-leave peak in the post-go-live window. The organizations that act now recover faster. Those that wait pay in turnover and safety events.

WHAT LEADERS MUST DO

1. Name the resistance pattern (not just "pushback")

2. Establish decision rights and closed-loop communication

3. Attack documentation burden directly

4. Give clinicians structural authority over optimization

WHAT SUCCESS LOOKS LIKE

Net EHR Experience Score trending toward top quartile within 36 months. Turnover intent declining. Burnout attributed to the EHR falling. Clinicians saying: "I can do right by my patients again."

 

It is 2:47 in the morning, three weeks after go-live. A charge nurse is staring at a screen that does not resemble anything she was trained on. A hospitalist has thirty-one unsigned notes, a pharmacy queue he cannot reconcile, and a patient who is decompensating in room twelve. Somewhere, an executive sponsor is reviewing a dashboard that shows the implementation is "on track."

Both of those things are true at the same time. That is the paradox of the post-go-live period, and it is where most EHR strategies quietly fail. The technology works. The configuration was signed off. The command center reports green. And yet the clinicians — the people the system exists to serve — are hemorrhaging trust, energy, and in too many cases, their willingness to stay. And they may not be telling you any of this.

If you are a COO, CIO, CNO, or senior clinical leader reading this, you already know the pattern. What you may not yet have is a clean way to name what is happening, measure it, and intervene before it calcifies into turnover. This brief is written for those leaders. The statistics validate. The work is human.


 

 

The Scope of the Problem: What the Data Tells Us

In the 2025 KLAS Arch Collaborative study of EHR implementations, 73% of organizations that implemented a new EHR landed below the Arch Collaborative average on clinician EHR satisfaction after go-live.1 Not a handful. Not a quarter. Nearly three out of four.

 

 

57% of clinicians said their organization did NOT support the implementation well.

That is not a technology verdict. That is a leadership verdict.  — KLAS Arch Collaborative, EHR Implementations 2025¹

 

The nursing data is especially sobering. Nurse satisfaction with training dropped 24 percentage points from pre-go-live to post-go-live — the largest decline of any clinician group.1 When confidence collapses at that scale, it is a signal that the training model, not the nurse, is broken.

In the Arch Collaborative's 2025 ROI study, 54% of burned-out physicians and 43% of burned-out nurses named the EHR as a direct contributor.2 A JAMIA Open study found that EHR burden nearly doubled from pre-go-live to ten months post-go-live — a window many organizations incorrectly assume is the "we are past it" phase.9

Replacing a single physician costs $500,000 to $1,000,000 in recruitment, lost productivity, and onboarding drag.3,17,18 A mid-sized system that loses twelve physicians in the eighteen months after go-live has, in real-dollar terms, burned a transformation budget's worth of value while telling itself the project was a success.

 

 

When organizations invest in improving clinician experience, 63% report reduced burnout.

The lever exists. The question is whether leadership chooses to pull it.  — KLAS Arch Collaborative, ROI 2025²

 

Beyond Burnout: The Moral Injury Layer

Burnout is the word most organizations reach for, and it is the word I'd like you to stop using by itself. Burnout frames the problem as a capacity issue, which invites capacity-oriented interventions: a wellness app, a resilience workshop, a yoga class in the atrium. Those are not bad. They are also not the issue.

What clinicians are increasingly describing is moral injury: the corrosive experience of knowing what the right thing to do for the patient is, and being systematically prevented from doing it.8 When a physician tells you the EHR is "making me a worse doctor," she is not asking for a coping skill. She is telling you the system has placed her values in conflict with her workflow.

The AMA's STEPS Forward module makes the point bluntly: wellness programs alone do not address EHR-related burnout during transitions. The system must change, not the clinician.11 Clinicians see through these programs quickly when they are offered in place of structural change.

Burnout responds to recovery time. Moral injury responds only to restored integrity — meaning the clinician can once again see a through-line between the work they came into medicine to do and what the system asks them to do at the keyboard.8,10

The Five Faces of Resistance: What Clinicians Are Really Telling You

When clinicians push back on a new EHR, leadership often hears one undifferentiated signal: "they don't like the system." That framing is a mistake, and it leads to one-size-fits-all interventions that make things worse.

The Stability Edge Resistance Framework maps five distinct resistance patterns across two axes: Individual Experience to System and Identity (horizontal); Short-Term Friction to Deep Values and Role (vertical). Two amplifiers — workload and change saturation, and trust in leadership and IT — determine how loudly each pattern speaks.

Practice note: The framework below is a field-derived, practice-informed synthesis developed from EHR burnout, moral injury, cognitive overload, and physician engagement literature.8,9,10,11,20 It has not been formally validated in a controlled study. It is offered as a structured clinical leadership heuristic, not a diagnostic instrument.

Cognitive Overload (Short-Term Friction × Individual)

What it looks like:

• Overtime spikes on Days 3–14

• Inbox backlog visible at shift end

• Staff arriving early, leaving late

What leaders do:

• Deploy at-the-elbow support immediately

• Redesign highest-friction workflows first

• Protect documentation time blocks

Metrics that should move:

• EHR: time-in-system per encounter (Epic Signal)

• Ops: overtime hours; average shift length; inbox queue depth

The voice: "I have no bandwidth. My backlog is piling up."

Competence Threat (Short-Term Friction × System / Identity)

What it looks like:

• Quiet avoidance of the new system

• Using workarounds or paper

• Resistance to being observed

What leaders do:

• Peer-led, private practice sessions

• Leaders publicly normalize the learning curve

• Competency shown as a team metric, not individual

Metrics that should move:

• EHR: proficiency scores (Arch Collaborative training satisfaction)

• Ops: workaround incident log; training completion rate

The voice: "I look slow or incompetent using this system."

Moral Injury (Deep Values × Individual Experience)

What it looks like:

• Safety events or near-misses tied to workflow

• Escalating tickets framed as patient risk

• High-emotion language in huddles

What leaders do:

• Triage safety-flagged tickets within 24 hours

• Transparent closed-loop fixes within 72 hours

• Executive public acknowledgment of clinical cost

Metrics that should move:

• EHR: safety-tagged tickets; closed-loop rate

• Ops: near-miss reports; staff intent-to-leave pulse score

The voice: "This feels unsafe. I can't do right by my patients."

Identity Resistance (Deep Values × System / Identity)

What it looks like:

• Physicians citing reduced patient face time

• After-hours documentation ("pajama time")

• Disengagement from optimization governance

What leaders do:

• Audit and reduce documentation burden

• Introduce ambient AI or scribes where feasible

• Restore bedside time as a measurable KPI

Metrics that should move:

• EHR: pajama-time index (after-hours login duration)

• Ops: face-time-to-documentation ratio; ambient AI adoption

The voice: "This isn't real medicine. I'm a data clerk."

Autonomy Threat (spanning both axes; amplified by low trust)

What it looks like:

• Vocal pushback from physician champions

• Skepticism of optimization promises

• Resistance labeled "change fatigue"

What leaders do:

• Establish visible clinician governance of the backlog

• Publish a decision rights map within 2 weeks

• Acknowledge which trade-offs were imposed and why

Metrics that should move:

• EHR: clinician governance participation rate

• Ops: trust-in-IT pulse score; optimization backlog age

The voice: "Decisions were made to us, not with us."

© Stability Edge Consulting. The Resistance Framework is a proprietary practice-informed synthesis. Not a reproduction of any single published source.
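Translated into code, the framework's "which pattern is loudest" question becomes a small scoring exercise. The sketch below is purely illustrative: the signal names and the mapping are hypothetical shorthand for the observable behaviors listed above, consistent with the practice note that this is a leadership heuristic, not a validated diagnostic instrument.

```python
# Hypothetical triage sketch for the five resistance patterns. Signal names
# are invented shorthand for the behaviors in the framework; this mirrors a
# leadership heuristic, not a validated diagnostic.
PATTERN_SIGNALS = {
    "Cognitive Overload":  {"overtime_spike", "inbox_backlog", "long_shifts"},
    "Competence Threat":   {"workarounds", "paper_use", "avoids_observation"},
    "Moral Injury":        {"safety_events", "patient_risk_tickets", "high_emotion_huddles"},
    "Identity Resistance": {"pajama_time", "less_face_time", "governance_dropout"},
    "Autonomy Threat":     {"champion_pushback", "optimization_skepticism", "change_fatigue_label"},
}

def loudest_patterns(observed, top_n=2):
    """Rank patterns by how many of their characteristic signals were observed."""
    scored = [(len(signals & observed), name)
              for name, signals in PATTERN_SIGNALS.items()]
    scored.sort(reverse=True)
    return [name for count, name in scored[:top_n] if count > 0]

# One unit's huddle observations might score like this:
print(loudest_patterns({"pajama_time", "less_face_time", "champion_pushback"}))
# ['Identity Resistance', 'Autonomy Threat']
```

Run weekly per unit, a map like this keeps the response differentiated: Cognitive Overload gets at-the-elbow support, Autonomy Threat gets governance, and so on.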

What Successful Organizations Do Differently: Three Cases

The organizations that get post-go-live right treat clinician experience as the primary deliverable, not a byproduct. Three examples from the KLAS Arch Collaborative and peer-reviewed literature illustrate the pattern.

UTHealth Houston: Stabilization as a Multi-Year Commitment

UTHealth Houston implemented Epic in May 2021, during the COVID-19 pandemic. Three years later, they hit the 97th percentile Net EHR Experience Score in the Arch Collaborative.6,7 What changed was not the software — it was the operating model: Epic Signal data paired with expert-led training, engaged physician governance, and an optimization rhythm treated as a permanent capability.

Reid Health: Culture Change in a Rural System

Reid Health moved from a culture its own clinicians described as burned-out and disengaged to Arch Collaborative Pinnacle Award-winning EHR satisfaction.7 The through-line was a clinician-driven culture: physicians and nurses owning the optimization backlog, setting priorities, and being visibly trusted to do so. Budget did not solve the problem. Shifted authority did.

Cleveland Clinic: Attacking the Documentation Burden Head-On

Cleveland Clinic addressed documentation burden through a stepped strategy from virtual scribes to ambient AI.15 In a published study, clinicians using ambient AI scribes for thirty days saw burnout drop from 52% to 39%, with after-hours documentation time falling by nearly an hour per week.16 The lesson: naming documentation burden as a clinical problem — not an IT problem — and deploying technology to reduce it.

What unites all three cases is that leadership treated post-go-live as a stabilization problem, not a close-out problem. They invested after go-live at the same intensity they invested before it.
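Several of the metrics above, including Cleveland Clinic's after-hours documentation time and the pajama-time index in the framework, reduce to the same audit-log arithmetic. A minimal sketch, assuming a hypothetical flat export of documentation sessions; real platforms (Epic Signal, Oracle Health Advance) report this natively:

```python
from datetime import datetime

# Hypothetical audit-log extract: (provider_id, session_start, active_minutes).
# Real source systems expose richer data; this assumes a flat export with one
# row per documentation session.
SESSIONS = [
    ("dr_patel", datetime(2025, 3, 3, 21, 15), 42),   # Monday 9:15 PM
    ("dr_patel", datetime(2025, 3, 4, 10, 0), 95),    # Tuesday 10:00 AM
    ("dr_osei",  datetime(2025, 3, 8, 14, 30), 30),   # Saturday afternoon
]

def is_after_hours(start):
    """Treat weekends, and weekdays before 7 AM or after 7 PM, as pajama time.
    The cutoffs are illustrative; tune them to local clinic schedules."""
    if start.weekday() >= 5:          # Saturday or Sunday
        return True
    return start.hour < 7 or start.hour >= 19

def pajama_time_index(sessions):
    """Sum after-hours documentation minutes per provider."""
    totals = {}
    for provider, start, minutes in sessions:
        if is_after_hours(start):
            totals[provider] = totals.get(provider, 0) + minutes
    return totals

print(pajama_time_index(SESSIONS))
# {'dr_patel': 42, 'dr_osei': 30}
```

Trended weekly per provider, this single number is often the earliest quantitative signal of Identity Resistance.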

The Stability Edge Operating Model: Cadence, Roles, and Metrics

The following table is a replicable operating model drawn from the Stability Edge Human-Centered Stabilization Approach, expressing the four framework elements — Decision Clarity, Signal Clarity, Execution Focus, and the Human Dimension — as a concrete weekly and monthly rhythm.

Practice note: This model is practice-derived and illustrative. Adapt roles and cadences to your governance structure. Metric targets will vary by baseline and system.

 

Daily (Days 1–30)

Who: at-the-elbow leads; unit charge nurses; IT command center

What happens:

• Floor rounding + real-time issue triage

• Closed-loop "fix log" posted by 3 PM

What should move:

• Overtime hours ↓

• At-elbow ticket volume ↓

• Safety-tagged issues resolved within 24 hrs

Weekly (Weeks 1–12)

Who: CNO / CMO; service line leads; optimization PM

What happens:

• "You said / We did" clinician communication

• Top 5 friction items published & owned

• Resistance diagnostic pulse by unit

What should move:

• Burnout pulse score ↑

• Workflow ticket close rate ↑

• Clinician governance participation ↑

Bi-Weekly (Months 1–6)

Who: CIO / COO; physician champion council; HTM / IT leadership

What happens:

• Decision rights review; push decisions downstream

• Backlog prioritization by clinician pain score

• Documentation burden audit

What should move:

• Pajama-time index ↓

• EHR proficiency scores ↑

• Decision backlog age ↓

Monthly (Months 1–24)

Who: executive sponsor; Arch Collaborative liaisons; HR / workforce team

What happens:

• Net EHR Experience Score review

• Turnover intent monitoring

• Capacity plan vs. actual review

What should move:

• NEES trending toward top quartile

• Turnover intent ↓

• Net burnout contribution ↓


The discipline in this model is not the meeting — it is the closed loop. Every cadence produces a visible artifact clinicians can see and react to. That visibility is what rebuilds trust, and trust is the precondition for adoption.1,2,14

Go-Live Metrics & Dashboard Design

The four pillars of the Stability Edge framework only work if you can see them moving. The table below translates each pillar into the specific metrics a healthcare leader should track, where to find the data, what a healthy trend looks like, and what signals a problem before it becomes a crisis.

A well-designed stabilization dashboard does not need twenty metrics. It needs the right four to six, visible weekly, owned by name. The warning signals in the right column are the ones that, if ignored for two or more weeks, historically predict a turnover spike or a safety event.1,2,9,14

Decision Clarity (who decides, how fast)

Key metrics:

• Optimization ticket owner assignment rate

• Average time from ticket open to owner assigned

• % of decisions resolved at unit vs. enterprise level

Data sources: IT service desk; Epic optimization queue; governance log

What "good" looks like:

• >90% of tickets have a named owner within 24 hrs

• Decisions pushed to unit level within 2 weeks of go-live

• Governance meeting produces closed items each cycle

Warning signals:

• Tickets sitting unassigned >48 hrs

• All decisions escalating to enterprise

• Governance backlog growing week over week

Signal Clarity (what the system is telling you)

Key metrics:

• In-basket / inbox volume per provider per day

• Time-in-system per encounter (pajama-time index)

• Safety-tagged ticket volume

• At-the-elbow support request rate

Data sources: Epic Signal / Oracle Health Advance; EHR audit logs; IT command center; ATE team log

What "good" looks like:

• Inbox volume trending down by weeks 3–4

• Time-in-system per encounter approaching the pre-go-live baseline by week 8

• Safety-tagged tickets closing within 24–48 hrs

Warning signals:

• Inbox volume rising after week 2

• Pajama-time index climbing (providers working after 9 PM)

• Safety tickets unresolved >72 hrs

• ATE request volume not declining

Execution Focus (are fixes actually landing)

Key metrics:

• Weekly ticket closure rate

• "You said / We did" communication frequency

• Top-5 friction items resolved per week

• Workaround incident count

Data sources: IT service desk; optimization tracker; clinical informatics; incident log

What "good" looks like:

• Closure rate >70% of the week's opened tickets by Friday

• "You said / We did" published every week without exception

• Workaround count declining by week 4

• Top-5 list turning over (not the same items at week 3+)

Warning signals:

• Same tickets appearing on the Top-5 list for 2+ weeks

• "You said / We did" published inconsistently

• Workaround count flat or rising

• Closure rate below 50% of weekly volume

Human Dimension (how people are actually doing)

Key metrics:

• Clinician burnout pulse score (weekly)

• Turnover intent survey (monthly)

• Training satisfaction score (Arch Collaborative)

• Overtime hours by unit

• Voluntary comment volume in huddle logs

Data sources: staff pulse survey; HR system; Arch Collaborative; payroll / scheduling; huddle facilitator notes

What "good" looks like:

• Burnout pulse score stable or improving by week 4

• Turnover intent below the pre-go-live baseline by month 3

• Training satisfaction ≥ the Arch Collaborative average

• Overtime hours trending toward baseline by week 6

Warning signals:

• Burnout pulse score rising 2+ weeks in a row

• Turnover intent spiking (especially nurses, weeks 3–8)

• Voluntary attrition accelerating

• Huddle comment tone shifting negative without resolution


Data sources will vary by EHR platform. Epic Signal, Oracle Health Advance, and equivalent tools in other platforms provide most of the quantitative data. Pulse scores require a lightweight weekly survey of 3–5 questions; many organizations use existing engagement survey tools on an accelerated cadence during the first 90 days post-go-live.
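As a worked example of the warning-signal logic, the sketch below encodes three of the trigger rules above against a hypothetical weekly metric history. Metric names, values, and thresholds are illustrative, and in this sketch a higher burnout pulse score means more reported burnout:

```python
# Illustrative warning-signal checks over a weekly metric history. The data
# and thresholds are hypothetical; the trend rules mirror the dashboard above
# ("burnout pulse rising 2+ weeks in a row", "closure rate below 50%").
WEEKLY = {
    # metric -> recent weekly values, oldest first
    "burnout_pulse":       [2.1, 2.4, 2.7],   # higher = worse in this sketch
    "ticket_closure_rate": [0.74, 0.61, 0.48],
    "pajama_time_minutes": [210, 195, 180],
}

def warnings(history):
    """Return the dashboard warning signals triggered by the latest data."""
    flags = []
    bp = history["burnout_pulse"]
    # Rising two or more consecutive weeks is the escalation trigger.
    if len(bp) >= 3 and bp[-1] > bp[-2] > bp[-3]:
        flags.append("burnout pulse rising 2+ weeks in a row")
    if history["ticket_closure_rate"][-1] < 0.50:
        flags.append("ticket closure rate below 50% of weekly volume")
    pj = history["pajama_time_minutes"]
    if pj[-1] > pj[-2]:
        flags.append("pajama-time index climbing")
    return flags

print(warnings(WEEKLY))
# ['burnout pulse rising 2+ weeks in a row',
#  'ticket closure rate below 50% of weekly volume']
```

The point is not the tooling: even a shared spreadsheet can apply these rules by Friday afternoon, as long as someone owns each flag by name.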

Practical Leadership Actions: Starting This Week

If you are reading this in the stabilization window right now, here is what to do in the next seven days.

• Walk the floor at the hours that hurt. Pick the shift with the highest overtime and the most incident reports. Go. Listen. Take no notes on a laptop.

• Run the Resistance Framework in one unit. Pick your most vocal unit and diagnose which of the five patterns is loudest. You will be surprised how often it is Autonomy Threat, not Cognitive Overload.

• Stand up your dashboard this week. Pick one metric from each of the four pillars in the table above. Assign an owner. Commit to a weekly review. You do not need a BI tool; a shared spreadsheet updated by Friday afternoon will work for the first 90 days.

• Publish the closed-loop list. Name five issues clinicians raised, five you closed this week, and five you are working on next. Put names on the owners. Every week.

• Move one decision downstream. Identify one optimization decision bottlenecked at the enterprise level and push it to the service line or unit. The visibility of that delegation is itself an intervention.

• Commission a documentation-burden audit. The fastest visible win in year one post-go-live is almost always reducing minutes at the keyboard.13,15,16 Ambient AI, scribes, template rationalization, and inbox redesign are all candidates.

• Retire one wellness-only intervention. Replace it with a workflow change.11,12 Make that trade explicit to clinicians. They will respect the honesty.

The Bottom Line: Stability Is Not Just a Technical Problem

An EHR go-live is the most visible technology event in a health system's life. It is also, at its core, not a technology event. It is a multi-year change in how thousands of clinicians perform the daily work of their professional identity. When stabilization is treated as a close-out checklist, the organization absorbs the cost in turnover, safety events, and moral injury — long after the vendor has moved on.

The data is clear: organizations that invest in the human layer post-go-live reduce burnout in the majority of cases,2 retain clinicians who had already decided to leave,4 and reach the top decile of EHR experience within a few years of a hard transition.1,6,7 The lever exists. The question is whether leadership chooses to pull it.

 

"From Friction to Momentum" is not a tagline. It is a sequence.

Friction is the honest starting condition of every post-go-live environment.

Momentum is what you build when you address all four elements — without shortcuts.

 

About the Stability Edge Resistance 5-Minute Diagnostic

The Resistance 5-Minute Diagnostic is a leadership conversation tool deployable immediately post-go-live or at any point in the stabilization window. It is not a survey instrument — it is a structured facilitation guide that takes approximately five minutes per unit or service line and produces a concrete intervention map aligned to the five resistance patterns in this brief.

If you are in the middle of a difficult stabilization and unsure where to start, start with the diagnostic. To request a walk-through for your leadership team, or to learn more about the Stability Edge Human-Centered Stabilization Approach, visit stabilityedge.com or call 1-844-844-2309.

From friction to momentum.

 

References

1. KLAS Arch Collaborative. EHR Implementations 2025. https://klasresearch.com/archcollaborative/report/ehr-implementations-2025/628

2. KLAS Arch Collaborative. The ROI for Improving Your Clinicians' Experience 2025. https://klasresearch.com/archcollaborative/reports

3. KLAS Arch Collaborative. Clinician Turnover 2024. https://klasresearch.com/archcollaborative/report/clinician-turnover-2024/621

4. KLAS Arch Collaborative. Clinician Turnover 2025. https://klasresearch.com/archcollaborative/report/clinician-turnover-2025/704

7. KLAS Research. Arch Case Studies. https://klasresearch.com/archcollaborative/casestudies

8. Panagioti M, et al. Burnout, Moral Injury, and Implications for the EHR. J Gen Intern Med. 2021. https://pubmed.ncbi.nlm.nih.gov/33164089/

9. Clinician experiences with a challenging EHR transition. JAMIA Open. 2024. https://academic.oup.com/jamiaopen/article/7/3/ooae067/7713901

10. Interventions to Reduce EHR-Related Burnout. Applied Clinical Informatics / PMC. 2024. https://pmc.ncbi.nlm.nih.gov/articles/PMC10764123/

11. AMA STEPS Forward. EHR Transitions: Best Practices. 2024. https://edhub.ama-assn.org/steps-forward/module/2820544

12. AMA. 4 Smart Strategies to Tame the EHR and Cut Physician Burnout. 2022. https://www.ama-assn.org/practice-management/digital-health/4-smart-strategies-tame-ehr-and-cut-physician-burnout

13. CSI Companies. Post-Go-Live EHR Training: Workflow-Based Optimization. 2026. https://csicompanies.com/post-go-live-ehr-training-how-workflow-based-optimization-reduces-provider-burnout/

14. Assessing EHR Burden After Five Years. JMIR Human Factors. 2025. https://humanfactors.jmir.org/2025/1/e65656/

16. Becker's Behavioral Health. Ambient AI scribes: burnout dropped 52%→39%. 2025. https://www.beckersbehavioralhealth.com/ai-2/liberating-cleveland-clinics-experience-with-ai-scribes-in-behavioral-health/

17. Doximity Op-Med. The Enormous Cost of Physician Turnover. 2024. https://opmed.doximity.com/articles/the-enormous-cost-of-physician-turnover

18. Mocingbird. The High Cost of Physician Turnover. 2025. https://mocingbird.com/blog/the-high-cost-of-physician-turnover-how-to-mitigate-it/

19. Healthrise. Overcoming Physician Burnout Through EHR Customization. 2024. https://www.healthrise.com/insights/overcoming-physician-burnout-through-ehr-customization/

20. Cao J, et al. EHR Use: Cognitive Load and Workload. npj Digital Medicine. 2024. https://www.nature.com/npjdigitalmed/
