'We Need an AI Strategy' Is Not an AI Strategy

Ron Diver
Founder

Every health system in America now has an AI strategy. Most of them say the same thing: "Leverage artificial intelligence to improve patient outcomes, reduce costs, and enhance operational efficiency." This is not a strategy. It is a sentence. It could apply to any technology, any organization, any decade. It tells you nothing about what you will actually do, in what order, with whose money, and at the expense of what else.

The pressure to produce this sentence is real. It comes from the board, the CEO, the CMO, every department head who read a vendor press release, and the consultant who presented a quadrant chart at last quarter's retreat. The pressure is not the problem. The problem is that most organizations respond to it by creating a document that makes everyone feel better without forcing anyone to make a decision.

Strategy is not aspiration. Strategy is prioritization. And prioritization means disappointing people.

The Slide Deck Problem

A typical health system AI strategy deck runs 20-30 slides. It opens with market context — how much the healthcare AI market is projected to grow, what peer institutions are doing, a timeline of regulatory developments. Then it lists potential use cases across clinical, operational, and administrative domains. Then it proposes a phased approach: Phase 1 is a pilot, Phase 2 is scaling what works, Phase 3 is transformation.

This is not wrong, exactly. It is empty. It does not answer the questions that matter:

  • Which three use cases will you pursue in the next 12 months?
  • What will you explicitly not do?
  • How much are you willing to spend before you expect measurable returns?
  • Who owns each initiative — not the steering committee, the actual person?
  • What happens when the first pilot fails?

If your AI strategy does not answer these questions, it is not a strategy. It is a permission slip to keep talking about AI without committing to anything.

What a Real Strategy Looks Like

A real AI strategy for a health system is a portfolio management exercise. It starts with an honest inventory of organizational capacity — technical infrastructure, data readiness, workforce capability, governance maturity, and financial runway. Then it forces a rank-ordered list of use cases against two axes: impact and feasibility.

Impact is not "this could save millions." Impact is a specific, measurable outcome tied to an organizational priority that the board has already funded. Revenue cycle automation that reduces days in A/R by four. Clinical documentation that gives physicians back 45 minutes per shift. Prior authorization workflows that cut turnaround from five days to same-day.

Feasibility is not "the vendor says it works." Feasibility is whether your organization — this one, with its actual EHR configuration, its actual data quality, its actual IT staffing — can deploy, integrate, validate, and sustain the solution. A use case that scores high on impact and low on feasibility is not a priority. It is a wish.

The rank-ordered list will have 30 items on it. You will fund five. Maybe three. The strategy is the list, the ranking criteria, and the discipline to hold the line when the chief of surgery wants to jump the queue because he saw a demo at HIMSS.
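The two-axis ranking described above can be sketched as a simple scoring exercise. This is an illustration only: the use cases, the 1-5 scales, and the three-slot funding cap are hypothetical placeholders, not a prescribed methodology.

```python
# Illustrative sketch of the impact-vs-feasibility ranking described above.
# Use cases, scores, and the funding cap are hypothetical placeholders.

FUNDED_SLOTS = 3  # fund three, maybe five; everything else waits its turn

use_cases = [
    # (name, impact 1-5, feasibility 1-5)
    ("Revenue cycle automation",      5, 4),
    ("Clinical documentation assist", 4, 4),
    ("Prior auth turnaround",         4, 3),
    ("Sepsis prediction model",       5, 1),  # high impact, low feasibility: a wish
    ("Patient-facing chatbot",        2, 3),
]

def score(case):
    _, impact, feasibility = case
    # Multiplying the axes penalizes a use case that is weak on either one:
    # a 5-impact, 1-feasibility item cannot outrank a balanced 4x4.
    return impact * feasibility

ranked = sorted(use_cases, key=score, reverse=True)

for rank, (name, impact, feasibility) in enumerate(ranked, start=1):
    status = "FUND" if rank <= FUNDED_SLOTS else "wait"
    print(f"{rank:>2}. [{status}] {name} (impact={impact}, feasibility={feasibility})")
```

Multiplying rather than averaging is a deliberate choice here: it encodes the article's point that high impact cannot rescue low feasibility.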

Build vs. Buy (It's Almost Always Buy)

Community health organizations — FQHCs, critical access hospitals, tribal health programs, safety-net providers — do not have the engineering teams to build AI tools. This is not a criticism. It is a resource reality. The build vs. buy question, for most health systems under 500 beds, has a simple answer: buy.

But "buy" is not simple either. It means:

  • Vendor evaluation — not based on demos, but on reference calls with organizations your size, integration requirements with your specific EHR, and contractual guarantees around data ownership, model transparency, and performance monitoring
  • Integration planning — who connects the tool to your workflows, who maintains the connection, what happens when your EHR upgrades and the integration breaks
  • Validation — not trusting the vendor's accuracy numbers, but running your own data through and measuring whether the tool performs in your patient population the way it performed in theirs
  • Sunset criteria — defining in advance what "not working" looks like, so you can kill a deployment without it becoming a political battle
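The validation and sunset-criteria steps above can be made concrete in a few lines. All numbers here are hypothetical placeholders, not benchmarks: the point is that the floors are written down before go-live, and local performance is measured against them rather than against the vendor's claims.

```python
# Sketch of "validate on your own data, with sunset criteria defined in
# advance." Every figure below is a hypothetical placeholder.

# Vendor-claimed performance, measured on THEIR patient population
vendor_claimed = {"sensitivity": 0.92, "ppv": 0.80}

# Sunset floors agreed before go-live: fall below these and the
# deployment goes to a kill review, not a political battle
sunset_floor = {"sensitivity": 0.85, "ppv": 0.65}

def local_metrics(tp, fp, fn):
    """Sensitivity and positive predictive value on your own patients."""
    return {
        "sensitivity": tp / (tp + fn),
        "ppv": tp / (tp + fp),
    }

# Hypothetical counts from running local cases through the tool
measured = local_metrics(tp=170, fp=80, fn=30)

breaches = [m for m, floor in sunset_floor.items() if measured[m] < floor]
for metric in ("sensitivity", "ppv"):
    print(f"{metric}: claimed {vendor_claimed[metric]:.2f}, "
          f"measured {measured[metric]:.2f}, floor {sunset_floor[metric]:.2f}")
print("SUNSET REVIEW TRIGGERED" if breaches else "within agreed floors")
```

Note the gap in this toy example: measured performance sits below the vendor's claims but above the agreed floors, which is exactly the distinction that pre-negotiated sunset criteria let you act on calmly.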

The organizations that get stuck are the ones that treat procurement as the finish line. Procurement is the starting line. The work is everything after.

Resource Planning Nobody Wants to Do

AI projects fail for the same reason most IT projects fail: the organization underestimates what it takes to sustain them. A clinical AI tool is not a piece of software you install. It is an ongoing operational commitment — monitoring for model drift, retraining on new data, managing clinician feedback, handling edge cases, updating workflows, responding to regulatory changes.

This means staff. Not a committee. Staff. Someone whose job it is to manage the AI portfolio the way someone's job is to manage the EHR. For smaller organizations, this might be a fractional role or a shared service. But it cannot be "the IT director will handle it on top of everything else." That is how tools get deployed, ignored, and quietly abandoned while the subscription keeps billing.

Budget accordingly. The license fee is 40% of the cost. Integration, training, validation, and ongoing management are the other 60%. If your financial model only accounts for the license fee, you do not have a financial model.
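The 40/60 split implies a quick sanity check on any vendor quote. The dollar figure below is a hypothetical placeholder; the arithmetic is the point.

```python
# Back-of-envelope check on the 40/60 split described above: if the license
# fee is only 40% of true annual cost, total cost of ownership is the
# license divided by 0.40. The quote below is a hypothetical placeholder.

annual_license = 120_000   # hypothetical vendor quote
license_share = 0.40       # license fee as a share of true annual cost

total_cost = annual_license / license_share
hidden_cost = total_cost - annual_license  # integration, training, validation, management

print(f"License fee:      ${annual_license:>9,.0f}")
print(f"Everything else:  ${hidden_cost:>9,.0f}")
print(f"True annual cost: ${total_cost:>9,.0f}")
```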

Board Communication

Board members want to know three things about AI: Are we falling behind? Are we exposed to risk? How much will it cost?

Answer those questions directly. Do not present a technology roadmap to a governance body. Present a portfolio summary: here are the initiatives we are pursuing, here is why we chose them over alternatives, here is what we are spending, here is how we will measure success, and here is what we are deliberately not doing yet. The "not doing yet" section is the most important part. It demonstrates that leadership is making choices, not chasing trends.

If you cannot explain your AI strategy to your board in 10 minutes without using the words "leverage," "synergy," or "ecosystem," you do not understand your own strategy well enough.

The Hardest Part

The hardest part of AI strategy is not technology selection. It is not vendor negotiation. It is not even funding.

The hardest part is saying no. No to the department that wants a chatbot. No to the physician champion who found a startup. No to the board member who read a Wall Street Journal article. No to the vendor offering a free pilot that will consume six months of IT bandwidth.

Saying no requires a framework — the prioritization criteria, the capacity inventory, the portfolio limits. Without a framework, every request is evaluated on its own merits, and on its own merits, every request sounds reasonable. That is how organizations end up with 15 AI pilots, no coherent data governance, and an IT team that is underwater.

A real AI strategy gives you the authority to say: "That's a good idea. It's number 11 on our list. We're executing numbers one through four this year. We'll revisit the list in Q1."

That sentence is worth more than any slide deck.

Where to Start

If your organization does not have an AI strategy — or has one that reads like a press release — the first step is not hiring a consultant or buying a platform. The first step is an honest assessment of where you actually stand: your data infrastructure, your governance maturity, your workforce readiness, your financial capacity, and your organizational appetite for change.

That assessment becomes the foundation for every decision that follows. Without it, you are building on sand.

We built the LumenHealth AI Readiness Assessment for exactly this purpose — a structured evaluation designed for community health organizations that tells you where you are, not where a vendor wants you to think you are. It takes 15 minutes. It might save you 18 months of unfocused spending.

Assess your organization's AI governance readiness

37 questions across five domains. Free facilitated debrief with your leadership team.

Take the Readiness Assessment →