πŸ›οΈ Level 3 of 3 Β· Session 2 of 5

Level 3 · Integration

Institutional Integration

Making AI Stick

How do you embed AI tools durably into WHO India workflows, so that the capability survives staff turnover, budget cycles, and the institutional gravity that pulls everything back to how it was done before?

Why good tools disappear

Across global health institutions, the pattern is consistent: a motivated staff member discovers a better way to work with AI tools, produces noticeably stronger outputs, and then leaves, taking the capability with them. What remains is a folder of prompts no one understands, a subscription no one knows how to use, and a team that reverts to the previous workflow within a quarter. The tool did not fail. The integration failed.

Durable institutional integration requires three things that most AI adoption efforts skip: documentation that survives personnel change, alignment with the actual decision calendar that governs WHO India's work, and explicit team norms that distribute the capability rather than concentrating it in one person. None of these is technically complex. All of them require deliberate effort that feels like overhead until the key person leaves.

Six ways AI integration fails in country offices

🧠
Single-expert dependency
One person holds all the prompts, all the workflow knowledge, and all the tool subscriptions. Their departure ends the capability entirely. The team cannot reconstruct what they did because it was never written down.
→ Fix: every workflow must be documented as a handover-ready SOP before it is used in production. The SOP generator in this session produces that document.
📅
Misalignment with decision cycles
AI tools are adopted outside the rhythm of actual work: trained on, then unused until a brief deadline hits. By then, no one remembers the protocol and the team falls back to familiar methods under time pressure.
→ Fix: map every AI workflow to a specific deliverable in the WHO India annual cycle. The tool gets used because the deliverable requires it, not because someone remembers to use it.
💸
Subscription cliff
A paid tool is adopted, integrated into workflows, and then the subscription lapses or is not renewed in a budget cycle. The team has built dependencies on a tool that is now unavailable.
→ Fix: build workflows around tools with free tiers or open access wherever possible. When paid tools are essential, document the substitution protocol for when access lapses.
🔒
Prompt library lock-in
Effective prompts are treated as personal assets rather than team resources. They live in one person's chat history or personal notes, undocumented and untransferred. The team cannot replicate results without that person.
→ Fix: maintain a team prompt library on a shared drive, version-controlled. Each prompt is named, described, and annotated with when and why to use it (a sketch of one library entry follows this list).
πŸ›οΈ
Institutional resistance
AI-assisted evidence synthesis is perceived as methodologically suspect by senior colleagues or MoHFW counterparts who did not receive this training. Briefs produced with undisclosed AI use are vulnerable; briefs with transparent methods notes are not.
→ Fix: the disclosure note from Session 1 is institutional armour. Transparent AI use that follows WHO's ethical framework is defensible. Concealed AI use is not.
📉
No quality benchmark
Without a before/after comparison of brief quality, speed, and equity coverage, there is no institutional evidence that the AI workflow produces better outputs. Adoption without measurement produces advocacy without proof.
→ Fix: for the first three briefs produced with the new workflow, track: time from question to draft, number of sources retrieved, equity disaggregation coverage, and any factual errors found in review. This is your institutional evidence base (a sketch of a simple benchmark log also follows this list).
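The shared prompt library in the fourth item above stays usable only if every entry carries the same fields. Below is a minimal sketch of one possible entry format; the field names, the example prompt, and the find helper are all hypothetical illustrations, not a WHO standard.

```python
# Hypothetical sketch of a shared, version-controlled prompt library entry.
# Field names and the example prompt are illustrative, not a WHO standard.
from dataclasses import dataclass


@dataclass
class PromptEntry:
    name: str            # short, searchable identifier
    purpose: str         # when and why to use this prompt
    workflow_step: str   # e.g. "screening", "critical appraisal", "brief drafting"
    prompt_text: str     # the prompt itself, with {placeholders} marked
    author: str          # who wrote it, for follow-up questions
    last_reviewed: str   # ISO date of last review, so stale prompts are visible


LIBRARY = [
    PromptEntry(
        name="screening-titles-abstracts",
        purpose="First-pass relevance screening against the PECO-F frame",
        workflow_step="screening",
        prompt_text=(
            "Using the PECO-F frame below, classify each record as "
            "include / exclude / unclear and give a one-line reason.\n"
            "PECO-F: {peco_f}\nRecords: {records}"
        ),
        author="team-member-initials",
        last_reviewed="2025-01-15",
    ),
]


def find(step: str) -> list[PromptEntry]:
    """Return all library prompts for a given workflow step."""
    return [p for p in LIBRARY if p.workflow_step == step]
```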
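The quality benchmark in the last item can be kept as a simple structured log on the shared drive. The sketch below assumes a CSV file with invented column names and example values; adapt both to whatever the team actually tracks.

```python
# Hypothetical benchmark log for the first briefs produced with the new workflow.
# The file name, column names, and example values are illustrative.
import csv
from pathlib import Path

LOG = Path("brief_quality_benchmark.csv")
COLUMNS = [
    "brief_title",
    "workflow",                      # "ai_assisted" or "previous"
    "days_question_to_draft",
    "sources_retrieved",
    "equity_groups_disaggregated",   # e.g. sex, income quintile, rural/urban
    "factual_errors_in_review",
]


def record(row: dict) -> None:
    """Append one brief's metrics, writing a header row the first time."""
    new_file = not LOG.exists()
    with LOG.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=COLUMNS)
        if new_file:
            writer.writeheader()
        writer.writerow(row)


record({
    "brief_title": "OOP expenditure trends brief",
    "workflow": "ai_assisted",
    "days_question_to_draft": 4,
    "sources_retrieved": 62,
    "equity_groups_disaggregated": 3,
    "factual_errors_in_review": 1,
})
```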

The integration ladder

Durable integration happens in stages. Trying to jump from individual use to institutional standard operating procedure in one step produces the resistance and reversion described above. Work through the ladder in sequence; a simple self-assessment sketch follows the five stages below:

L1
Individual proficiency
One person can reliably run the full workflow (PECO-F framing, search protocol, AI-assisted screening, critical appraisal, bias check, brief generation) and produce a defensible output. This is what Levels 1 and 2 of this curriculum build.
Signal you are here: you have completed a full brief cycle using AI tools with documented protocol log and disclosure note.
L2
Documented workflow
The workflow is written down in enough detail that a new team member could follow it without guidance. Prompts are in a shared library. The SOP covers what to do when a tool is unavailable. This step is the most commonly skipped and the most consequential.
Signal you are here: a colleague who did not attend this training can produce a comparable brief using only the SOP document.
L3
Calendar alignment
Each AI workflow is assigned to a specific deliverable in the WHO India annual work plan. The NHA release triggers the surveillance check. The budget cycle window triggers the rapid review protocol. The workflow runs because the calendar demands it.
Signal you are here: AI workflows appear as named steps in at least two WHO India work plan deliverables for the current year.
L4
Team distribution
At least two team members can run the full workflow independently. The capability is not hostage to any one person's presence. Onboarding new staff includes the workflow documentation and this curriculum as orientation materials.
Signal you are here: the workflow ran without interruption during a period when the original trained staff member was on leave or travel.
L5
Institutional standard
The AI-assisted evidence workflow is referenced in WHO India's country office quality standards, mentioned in donor reporting as a methodological improvement, and adopted by at least one counterpart ministry unit. The capability has propagated beyond the team that built it.
Signal you are here: MoHFW or a state health ministry team has requested a brief produced using the documented workflow, and the methodology note was not challenged.
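To make the "work through the ladder in sequence" rule concrete, here is a hypothetical self-assessment sketch. The stage names and signal summaries paraphrase the ladder above, and the scoring rule (highest consecutive rung whose signal is met) is an illustration, not part of the curriculum.

```python
# Hypothetical self-assessment for the integration ladder. The team answers each
# signal question honestly; the current stage is the highest rung reached without
# skipping any earlier rung, mirroring the "in sequence" rule above.
LADDER = [
    ("L1 Individual proficiency", "A full brief cycle was completed with a protocol log and disclosure note"),
    ("L2 Documented workflow", "An untrained colleague produced a comparable brief from the SOP alone"),
    ("L3 Calendar alignment", "AI workflows are named steps in at least two work plan deliverables"),
    ("L4 Team distribution", "The workflow ran while the original trained staff member was away"),
    ("L5 Institutional standard", "A counterpart unit requested a brief made with the documented workflow"),
]


def current_stage(signals_met: list[bool]) -> str:
    """Return the highest consecutive rung reached, starting from L1."""
    stage = "Not yet on the ladder"
    for (name, _signal), met in zip(LADDER, signals_met):
        if not met:
            break
        stage = name
    return stage


print(current_stage([True, True, False, False, False]))  # -> "L2 Documented workflow"
```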

Aligning with India's health policy calendar

The single most effective integration mechanism is mapping AI workflows to the fixed decision points in India's health financing calendar. These windows are predictable, high-stakes, and time-pressured: exactly the conditions where a pre-built workflow produces the most value. A machine-readable sketch of the mapping follows the table:

| Period | Event | Workflow triggered | Lead time needed |
| --- | --- | --- | --- |
| January–February | Union Budget preparation | Rapid evidence review on health financing priorities; surveillance plan update | 6–8 weeks before Budget Day |
| March–April | NHA data release (prior year) | OOP and financial protection figures update; brief baseline revision | Within 2 weeks of release |
| April–May | HTAIn annual assessment cycle | New HTAIn reports screened against active briefs; update triggers assessed | Ongoing surveillance (Tier 2) |
| July–August | NHM mid-year review | State health financing performance evidence brief; PM-JAY claims data review | 4–6 weeks before review meeting |
| October–November | State budget cycles begin | State-specific fiscal space analysis; evidence brief for state health financing teams | 6 weeks before state Budget presentation |
| December | Annual WHO country programme review | Full surveillance check; brief portfolio update; workflow quality benchmark review | 4 weeks before review |
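If the team wants the calendar mapping to be checkable rather than only readable, the table can be mirrored in a small script. The sketch below uses invented month lists, shortened workflow names, and a deliberately rough weeks-until calculation; it illustrates the idea of a calendar-driven trigger, not a real scheduling tool.

```python
# Hypothetical machine-readable version of the calendar table above, so the team
# can check, at any point in the year, which workflows should already be underway.
from datetime import date

# (event months, event, workflow triggered, lead time in weeks before the event)
CALENDAR = [
    ((1, 2), "Union Budget preparation", "Rapid evidence review; surveillance plan update", 8),
    ((3, 4), "NHA data release", "OOP and financial protection figures update", 2),
    ((4, 5), "HTAIn annual assessment cycle", "Screen new HTAIn reports against active briefs", 1),
    ((7, 8), "NHM mid-year review", "State financing brief; PM-JAY claims data review", 6),
    ((10, 11), "State budget cycles begin", "State fiscal space analysis and evidence briefs", 6),
    ((12,), "Annual WHO country programme review", "Full surveillance check; portfolio update", 4),
]


def due_now(today: date) -> list[str]:
    """List workflows whose event falls within its lead-time window from today."""
    due = []
    for months, event, workflow, lead_weeks in CALENDAR:
        for m in months:
            # Rough conversion: months until the event, times ~4.33 weeks per month.
            weeks_until = ((m - today.month) % 12) * 4.33
            if 0 <= weeks_until <= lead_weeks:
                due.append(f"{event}: start '{workflow}'")
                break
    return due


print(due_now(date(2025, 12, 1)))
```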

Generate your team SOP

πŸ›οΈ SOP generator β€” produce a handover-ready workflow document

Describe a specific evidence workflow your team runs: any step from search to surveillance. The generator will produce a handover-ready Standard Operating Procedure including tool list, step-by-step process, fallback options, quality checks, and disclosure requirements, ready to save to a shared drive and hand to a new team member. A sketch of the skeleton it produces follows the form.

Workflow name: What is this SOP for?
Calendar trigger: When or why is this workflow run?
Team size and roles: Who does this work?
Target output: What does this workflow produce?
Current workflow steps: Describe what you currently do, in as much or as little detail as you have
Known constraints or risks: What has broken this workflow before, or might?
πŸ›οΈ Standard Operating Procedure β€” save to shared drive

🎯 Key takeaway

A tool that only one person knows how to use is not an institutional capability; it is a personal skill waiting to leave. The SOP is the conversion mechanism: it turns individual proficiency into team infrastructure. The integration ladder gives you the sequence; the calendar alignment gives you the forcing function; the SOP generator gives you the document. All three are needed. The capability that survives a staff rotation is the one that was written down before anyone thought to ask. Session 3 addresses the hardest practical integration challenge: rapid evidence reviews under genuine time pressure.