Setting Up an AI Champions Program in Your Department: A Step-by-Step Guide
The fastest way to spread AI adoption across your team isn't scheduling more training sessions. It's finding the one or two people who already love these tools and giving them a formal role to share what they know.
Most managers look at AI adoption as a top-down problem: they need to train people, mandate usage, or bring in vendors. But the teams with the strongest AI adoption almost universally have one thing in common: a peer who makes it approachable. Someone their colleagues can walk up to and say, "Hey, how did you do that thing with the email drafts?" without feeling judged. McKinsey's research on enterprise AI adoption found that peer-nominated "AI ambassadors" were present in 68% of companies reporting high AI adoption rates, compared to only 23% of companies reporting low adoption — making the peer champion model one of the strongest structural predictors of success.
That's what an AI champions program formalizes. And unlike most training initiatives, it can be set up in two weeks and become self-sustaining within 90 days. It works best as the next step after a 90-day AI fluency plan: by day 90, you'll have identified the natural champions through their behavior, not just their self-nomination.
Why Peer-Led Adoption Outperforms Top-Down Training
Manager bandwidth is finite. If you're personally responsible for driving AI adoption across a 15-person team while also hitting quarterly targets, something gets dropped. Usually it's the AI adoption follow-up.
Champions solve this by creating an adoption layer that operates without constant manager involvement. They answer day-to-day questions, model the behaviors you want, maintain shared resources, and catch the people who are falling behind before they check out entirely.
But the key word is "peer." A champion who is also someone's direct manager doesn't have the same effect. The power dynamic changes the conversation. Champions work because colleagues feel safe asking dumb questions and admitting they're struggling. That psychological safety is the program's most valuable asset. Harvard Business Review's research on psychological safety in teams, built on Amy Edmondson's foundational work, shows that peer learning environments produce significantly higher knowledge transfer than hierarchical ones.
Step 1: Identify Champion Candidates
You're not looking for your most technical team members. You're looking for three specific qualities:
Early adopter. They're already using AI tools in some form, even informally. They've figured things out on their own and are genuinely enthusiastic, not performing enthusiasm because leadership expects it.
Respected peer. Their colleagues trust their judgment. When they say something is worth trying, people believe them. A champion the team doesn't respect just creates more confusion.
Cross-functional visibility. Ideally, they interact across multiple parts of the team or adjacent teams. Champions with narrow visibility can only influence a small circle.
Champion Selection Scorecard
Use this to evaluate candidates objectively. Score each criterion 1-3.
| Criterion | Score (1-3) | Notes |
|---|---|---|
| Currently using AI tools independently | | |
| Demonstrates curiosity, not just compliance | | |
| Respected by peers (not just management) | | |
| Communicates clearly without jargon | | |
| Has time available (not overwhelmed) | | |
| Interacts across the broader team | | |
| Total | /18 | |
Candidates scoring 14+ are strong champions. Candidates scoring 10-13 are worth considering if you have limited options. Don't force someone who scored under 10. A reluctant champion does more harm than no champion.
How many champions? One per 8-12 team members is a good ratio. For a 15-person team, two champions is right. More than that and the program gets diluted; the role starts feeling like a committee.
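To make the selection math concrete, here's a minimal sketch of the scorecard and staffing logic above. The candidate scores are hypothetical; the cutoffs (14+, 10-13, under 10) and the one-champion-per-8-12-people ratio come directly from this section.

```python
# A minimal sketch of the champion selection scorecard; candidate scores are hypothetical.
CRITERIA = [
    "Currently using AI tools independently",
    "Demonstrates curiosity, not just compliance",
    "Respected by peers (not just management)",
    "Communicates clearly without jargon",
    "Has time available (not overwhelmed)",
    "Interacts across the broader team",
]

def classify(scores: dict) -> str:
    """Total the 1-3 scores across all six criteria and apply the cutoffs."""
    total = sum(scores[c] for c in CRITERIA)
    if total >= 14:
        return f"strong champion ({total}/18)"
    if total >= 10:
        return f"worth considering if options are limited ({total}/18)"
    return f"don't force it ({total}/18)"

def champions_needed(team_size: int) -> int:
    """One champion per 8-12 team members; uses 10 as the midpoint."""
    return max(1, round(team_size / 10))

# Hypothetical candidate on a 15-person team.
scores = dict(zip(CRITERIA, [3, 3, 2, 3, 2, 2]))
print(classify(scores))        # strong champion (15/18)
print(champions_needed(15))    # 2
```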
What if no one obviously fits? Look for the person who, when you said "we're rolling out an AI tool," was the first to ask "can I get early access?" That's your champion candidate, regardless of seniority or technical background. If your team shows low learning agility across the board, run a sales team AI readiness audit first — it surfaces the behavioral signals (dimension 4: learning agility) that identify champion candidates even in skeptical teams.
Step 2: Define the Champion Role Clearly
The biggest mistake in champion programs is ambiguity. If champions don't know what they're supposed to do, they'll either do too much (burn out) or too little (have no impact).
Define the role in writing before you ask anyone to fill it.
Champion Role Description Template
Role: AI Champion, [Department Name]
Time commitment: 2-3 hours per week
Duration: Initial 90-day program with quarterly renewal
Responsibilities:
- Host one 30-minute office hours session per week (can be informal, Slack-based, or scheduled)
- Contribute two new prompts to the team prompt library each month
- Run one live demo for the team per month showing a new use case
- Flag common questions and blockers to the program manager (the manager or HR lead running the program)
- Complete one advanced training module per month ahead of the broader team
What this role is not:
- Technical support (champions are peers, not IT)
- Mandatory training delivery (demos should feel like sharing, not presentations)
- Accountability for other people's adoption (champions influence; they don't manage)
Recognition:
- Formal mention in performance review as "AI Champion" with documented contributions
- Early access to new tools before team-wide rollout
- Monthly 1:1 with [manager] to discuss what's working
- [Optional: quarterly stipend, gift card, or professional development budget]
Have this conversation before anyone says yes. Surprises about time commitment are the fastest way to burn out a champion.
Step 3: Give Champions a Head Start
A champion who is only slightly more advanced than their peers can't provide real value. Before asking them to help others, invest in getting them significantly ahead.
Advanced training. Give champions access to more intensive training than the general team rollout: vendor-led sessions, online courses, or direct access to your AI tool account manager. Budget 6-8 hours of focused learning before the program officially launches. The AI certification market breakdown for 2026 can help you identify which credentials are worth the investment for champions who want to signal their expertise formally.
Early tool access. If you're rolling out a new AI tool to the team, give champions access 2-4 weeks earlier. They need time to get comfortable before they're fielding questions.
Direct line to vendor support. This is underused. Most AI tool vendors offer a customer success contact. Connect your champions directly. When a champion gets a weird output or can't figure out a feature, they should be able to get an answer in 24 hours, not wait for a ticket.
A brief from the manager. Before the program starts, sit down with each champion and cover: what AI adoption problems you're trying to solve, what the 90-day success looks like, and what you want them to deprioritize if they're time-constrained. This alignment meeting takes 30 minutes and prevents a lot of drift.
Step 4: Structure Their Activities
Champions left to define their own activities will drift toward what's comfortable and avoid what feels risky. Give them a default structure they can modify, not an open mandate.
Weekly: Office Hours (30 minutes)
This is the highest-leverage activity. A standing time each week (Thursday lunch, Slack thread, or 15-minute Zoom) where anyone can ask AI questions without judgment.
Format that works: "AI Office Hours: drop any questions here and I'll share what I know. No dumb questions. If I don't know, I'll find out."
Champions should track what people ask during office hours. Recurring questions are a signal that the team needs training on that specific use case.
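That tracking doesn't need tooling beyond a running list. Here's a minimal sketch, with entirely hypothetical question topics, of how a champion might tally office-hours questions and surface the recurring ones as training signals:

```python
from collections import Counter

# Hypothetical log of office-hours questions, each tagged with a topic.
question_log = [
    "email drafting", "summarizing calls", "email drafting",
    "prompt structure", "email drafting", "summarizing calls",
]

# Any topic raised three or more times is a team-wide training signal.
recurring = [topic for topic, n in Counter(question_log).items() if n >= 3]
print(recurring)  # ['email drafting']
```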
Monthly: Live Demo (15-20 minutes)
A short demo of a new use case or technique, shared in a team meeting or async via Loom. The rule: it has to be something the champion actually uses in their own work. Demos of hypothetical use cases feel academic. Demos of "here's how I drafted this week's report in 10 minutes instead of 45" are immediately actionable.
Monthly: Prompt Library Contribution
Two new prompts added to the shared team prompt library with notes on when to use them and what to watch out for. Champions should document the failures too. "This prompt sounds good but consistently gives vague output; here's the better version" is more valuable than a clean list of winning prompts.
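There's no standard format for library entries, so here's one possible structure, with illustrative field names of my own invention, that captures the use-case notes and failure documentation described above:

```python
# Illustrative structure for a prompt library entry; all fields are hypothetical.
entry = {
    "title": "Weekly status report first draft",
    "when_to_use": "Turning raw meeting notes into a five-bullet status update",
    "watch_out_for": "Invents metrics when the notes lack numbers; verify every figure",
    "failed_versions": [
        "Original phrasing sounded good but consistently gave vague output",
    ],
    "added_by": "champion name",
    "added": "2026-04",
}
```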
Monthly Activity Planner
| Week | Activity | Time |
|---|---|---|
| Week 1 | Office hours session | 30 min |
| Week 2 | Office hours session + add 2 prompts to library | 60 min |
| Week 3 | Office hours session | 30 min |
| Week 4 | Office hours session + prepare and share one demo | 90 min |
| Ongoing | Advanced training module (self-directed) | 60 min/month |
| Total | | ~4-5 hrs/month |
This is lighter than it sounds. Much of it happens in the flow of work. The demo preparation is the biggest time investment, and even that's just documenting something they're already doing.
Step 5: Sustain Momentum
The programs that fail do so between months 3 and 6. Initial enthusiasm fades, the novelty wears off, and champions start deprioritizing their role when their core workload increases.
Build in sustainability from the start.
Quarterly recognition. Don't wait for the annual review. Every quarter, acknowledge the champion's contributions publicly: in a team meeting, a Slack channel, or a leadership update. The recognition doesn't have to be monetary. A specific, public acknowledgment of impact ("Maria's prompt library updates helped our support team cut response drafting time by 30% this quarter") means more to most people than a gift card. Deloitte's employee engagement research found that public, specific recognition tied to measurable outcomes increases the likelihood of the recognized behavior being repeated by 78% — and is more effective than equivalent monetary rewards for knowledge workers.
Career path signaling. The most powerful retention tool for champions is connecting the role to their professional development. Frame it explicitly: "This puts you in position to lead our AI integration work as we scale" or "AI fluency at this level is a differentiator for a senior role." Champions who see the role as career capital will sustain it. The AI fluency salary premium data for 2026 gives you concrete numbers to back this up — AI-fluent non-technical professionals are commanding 15-25% compensation premiums over their non-AI-fluent peers.
Quarterly refresh sessions. Every 90 days, bring champions together (or run a one-on-one debrief if you have a single champion) to review what's working, update the program based on what they're hearing from the team, and introduce new tools or techniques. Champions who feel like they're growing will stay engaged. Champions who feel like they're doing the same thing indefinitely will drift.
Watch for burnout signals. Champions who are going quiet, getting less prepared for demos, or responding to questions with "I don't know" without follow-up are showing early burnout. Have a direct conversation: reduce their time commitment before they resign the role entirely.
Recognition Framework
Champions need to feel that the role is worth their time. Build recognition into the program, not as an afterthought.
Non-monetary recognition (high impact, low cost):
- Public acknowledgment in team meetings and leadership updates
- "AI Champion" title in internal communications and Slack profiles
- Written recommendation or LinkedIn endorsement from the manager
- Inclusion in AI strategy conversations that their peers aren't part of
- First access to new tools, beta features, or vendor events
Monetary recognition (for teams with budget):
- Monthly or quarterly stipend ($100-$300/month depending on program size)
- Professional development budget ($500-$1,000/year for AI courses, conferences)
- Gift cards or experience rewards tied to milestones (e.g., "team hit 70% AI adoption this quarter")
The best recognition is career recognition. If being an AI champion genuinely accelerates someone's promotion or opens new opportunities, you won't need much else.
Avoiding the Common Failures
Picking a manager as champion. Managers can support the program, but they shouldn't be the champion. The power dynamic undermines the peer relationship that makes champions work. If a direct report is struggling with an AI tool and has to ask their manager, they'll often say they're fine instead.
No formal role definition. Champions without a clear role description drift into doing whatever seems needed in the moment. Some end up as informal IT support. Others run too many sessions and burn out. The role description is not bureaucracy. It's protection.
Burning them out. The most common failure mode. An enthusiastic early adopter gets asked to run all the training, answer all the questions, build all the resources, and maintain all the documentation. They hit a wall around month 3. Keep the time commitment genuinely bounded at 2-3 hours per week.
No manager involvement. Champions are not a delegation mechanism. The manager still needs to signal support, attend at least some office hours, and make AI adoption part of team conversations. Champions accelerate adoption. They don't replace the manager's role in setting expectations.
Measuring Program Success
Team AI usage rate. The primary metric. Before the program: what percentage of your team uses AI tools at least three times per week? After 90 days: has that number increased by at least 20-30 percentage points? A champion program that isn't moving this metric isn't working. Connect this to the AI adoption ROI measurement framework so usage rate improvements translate into efficiency gains and business impact numbers you can take to leadership.
Champion retention at 6 months. If your champion leaves the role before 6 months, investigate why. Burnout, lack of recognition, or unclear role definition are the most common causes. Each is fixable if caught early.
Peer satisfaction scores. A simple anonymous survey every 90 days: "How helpful has the AI Champions program been for your own AI adoption?" Rate 1-5. Average below 3 means something is off in the program design. Champions who score consistently below 3 in peer satisfaction are usually struggling with communication skills, not AI knowledge.
Prompt library growth. A proxy for ongoing contribution. A healthy program sees the shared prompt library grow by 5-10 new entries per month. Stagnant libraries signal that champions are coasting or have run out of ideas, both solvable with a refresh session.
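To see how these four metrics fit together, here's a minimal sketch with entirely hypothetical numbers for a 15-person team. The 20-30 point target, the 3.0 survey floor, and the 5-10 entries-per-month range all come from this section.

```python
# All figures below are hypothetical, for a 15-person team.

def usage_rate(users_3x_week: int, team_size: int) -> float:
    """Percentage of the team using AI tools at least three times per week."""
    return 100 * users_3x_week / team_size

delta = usage_rate(9, 15) - usage_rate(4, 15)
print(f"Usage rate up {delta:.0f} percentage points")  # 33 -> clears the 20-30 bar

survey = [4, 5, 3, 4, 2, 4, 5, 3, 4, 4]  # "How helpful...?" ratings, 1-5
avg = sum(survey) / len(survey)
print(f"Peer satisfaction: {avg:.1f}/5")  # 3.8 -> above the 3.0 floor

prompts_added = 7  # new library entries this month
print(f"Library growth healthy: {5 <= prompts_added <= 10}")  # True
```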
Learn More
- 90-Day Plan: From AI-Curious to AI-Fluent
- AI Tools Training Playbook for Non-Technical Teams
- Measuring AI Adoption ROI Across Your Team
- How Peer-Led AI Programs Outperform Top-Down Training
- First AI Ops Manager Hire at a 100-Person Company: How the champion role often becomes the seed of a formal AI ops function as teams scale
