How to Run a SaaS RFP That Doesn't Waste 6 Weeks

Key Facts: SaaS RFP Reality Check

  • Average mid-market RFP cycle: 4-6 months from kickoff to signed contract (Gartner, 2024). Lean RFPs compress this to 3 weeks.
  • Incumbent wins roughly 60-70% of RFPs where the buyer already had a vendor relationship — often because requirements were written around the incumbent's feature set.
  • Cost per RFP: $15K-$40K in loaded buyer labor (5-10 people x 40-60 hours) and an estimated $30K-$75K in vendor response cost, which gets priced back into the deal.
  • Structured vs. ad-hoc success rate: Buyers who use a weighted scorecard and structured demo script report 2.3x higher satisfaction at the 12-month mark vs. unstructured evaluations (Forrester, 2023).
  • RFP abandonment rate: ~25% of mid-market RFPs end with no purchase — usually because the process outlasted the urgency of the problem.

The need was identified in January. By February, the director had sent an RFP to seven vendors. By mid-March, six responses were in. A scoring committee of five people met three times over four weeks to evaluate the responses. Two vendors were shortlisted. Both gave demos. The two demos weren't structured the same way, so comparison was difficult. Internal debate continued. A decision was made in late April. Contract negotiation started in May. The tool went live in August.

Seven months from need identification to go-live. Forrester's research on B2B software purchasing cycles shows that drawn-out procurement processes are one of the leading causes of stalled or abandoned software projects. The cost isn't just time; it's the compounding cost of an unsolved problem.

The process wasn't incompetent. It was just designed for a different kind of organization. Enterprise procurement cycles are built for risk management at scale: multiple stakeholders, large contracts, long vendor relationships, regulatory oversight. For a 150-person company buying a $40K/year tool, this process doesn't manage risk. It creates it: the risk of slow decisions, team frustration, and the cost of a problem that went unsolved for a year.

A good SaaS RFP for a mid-market company takes three weeks, not six. Here's how to run one.

What a Lean RFP Actually Accomplishes

Let's be clear about what a SaaS RFP is and isn't for.

An RFP is for: making a defensible decision among multiple credible options, ensuring requirements are clearly defined, and creating a paper trail that justifies the selection.

An RFP is not for: discovering what you need (that's a requirements brief), running a procurement beauty contest, or creating a process so thorough that no one can second-guess the outcome. Before you write the first line of an RFP, make sure you've run the SaaS buying decision tree to confirm you should be buying at all, not building or extending an existing tool.

Most RFPs fail because they're trying to do all of the above simultaneously, and the weight of the process collapses the speed of the decision. The lean RFP separates these steps, keeps each one focused, and maintains momentum.

The 3-Tier RFP Filter

Every capability a vendor is evaluated against should be sorted into one of three tiers before the RFP goes out:

  • Must-haves: capabilities the tool cannot lack. Missing one is an automatic disqualification, no points awarded.
  • Differentiators: capabilities where vendors materially diverge in quality and where the difference changes day-to-day operator experience. These carry the bulk of the scoring weight.
  • Nice-to-haves: useful but non-decisive features that inform a tiebreaker only.

Most failed RFPs blur these tiers: treating a nice-to-have as a must-have narrows the field to the wrong winner.
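If it helps to see the filter as logic, here is a minimal sketch. The capability names and weights are hypothetical examples, not part of the article's templates:

```python
# 3-tier filter sketch: a missing must-have disqualifies outright,
# differentiators carry the scoring weight, nice-to-haves only break ties.
# All capability names and weights below are hypothetical examples.

def evaluate(vendor, must_haves, differentiators, nice_to_haves):
    """vendor maps capability -> 1-5 score; a missing key means the capability is absent."""
    # Tier 1: any missing must-have is an automatic disqualification, no points awarded
    if any(cap not in vendor for cap in must_haves):
        return None
    # Tier 2: differentiators carry the bulk of the scoring weight
    score = round(sum(w * vendor.get(cap, 0) for cap, w in differentiators.items()), 2)
    # Tier 3: nice-to-haves are tallied separately, used only as a tiebreaker
    tiebreak = sum(vendor.get(cap, 0) for cap in nice_to_haves)
    return (score, tiebreak)

must = ["bidirectional CRM sync", "SSO"]
diff = {"reporting depth": 0.6, "admin UX": 0.4}
nice = ["dark mode"]

print(evaluate({"bidirectional CRM sync": 4, "SSO": 5,
                "reporting depth": 3, "admin UX": 4}, must, diff, nice))  # (3.4, 0)
print(evaluate({"SSO": 5, "reporting depth": 5, "admin UX": 5},
               must, diff, nice))  # None: missing a must-have
```

The point of returning None rather than a low score is the same as the prose rule: a vendor missing a must-have never competes on points at all.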

Phase 1: Requirements Brief (Days 1-2)

Before any vendor hears from you, write a one-page requirements brief. Not a thirty-page RFP document: a one-page brief.

The brief answers five questions:

  1. What problem are we solving? One paragraph, no jargon, specific enough that someone unfamiliar with the team would understand the need.

  2. What are the must-have capabilities? Maximum five. These are the capabilities without which the tool doesn't work for you. Be specific: "bidirectional CRM sync" not "integrations."

  3. What are the nice-to-have capabilities? Maximum five. These inform scoring but don't disqualify vendors.

  4. What does success look like at 90 days? One or two measurable outcomes. "Response time below 2 hours for 90% of tickets" is a success metric. "Better customer service" is not.

  5. What are the constraints? Budget range (be direct), integration requirements, compliance requirements, timeline.

Template: 1-Page Requirements Brief

Project: [Tool Category] Selection — [Date]
Owner: [Name + Role]

Problem statement:
[2-3 sentences describing the current state and the cost of the problem]

Must-have capabilities (rank-ordered):
1.
2.
3.
4.
5.

Nice-to-have capabilities:
1.
2.
3.

Success metrics at 90 days:
1.
2.

Constraints:
- Budget: $[X] to $[Y] per year
- Integration requirements: [list]
- Compliance: [list]
- Timeline: live by [date]
- Decision authority: [name]

Circulate this brief to the decision-maker and one technical stakeholder for sign-off before any vendor outreach. This step prevents the most common RFP failure: requirements that shift mid-process because they were never clearly defined.

Phase 2: Vendor Shortlist (Days 3-5)

Invite three to five vendors. Not seven. Not ten.

The instinct to cast a wide net comes from fear of missing the best option. But inviting too many vendors creates four problems:

  1. Vendors give generic responses when they're competing against a large field
  2. Evaluation becomes unwieldy and comparisons become unfair
  3. The process takes longer, eroding momentum
  4. You signal low seriousness, which affects vendor engagement quality

For CRM specifically, a CRM buyers checklist can help you pre-screen vendors before they reach the formal shortlist stage, reducing the field without slowing down your process.

How to build the shortlist in 48 hours:

  • Start with Gartner Peer Insights, G2, or Capterra filtered by company size (your size) and industry
  • Check which tools your peer network uses (ask in Slack communities, LinkedIn, direct outreach to two or three peer operators)
  • Look at what the vendors you've already seen in demos actually compete against (ask the rep directly)
  • Confirm the shortlist against your must-have capabilities and remove any vendor that visibly can't meet must-have #1 or #2

Send each shortlisted vendor a copy of the requirements brief plus two specific asks:

  1. A 30-minute initial call to confirm whether they're a fit before going to formal demo stage
  2. Written answers to your five must-have questions before the call

The 30-minute screening call eliminates vendors who aren't a fit without burning time on a full demo. Any vendor who can't answer your must-have questions in writing before a call is probably not operationally ready for your requirements.

Phase 3: Structured Demo Script (Week 2)

This is the step most mid-market RFPs skip, and it's the one that most affects decision quality. Running the vendor diligence checklist in parallel with Phase 3 lets you verify security, financial health, and support SLAs while your team is evaluating features in demos.

Without a structured demo script, each vendor shows you what they're proud of. You see different features in different contexts and comparing them becomes a subjective exercise that favors whoever gave the most polished presentation.

With a structured demo script, every vendor shows you the same scenarios in the same sequence. You're comparing how different tools handle the same problem, which is actually useful.

Building the demo script:

Take your five must-have capabilities and write one scenario per capability. The scenario should describe a real workflow from your business, not a generic use case.

Bad scenario: "Show us how your reporting works." Good scenario: "A sales manager needs to see, by rep, which accounts moved from Qualified to Proposal in the last 30 days, with the associated ARR. Walk us through how to build and share that view."

Include two or three scenarios from your nice-to-have list as bonus items if time permits.

15-Question Demo Script Template:

Send this to vendors the day before the demo. Tell them to walk through your scenarios, not their standard demo flow.

  1. Walk through [Must-Have Scenario 1] from setup to output.
  2. Walk through [Must-Have Scenario 2] end-to-end.
  3. Walk through [Must-Have Scenario 3] including how an error or edge case is handled.
  4. Walk through [Must-Have Scenario 4] including permissions and access control.
  5. Walk through [Must-Have Scenario 5].
  6. Show us the integration with [CRM/HRIS], specifically how [specific data object] syncs bidirectionally.
  7. Show us what happens when the integration breaks: how is the error surfaced and resolved?
  8. Show us the admin dashboard, specifically how a new user is provisioned and removed.
  9. Show us the reporting view that [specific role] would use daily: how is it accessed and customized?
  10. Show us a scenario where something goes wrong (a failed import, a permission error, a sync conflict) and how the system handles it.
  11. Walk us through the implementation process for a company our size. What does week one look like?
  12. What's on your roadmap for the next 90 days that's relevant to our use case?
  13. What do customers your size typically struggle with in the first 60 days?
  14. What's your SLA for a P1 outage and can you show us where that's documented?
  15. If we signed today and went live in 30 days, what would you need from us and what would you deliver?

Allocate 60-75 minutes per vendor. Keep your own notes on a scoring sheet (see below) while the demo runs.

Phase 4: Scorecard Evaluation and Decision (Week 3)

After demos are complete, score each vendor on a weighted criteria matrix before any group discussion. This prevents anchoring bias, where group discussion converges on the most-discussed or most-recently-seen option rather than the best one. Research from Harvard Business Review on group decision-making shows that individual scoring before group discussion consistently produces more accurate assessments than open deliberation without a structured framework.

Weighted Vendor Scoring Matrix:

Criteria                       Weight   Vendor A   Vendor B   Vendor C
Must-have #1: [capability]       20%      1-5        1-5        1-5
Must-have #2: [capability]       20%      1-5        1-5        1-5
Must-have #3: [capability]       15%      1-5        1-5        1-5
Must-have #4: [capability]       15%      1-5        1-5        1-5
Must-have #5: [capability]       10%      1-5        1-5        1-5
Integration readiness            10%      1-5        1-5        1-5
Support and SLA quality           5%      1-5        1-5        1-5
Implementation track record       3%      1-5        1-5        1-5
Nice-to-have #1                   1%      1-5        1-5        1-5
Nice-to-have #2                   1%      1-5        1-5        1-5
Weighted Total                  100%
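The arithmetic is simple: a vendor's weighted total is the sum of weight times score across criteria. A minimal sketch, with the weights from the matrix and invented example scores:

```python
# Weighted scorecard sketch: each evaluator rates every vendor 1-5 per
# criterion, independently, before the group meeting. Scores are invented.

WEIGHTS = {
    "Must-have #1": 0.20, "Must-have #2": 0.20,
    "Must-have #3": 0.15, "Must-have #4": 0.15,
    "Must-have #5": 0.10, "Integration readiness": 0.10,
    "Support and SLA quality": 0.05, "Implementation track record": 0.03,
    "Nice-to-have #1": 0.01, "Nice-to-have #2": 0.01,
}
assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # weights must total 100%

def weighted_total(scores):
    """scores maps each criterion to this evaluator's 1-5 rating."""
    return round(sum(WEIGHTS[c] * scores[c] for c in WEIGHTS), 2)

# One evaluator's independent pass over two vendors
vendor_a = {c: 4 for c in WEIGHTS}  # a flat 4 on everything
vendor_b = {**vendor_a, "Must-have #1": 5, "Support and SLA quality": 2}

print(weighted_total(vendor_a))  # 4.0
print(weighted_total(vendor_b))  # 4.1
```

Note how the weighting does the work: a one-point gain on a 20% must-have outweighs a two-point drop on a 5% support criterion, which is exactly the behavior you want from the tiering.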

Each evaluator scores independently before a group meeting. The group meeting reviews the scores, discusses significant variances (where evaluators scored the same vendor differently), and makes the decision.

Assign a single tiebreaker (the decision-maker) before the group meeting starts. If scores are close and discussion doesn't resolve it, the decision-maker decides.

Common Pitfalls

Inviting too many vendors. Three to five is optimal. Beyond five, the process becomes a sorting problem, not an evaluation.

Writing requirements that favor the incumbent. This happens when the person writing the requirements brief has already seen one vendor's demo and unconsciously mirrors its features as "requirements." Circulate the brief to someone who hasn't seen any demos yet.

Skipping the structured demo script. Free-form demos are presentations, not evaluations. The structured script is what makes comparison possible.

Evaluation by committee without a tiebreaker. Committees without designated decision authority produce consensus-toward-the-middle outcomes, not best outcomes. Name the tiebreaker upfront.

Starting negotiation before the decision is final. Negotiating with multiple vendors simultaneously is appropriate only if you're genuinely undecided. If the decision is made and you're negotiating for better terms before signing, say so. That's a different conversation.

Decision Memo Template

Before signing, document the decision. Not a lengthy report: a one-page memo that captures the key facts for the record. McKinsey's analysis of high-performing procurement organizations identifies decision documentation as one of the most consistent practices separating teams that successfully rationalize software spend from those that don't.

Decision: [Tool name] selected for [use case]
Date: [Date]
Decision maker: [Name]
Evaluators: [Names]

Vendors evaluated: [List]
Selection rationale: [2-3 sentences on why this vendor]
Key terms: $[X]/year, [Y] seats, [Z]-year term
Implementation timeline: [start] to [go-live date]
Success metrics at 90 days: [from requirements brief]
Review date: [90 days from go-live]

Alternatives considered:
- [Vendor B]: [1 sentence on why not selected]
- [Vendor C]: [1 sentence on why not selected]

This memo takes fifteen minutes to write. It creates a record that's useful at the 90-day review, at renewal time, and if the decision needs to be defended internally.

Running the RFP in Rework Work Ops

Most mid-market teams try to run an RFP out of a shared spreadsheet and a Slack channel, and the process quietly collapses because there's no single surface that holds the requirements brief, the vendor shortlist, the scorecards, and the cross-functional reviewer input. Rework Work Ops (from $6/user/month) gives the RFP owner a dedicated kanban to run the process end-to-end: a board with columns for Shortlist, Screening Call, Demo Scheduled, Scorecard Pending, Finalist, and Decision, with each vendor as a card that holds the written responses, demo recording links, scorecard attachments, and reviewer sign-offs.

Because scoring is the most failure-prone step, Rework lets you build a structured scorecard template once and apply it to every vendor card, so evaluators score independently before the group meeting — no anchoring bias from Slack threads. Cross-functional reviewers (IT, security, finance, the end-user team) get tagged directly on the vendor card rather than routed through a separate approval chain, which is usually where mid-market RFPs lose their second and third weeks.

FAQ

Frequently Asked Questions About Running a SaaS RFP

When should a mid-market SaaS buyer run an RFP vs. just buy?

Run an RFP when the contract is over $25K/year, when the tool touches multiple teams, or when switching costs will be high (CRM, HRIS, core finance). For tools under $15K/year that are team-specific and easy to rip out, a 2-vendor bake-off or a 30-day trial of your top pick is faster and produces a similar-quality decision. RFPs add process cost — apply them where that cost is justified by contract size and switching risk.

How many vendors should be in an RFP?

Three to five. Below three, you don't have real comparison; above five, evaluation becomes unwieldy and vendors give generic responses because they know they're one of many. Five is the practical ceiling for a mid-market team running evaluations in parallel with their day jobs.

How long should an RFP cycle take?

Three weeks for a mid-market SaaS RFP (one week per phase: requirements + shortlist, demos, scoring + decision). Anything past six weeks indicates the process is running the team rather than the team running the process — and at that point, momentum and stakeholder attention start eroding faster than information is accumulating.

Should pricing be scored in the RFP?

No — pricing should be a separate negotiation after the technical decision is made. Scoring pricing alongside capability biases the evaluation toward cheaper-but-weaker vendors and anchors your negotiating position at list price. Keep the scorecard capability-focused, pick the winner, then negotiate. If a vendor is technically strong but priced 30% above your band, that's a negotiation problem, not a scoring problem.

What are RFP red flags from a vendor's response?

(1) Generic answers to must-have questions that don't reference your specific use case; (2) a written response that contradicts what the rep said on the screening call; (3) refusal to commit to an implementation timeline or SLA in writing; (4) an "enterprise tier" pitch when you asked for mid-market pricing; (5) no named implementation manager or customer success contact for companies your size; (6) demo environments that crash or lag on your scenarios.

What's the biggest mistake mid-market buyers make with RFPs?

Writing requirements that unconsciously mirror the incumbent's feature set. This is why the incumbent wins 60-70% of RFPs — the requirements brief was built by someone who already knew one tool deeply, so the "must-haves" read like a checklist of that tool's features. Fix: have someone who has never seen any vendor's demo review and challenge the requirements brief before it goes out. If three of your five must-haves are things only the incumbent does, the RFP was decided before it started.

Do we need a formal RFP if we already know which vendor we want?

If you're 90%+ sure and the contract is under $40K/year, a lightweight 2-vendor bake-off with a structured demo is usually enough — you're validating the gut call, not discovering new options. Run a full RFP when you owe the decision to stakeholders who weren't in early discovery, when the contract is over $50K/year, or when the decision needs a defensible paper trail for the board, audit, or future renewal justification.
