Paid Ads Tools and Tech Stack: The Real 7 You Need (and the 4 Doing All the Work)

Most paid stacks I inherit are 11 tools where 4 do the real work. The other 7 are zombie subscriptions: a free trial nobody cancelled, a vendor pitch the previous manager couldn't say no to, an "AI optimization" tier someone bought during a 2024 Q3 panic and then forgot about.

I've audited stacks at three different B2B SaaS companies in the last 18 months. Same pattern every time. Six-figure annual tooling spend, half of it wasted, and the paid manager is still pulling reports in a Google Sheet at 11pm on Sunday because nothing actually closes the loop from ad click to closed-won.

This is the playbook I run when I either inherit one of those stacks or get told to build one from scratch. It ends with fewer logins, clearer attribution, and a budget line your CFO won't try to cut next quarter.

The "Core 7" — what actually matters for B2B SaaS

I think about a paid stack as seven categories, not 11 tools. Most of them you only need one product per category. A few you don't need at all until you cross a spend threshold. Here's the map.

| Category | What it does | When you need it |
| --- | --- | --- |
| Ad platforms | Buy the media | Day one |
| Conversion tracking | Get clean signal back to platforms | Day one |
| Attribution | Stitch ad spend to pipeline and revenue | $20K/mo+ spend or B2B sales cycle >30 days |
| Creative tooling | Produce ad variants at speed | Day one (Figma); Pencil/Smartly above $50K/mo |
| Bid management | Automate routine optimizations | $50K/mo+ Google spend |
| Competitive intel | See what competitors are running | Optional, pick one |
| Reporting | Translate spend into pipeline language | Day one |

Four of these categories (ad platforms, tracking, attribution, and reporting tied to your CRM) are where the real work happens. The rest are leverage tools. Useful at scale, dead weight under it. Let's go through each.

1. Ad platforms — and the honest B2B truth about which ones matter

Every "ultimate B2B paid stack" post lists Google, LinkedIn, Meta, Reddit, X, and TikTok like they're equally important. They're not.

For B2B SaaS, here's how I rank them:

  • Google Ads: non-negotiable. Search captures intent. If a buyer is searching "Salesforce alternatives," you need to be there. Run it.
  • LinkedIn Ads: non-negotiable above ~$10K ACV. Most expensive CPM in B2B, but the only place where you can target by job title, company size, and tech stack with surgical precision. Pay the tax.
  • Meta (Facebook/Instagram): conditional. Works if your ICP has a strong personal-life overlap (founders, marketers, designers). Doesn't work for finance, IT, or procurement buyers who don't engage with ads on Facebook. Test before you commit a quarter to it.
  • Reddit: the sleeper for technical buyers. Devtools, infra, dev productivity, security. Reddit's contextual targeting on subreddits like r/devops or r/sysadmin is legitimately undervalued. Cheap CPMs, engaged audiences, but creative has to fit the platform (no logo-in-the-corner stock photo banners).
  • X (formerly Twitter): mostly vibes. Real for founder-led brands selling to other founders. Real-ish for crypto and AI tooling. Otherwise, skip it. The reporting is bad, the audience targeting is worse, and the platform itself is unstable.
  • TikTok: hard pass for B2B SaaS unless you're explicitly targeting under-30 SMB owners. Don't let a "we should test TikTok" conversation become a $20K experiment that proves nothing.

Three platforms get my budget by default: Google, LinkedIn, Reddit. The other three get tested with a fixed budget and a 30-day kill date.

2. Conversion tracking — GTM, server-side GTM, Stape

If your conversion data is wrong, every other tool in the stack is making decisions on garbage. This is the category most teams under-invest in and then wonder why their CPL "drifts" 40% month over month.

The 2026 reality: iOS privacy, ITP, ad blockers, and cookie deprecation have eaten somewhere between 15% and 40% of client-side conversion signal depending on your audience. If you're still running everything off basic Google Tag Manager with no server-side layer, your platform algorithms are optimizing on bad data.

The minimum viable tracking stack:

  • Google Tag Manager (client-side): free. Run it. Standard for triggering pixels and basic events.
  • Server-side GTM: the upgrade. Sends events from your server to the platforms via Meta's Conversions API, Google's Enhanced Conversions, and LinkedIn's Conversions API. Recovers a meaningful chunk of the lost signal.
  • Stape: managed hosting for server-side GTM. About $20-100/month depending on event volume. The alternative is running it yourself on Google Cloud Platform, which costs less in pure infra but requires a developer to babysit it. For a solo paid manager, Stape is worth every cent. You configure tags, they handle the infrastructure.
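To make the server-side piece concrete, here's a minimal sketch of what a single server-side event looks like in Meta Conversions API shape. The event fields follow Meta's documented format, but the email, event name, and value are placeholders; actually sending it is a POST to the pixel's `/events` endpoint with an access token, which is omitted here. The step teams most often get wrong is the identifier hashing: platforms expect SHA-256 of the lowercased, whitespace-trimmed email.

```python
import hashlib
import json
import time

def normalize_and_hash(email: str) -> str:
    """SHA-256 of the normalized (trimmed, lowercased) email --
    the format Meta CAPI and Google Enhanced Conversions expect."""
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

def build_capi_event(email: str, event_name: str, value: float,
                     currency: str = "USD") -> dict:
    """Build one event in Meta Conversions API shape.
    Placeholder values; transport (POST + access token) not shown."""
    return {
        "event_name": event_name,
        "event_time": int(time.time()),
        "action_source": "website",
        "user_data": {"em": [normalize_and_hash(email)]},
        "custom_data": {"value": value, "currency": currency},
    }

payload = {"data": [build_capi_event(" Jane.Doe@Example.com ", "Lead", 0.0)]}
print(json.dumps(payload, indent=2))
```

Note that the two differently-formatted emails below hash to the same value once normalized; if your server skips that normalization, match rates crater and the recovered signal never materializes.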

Some teams will tell you they don't need server-side GTM because their dev team built a custom data layer. That's fine if it's actually working. Audit it. Pull the conversion API logs for the last 30 days and compare to what's hitting your ad platforms. If the numbers don't reconcile, the custom build is broken and you need server-side GTM as a backup.
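The reconciliation check above can be reduced to a few lines. The counts and the 10% tolerance here are illustrative assumptions, not a standard; pick a threshold that matches your volume.

```python
def reconcile(server_events: int, platform_reported: int,
              tolerance: float = 0.10) -> bool:
    """True when platform-reported conversions are within `tolerance`
    of your own server-side event logs for the same window."""
    if server_events == 0:
        return platform_reported == 0
    drift = abs(server_events - platform_reported) / server_events
    return drift <= tolerance

# Hypothetical 30-day counts: 412 events in our logs, 355 on the platform.
# That's ~14% drift, which fails a 10% tolerance.
print(reconcile(412, 355))
```

If this check fails month after month, the custom data layer is the broken piece, not the platforms.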

3. Attribution — Dreamdata or HockeyStack (NOT Triple Whale)

This is the category where I see the most expensive mistakes. A paid manager hears about Triple Whale at a conference, sees the dashboards, books a demo, and wastes three weeks discovering it's a DTC product. Triple Whale is genuinely excellent for Shopify ecommerce. It's not built for B2B sales cycles that span 60-180 days, multiple stakeholders, and CRM-stitched pipeline data.

For B2B SaaS, the two real options:

| Tool | Best for | Strengths | Watch-outs |
| --- | --- | --- | --- |
| Dreamdata | Mid-market and enterprise B2B with HubSpot or Salesforce | Strong CRM integration, multi-touch attribution out of the box, decent self-serve onboarding | Pricing scales with revenue; can get expensive past Series B |
| HockeyStack | Mid-market B2B with strong web analytics needs | Visual attribution paths, account-level analytics, friendlier UI | Younger product; some integrations less mature than Dreamdata's |
| Triple Whale | DTC ecommerce on Shopify | Best-in-class for Shopify; pixel + ad platform unification | Wrong tool for B2B. Don't book the demo. |

Pick one. Don't pilot both. Both Dreamdata and HockeyStack take 4-6 weeks to set up properly, and you only have the bandwidth to do one of those projects right.

Below ~$20K/month in ad spend, attribution platforms are usually overkill. Use UTMs, push them into your CRM via hidden form fields, and build a Looker Studio dashboard that joins ad spend to closed-won. That gets you 80% of the answer for $0/month.
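The $0 version of that join is simple enough to sketch. The campaign names, deal stages, and amounts below are hypothetical; in practice the spend side comes from platform exports and the deal side from a CRM export keyed on the `utm_campaign` captured via hidden form fields.

```python
from collections import defaultdict

# Hypothetical exports: ad spend by UTM campaign, and CRM deals
# that carried the utm_campaign through hidden form fields.
spend = {"q2_linkedin_abm": 12000.0, "q2_google_brand": 4000.0}
deals = [
    {"utm_campaign": "q2_linkedin_abm", "stage": "closed_won", "amount": 30000.0},
    {"utm_campaign": "q2_linkedin_abm", "stage": "open", "amount": 25000.0},
    {"utm_campaign": "q2_google_brand", "stage": "closed_won", "amount": 18000.0},
]

# Sum closed-won revenue per campaign, then join against spend.
won = defaultdict(float)
for d in deals:
    if d["stage"] == "closed_won":
        won[d["utm_campaign"]] += d["amount"]

for campaign, cost in spend.items():
    revenue = won.get(campaign, 0.0)
    roas = revenue / cost if cost else 0.0
    print(f"{campaign}: spend ${cost:,.0f}, closed-won ${revenue:,.0f}, ROAS {roas:.1f}x")
```

This is the same join a Looker Studio blend does; the point is that nothing about it requires an attribution platform at this spend level.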

4. Creative tooling — Figma, Pencil, Smartly

Three tiers based on volume and team size:

  • Figma ($15/editor/month): the design layer. Every team should have it. Build templates, version variants, hand off to whoever runs the ad uploads. If you're a solo paid manager without a designer, Figma + the right templates gets you 80% there.
  • Pencil (Brandtech Group, ~$120-300/month per seat depending on plan): AI creative variant generation. Feed it a brief and brand assets, get back static and video ad variants. Worth it when you're testing 30+ creative variants per month and don't have a creative team. Below that, you'll burn the subscription on novelty and never use it.
  • Smartly (enterprise, custom pricing usually $2K-10K+/month): full creative ops platform. DCO at scale, automated creative production, ad management across platforms. Only makes sense above ~$200K/month spend with a dedicated creative ops person. Don't even take the demo at IC scale.

For 90% of paid managers reading this: Figma is the answer. Pencil if you're testing volume. Skip Smartly unless someone above your pay grade has already decided you need it.

5. Bid management — Optmyzr, Adalysis

This is the category where ROI math is easy. If a tool saves you 5 hours a week and helps you find 10% more efficiency on Google spend, it pays for itself above ~$50K/month Google budget.

  • Optmyzr ($249-499/month): Google-heavy. Strong on bid scripts, anomaly detection, search query reports, and Performance Max insights. The "I have 47 Google campaigns and need to manage them in less than 4 hours a week" tool.
  • Adalysis ($99-499/month): audit-and-experiment focused. Better for systematic A/B testing on Google and Microsoft Ads. Catches things humans miss (broken ads, disapproved assets, drift in match types).

Below $50K/month Google spend, neither is worth it; Google's native automation and a few well-placed alerts get you most of the way. Above $100K/month, run one of them. Above $250K/month, run Optmyzr and use Adalysis as the experiment and audit layer.
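The break-even arithmetic behind those thresholds fits in one function. The 1% figure below is illustrative, not a vendor claim: it's a deliberately conservative recovery rate that still lands near the $50K threshold above.

```python
def breakeven_spend(tool_cost_monthly: float, efficiency_pct: float) -> float:
    """Monthly ad spend at which a tool's subscription equals the media
    it recovers. efficiency_pct is percent of spend recovered (1 = 1%)."""
    return tool_cost_monthly * 100 / efficiency_pct

# Illustrative: a $499/mo tool recovering even 1% of wasted spend
# breaks even around $50K/mo of managed budget.
print(breakeven_spend(499, 1))  # -> 49900.0
```

Run the same math on any tool pitch: if the break-even spend is above your actual budget, the demo is a waste of an hour.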

LinkedIn doesn't have a great bid management ecosystem. Native tooling and good naming conventions cover most of it.

6. Competitive intel — SimilarWeb or AdBeat (pick one)

Useful for landing-page benchmarking, traffic share, and seeing what creative competitors are running. Not useful for actual day-to-day decisions.

  • SimilarWeb (~$200-15,000/year depending on plan): traffic, channel mix, top pages. Best for understanding where competitors are getting volume.
  • AdBeat (~$249-499/month): display and native creative library. Best for "what creative angles are competitors testing on display?"

Pick one based on where your spend goes. Heavy display/native? AdBeat. Heavy search and want benchmarking? SimilarWeb. Don't run both. The novelty wears off in three weeks and you'll never log in again.

7. Reporting — Looker Studio + GA4 + CRM

The boring answer. The right answer.

  • Looker Studio: free. Connects to GA4, Google Ads, LinkedIn (via connector), Sheets, and most CRMs. Build one executive dashboard, one paid manager dashboard, one campaign-level dashboard. Done.
  • GA4: free. Use it as the web analytics backbone. Hate the UI all you want; the data model is solid and it integrates with everything.
  • CRM (HubSpot, Salesforce, or Rework): the ground truth for revenue. Without it in the loop, you're optimizing for form-fills, not pipeline.

Most paid managers think they need a $1,500/month BI tool. They don't. Looker Studio with a clean data model and three or four core dashboards covers the IC reporting use case completely. Save the BI budget for the day someone asks for a report Looker Studio genuinely can't build.

Closing the ad-to-pipeline loop (the part most stacks miss)

Here's the gap I see most often: the paid stack stops at the form fill. The CRM stack starts at the form fill. Nothing closes the loop back to the ad platform with "this lead became pipeline" or "this lead closed."

Without that loop, every conversion you optimize against is a proxy. You're telling Google to find you more people who fill out forms, not more people who become customers. The two audiences are wildly different, and the gap shows up six months later when CAC creeps up and pipeline-to-spend ratio quietly halves.

The fix is sending qualified-lead and closed-won events back to the platforms as offline conversions. Google calls it Enhanced Conversions for Leads, LinkedIn calls it Conversions API, Meta calls it CAPI. The technical mechanism is similar across platforms: pass an email or hashed identifier and a value, the platform credits the originating click, and the algorithm starts learning on revenue instead of forms.
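For the Google side, the offline conversion upload is literally a CSV. Here's a sketch of generating one from CRM rows; the deal data and click IDs are hypothetical, and the header follows Google's offline conversion import template as I know it, so verify the exact column names and time format against the current template before uploading.

```python
import csv
import io

# Hypothetical CRM export: closed-won deals with the Google click ID
# (gclid) captured at form fill and preserved through the pipeline.
deals = [
    {"gclid": "EAIaIQ_example1", "amount": 30000.0, "closed": "2026-01-15 14:30:00"},
    {"gclid": "EAIaIQ_example2", "amount": 18000.0, "closed": "2026-01-20 09:05:00"},
]

buf = io.StringIO()
writer = csv.writer(buf)
# Column names per Google's import template (verify before uploading).
writer.writerow(["Google Click ID", "Conversion Name", "Conversion Time",
                 "Conversion Value", "Conversion Currency"])
for d in deals:
    writer.writerow([d["gclid"], "closed_won", d["closed"], d["amount"], "USD"])

print(buf.getvalue())
```

The hard part is never the CSV; it's making sure the gclid survives from landing page to closed-won without a RevOps ticket.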

The tooling friction here is real. If you're on Salesforce with full RevOps support, you can usually beg, bribe, or wait six weeks for the integration to get built. If you're on a leaner setup, you need a CRM that lets a paid manager push qualification and closed-won events without filing a ticket. Rework is the one I run when I want that loop closed without RevOps as a dependency. CRM/Sales Ops from $12/user/month, Work Ops from $6/user/month. Native webhooks per pipeline stage, lead source preserved end to end, and the offline conversion exports plug directly into Google Ads and LinkedIn without an integration project.

The point isn't the tool. The point is: if your CRM can't push pipeline events back to your ad platforms in under 15 minutes of setup, your stack has a hole. Fix it.

The 30-Day Stack Audit

Run this when you inherit a stack or quarterly when you own one. Block four hours a week.

Week 1 — Inventory and cost

List every tool. Every login. Every annual contract. Every monthly auto-renew. Pull the credit card statement and the AP system.

For each: tool name, category, monthly cost, annual cost, contract end date, primary user, business outcome it's supposed to deliver.

You will find at least three tools you didn't know you were paying for. Guaranteed.
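Once the inventory exists as structured rows, the week-1 rollup is trivial to automate. Everything below is hypothetical sample data; the useful outputs are the annualized total, the tools with no named owner, and the next renewal date you have to decide on.

```python
from datetime import date

# Hypothetical inventory rows from the week-1 audit.
tools = [
    {"name": "Optmyzr", "category": "bid management", "monthly": 499,
     "renews": date(2026, 6, 1), "owner": "paid manager"},
    {"name": "AI Insights Pro", "category": "reporting", "monthly": 800,
     "renews": date(2026, 3, 15), "owner": None},
    {"name": "Stape", "category": "tracking", "monthly": 40,
     "renews": date(2026, 9, 1), "owner": "paid manager"},
]

annual_total = sum(t["monthly"] * 12 for t in tools)
orphans = [t["name"] for t in tools if t["owner"] is None]
next_renewal = min(tools, key=lambda t: t["renews"])

print(f"Annual tooling spend: ${annual_total:,}")
print(f"No named owner (kill candidates): {orphans}")
print(f"Next renewal to decide on: {next_renewal['name']}")
```

A tool with no named owner in this table is almost always one of the zombie subscriptions from the intro.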

Week 2 — Usage logs

For every tool, pull usage data for the last 30 days. Most SaaS admin panels show this. Specifically:

  • Who logged in?
  • How often?
  • What did they do (last action timestamp)?

If a tool has zero logins in 30 days, it's a kill candidate. If it has logins from one person who uses it for one report a month, it's a consolidation candidate.

Week 3 — Overlap map

Map tools to the Core 7 categories. Two tools claiming the same job? One has to go. Common overlaps I see:

  • Two attribution tools running in parallel because nobody killed the legacy one
  • Three creative tools (Canva, Figma, Pencil) where one would be enough
  • A standalone bid management tool plus a "platform optimization" feature on an attribution tool plus Google's native automation, three layers of bid logic fighting each other

Week 4 — Cut list and consolidation plan

Build a kill list. For each tool: keep, consolidate (folded into another tool), or cut. Write a one-line justification for each.

For cuts: cancel before the next renewal date, document the workflow it replaced, and assign whoever owns the replacement.

Bring the list to your manager or the CFO. The conversation goes: "I'm cutting $X/month in tooling, here's what we're keeping, here's why." That conversation makes you look like an operator, not a cost center.

When to consolidate vs. when to specialize

A heuristic I use:

  • Under $30K/month total ad spend: consolidate ruthlessly. One ad platform manager dashboard, one attribution tool (or none), one creative tool, one reporting layer. You don't have the spend to justify specialization, and the time you'd save with specialized tools you'd lose to managing the integrations between them.
  • $30K-$150K/month: the trap zone. This is where teams over-buy. Vendors target this segment hardest because budgets are real but governance is loose. Be defensive. Add a tool only when you can write down the specific workflow it unblocks and the hours it saves.
  • Over $150K/month: specialize. Bid management tools earn out, dedicated attribution platforms earn out, creative ops tools earn out. Build the specialized stack and dedicate someone to running each layer.

The trap zone is the one to watch for. The pitch decks are seductive, the demos are slick, and the savings math always assumes perfect adoption. Half the tools in the trap zone are bought to solve a workflow that doesn't exist yet, and a year later they're zombie subscriptions in someone else's audit.

A note on AI tool fatigue

In 2026, every vendor has shipped an "AI optimization" tier. Most of it is wrappers: a GPT call dressed up as a feature, sold for an extra $400/month.

Three questions before you add any AI tool to the stack:

  1. Does it replace a specific human task that's currently costing real hours?
  2. Does it save more than 2 hours per week, measured?
  3. Does it integrate with your existing data, or does it require manual exports to work?

If the answer to any of those is no, skip it. The opportunity cost of evaluating ten AI tools and adopting none of them is lower than the opportunity cost of adopting three and managing the integrations badly.

The AI tools that have actually earned a place in my stack: Pencil (creative variant generation, real time savings), and the AI features inside tools I already pay for (GA4 anomaly detection, LinkedIn audience expansion, Optmyzr's anomaly alerts). The ones I've cut: every standalone "AI ad copy generator" I've ever tested, two "AI bid optimizers" that fought with Google's native automation, and one "AI insights" tool that cost $800/month and produced screenshots I could have made in Looker Studio.

The stack you can explain to your CFO in five minutes

Here's the test: can you explain your full paid stack to your CFO in five minutes, with cost, category, and one-sentence justification per tool?

If yes, you have a defensible stack. If you start hedging, listing edge cases, or can't remember what one of the tools does, you have an audit problem.

A stack that survives the next budget cycle has these properties: every tool maps to one of the Core 7 categories, every category has exactly one primary tool, every tool has a named owner, and the cost-to-spend ratio is under 5%.

The 11-tool stack with overlapping coverage and zombie subscriptions doesn't survive contact with a CFO who's looking for $200K to cut. The Core 7 stack does.

Cut what isn't working. Keep what is. Don't fall for the next AI tier. Run the audit quarterly.
