B2B SaaS Lowers Google Ads CPA by 35% with Better Measurement and Experimentation
Skeleton case study showing how the Google Ads optimization workflow can be applied to a B2B lead generation client.
February 1, 2025
Key Results
- Overall Google Ads CPA down 35% (from $420 to $275)
- 1.5× more sales‑qualified leads (SQLs), with a higher demo‑to‑opportunity rate
- Non‑brand search turned from roughly breakeven into a reliable source of pipeline
Client Background
An anonymized B2B SaaS company selling a workflow tool for mid‑market teams relied primarily on inbound leads from content and referrals.
They had been investing in Google Ads for several years with:
- Monthly Google Ads spend of $35K–$45K
- A mix of brand, competitor, and non‑brand search campaigns
- Self‑serve free trial and “Request a demo” as key CTAs
Despite healthy traffic, the marketing and sales teams were struggling with:
- Rising CPAs on non‑brand search
- Inconsistent demo quality across campaigns
- Conflicting numbers between GA4 and Google Ads, undermining confidence in reporting
Challenge
Before we started, the main issues were:
- Measurement gaps – form submissions, trial sign‑ups, and demo requests were tracked inconsistently across GA4 and Google Ads
- High CPA / low ROAS – especially on non‑brand search, where many leads did not progress to opportunities
- Unclear campaign impact – the team could not confidently say which campaigns or keywords actually drove pipeline, not just form fills
- No systematic testing – bid strategies, structures, and landing pages had been changed ad hoc, without clear hypotheses or evaluation
Approach
Phase 1: Measurement & Data Audit (Week 1)
We began by aligning GA4 and Google Ads so that everyone was looking at the same story.
- Audited GA4 events and Ads conversion actions
- Discovered:
- Duplicate conversion actions in Ads still marked as primary
- GA4 events for `demo_request` and `trial_signup` firing on page load rather than on successful form submissions
- No distinction between all leads and sales‑qualified leads (SQLs)
- Redesigned the event and conversion setup to:
- Use a single, clean `lead_submit` event with parameters for lead type and source (see the client‑side sketch after this list)
- Create a secondary GA4 event for `sql` synced from the CRM to tie ads to down‑funnel quality (a server‑side sketch follows the Phase 1 deliverables)
- Import the right conversions into Google Ads (lead + SQL) and disable legacy actions
- Set up custom GA4 reports showing leads and SQLs by campaign/keyword
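To make the first fix concrete: the client‑side event now fires only after the form POST succeeds, never on page load. A minimal sketch, assuming gtag.js is already installed on the page; the `/api/leads` endpoint, the `data-lead-type` attribute, and the handler name are illustrative placeholders, not the client's actual code:

```typescript
// Fire the GA4 event only after the form POST succeeds, never on page load.
// Assumes gtag.js is installed; the /api/leads endpoint and the
// data-lead-type attribute are illustrative placeholders.
declare function gtag(...args: unknown[]): void;

async function handleLeadSubmit(form: HTMLFormElement): Promise<void> {
  const response = await fetch("/api/leads", {
    method: "POST",
    body: new FormData(form),
  });
  if (!response.ok) return; // failed submissions never count as conversions

  // One clean event; parameters distinguish lead type and source.
  gtag("event", "lead_submit", {
    lead_type: form.dataset.leadType ?? "demo_request", // or "trial_signup"
    lead_source: "website_form",
  });
}
```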
Deliverables:
- Measurement map showing click → session → lead → SQL
- Issues & fixes log documenting each tracking change
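For reference, the CRM‑to‑GA4 sync behind the `sql` event can be sketched with the GA4 Measurement Protocol, the standard mechanism for server‑side events. The measurement ID, API secret, and stored client ID below are placeholder assumptions; in practice the GA client ID must be captured at the original form submission and saved alongside the lead so the `sql` event joins back to the originating session:

```typescript
// When the CRM marks a lead as sales-qualified, send a matching GA4 "sql"
// event so ad reporting reflects down-funnel quality.
// MEASUREMENT_ID and API_SECRET are placeholders.
const MEASUREMENT_ID = "G-XXXXXXX";
const API_SECRET = "your-mp-api-secret";

async function sendSqlEvent(clientId: string, leadId: string): Promise<void> {
  const url =
    "https://www.google-analytics.com/mp/collect" +
    `?measurement_id=${MEASUREMENT_ID}&api_secret=${API_SECRET}`;

  await fetch(url, {
    method: "POST",
    body: JSON.stringify({
      client_id: clientId, // the same client ID captured on lead_submit
      events: [{ name: "sql", params: { lead_id: leadId } }],
    }),
  });
}
```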
Phase 2: Account Diagnosis & Quick Wins (Week 1–2)
With reliable data, we audited the account structure and performance.
- Reviewed campaign and ad group setup:
- Dozens of small campaigns with overlapping themes and budgets
- Broad match keywords mixed with exact/phrase in the same ad groups
- Little separation between high‑intent and research queries
- From the last 90 days of data we found:
- ~22% of spend going to keywords with zero SQLs despite occasional form fills (see the sketch after this list)
- Brand campaigns optimized to all leads, masking the true cost of qualified demos
- Geo/device segments with consistently poor performance
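The wasted‑spend finding came from joining 90 days of keyword cost with CRM outcomes. A minimal sketch of that check, where the row shape and the $500 spend threshold are illustrative assumptions:

```typescript
// Flag keywords that spend meaningfully over 90 days but never produce an SQL.
interface KeywordRow {
  keyword: string;
  cost: number;  // 90-day spend, USD
  leads: number; // raw form fills
  sqls: number;  // sales-qualified leads joined from the CRM
}

function flagZeroSqlKeywords(rows: KeywordRow[], minSpend = 500): KeywordRow[] {
  return rows
    .filter((r) => r.cost >= minSpend && r.sqls === 0)
    .sort((a, b) => b.cost - a.cost); // biggest offenders first
}
```

Summing `cost` across the flagged rows is the kind of roll‑up that surfaced the ~22% figure above.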
We implemented several quick wins:
- Paused and excluded obviously irrelevant and non‑converting queries
- Split high‑intent keywords (e.g. “{category} software”, “{category} platform”) into focused campaigns
- Tightened budgets on under‑performing regions and devices
- Changed optimization for brand campaigns to focus on SQLs instead of all leads
Deliverables:
- Account audit summary (observations → issues → recommendations)
- Prioritized quick wins list with estimated impact and risk
Phase 3: Experiment Design & Implementation (Week 3–4)
Next we designed a small set of experiments to test bigger strategic questions.
Key hypotheses:
- Switching from manual CPC to tCPA on core non‑brand campaigns would lower CPA without hurting volume.
- Separating “solution aware” keywords (e.g. “{category} software”) from “problem aware” keywords (e.g. “how to manage {problem}”) would improve both conversion rate and SQL rate.
- A more focused demo‑first landing page would improve demo request rate from high‑intent traffic.
Implementation:
- Used Google Ads Experiments to A/B test manual CPC vs tCPA on selected campaigns
- Created new campaigns for high‑intent vs problem‑aware groups with tailored ad copy and budgets
- Set up an A/B test between the existing landing page and a new version emphasizing social proof and a lower‑friction form
Success metrics:
- Primary: cost per SQL and SQL volume
- Secondary: lead conversion rate, overall CPA, and demo‑to‑opportunity rate from CRM
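When reading out these experiments, a quick way to sanity‑check whether an observed lift in SQL rate is more than noise is a two‑proportion z‑test. A rough sketch, not a full analysis: it ignores sequential peeking and assumes independent clicks:

```typescript
// Two-proportion z-test on SQL rate: control arm vs experiment arm.
function zTestSqlRate(
  controlClicks: number, controlSqls: number,
  testClicks: number, testSqls: number,
): { z: number; pValue: number } {
  const p1 = controlSqls / controlClicks;
  const p2 = testSqls / testClicks;
  const pooled = (controlSqls + testSqls) / (controlClicks + testClicks);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / controlClicks + 1 / testClicks));
  const z = (p2 - p1) / se;

  // Standard-normal CDF via the Abramowitz & Stegun 7.1.26 erf approximation.
  const phi = (x: number): number => {
    const y = Math.abs(x) / Math.SQRT2;
    const t = 1 / (1 + 0.3275911 * y);
    const erf = 1 - (t * (0.254829592 + t * (-0.284496736 + t * (1.421413741 +
      t * (-1.453152027 + t * 1.061405429))))) * Math.exp(-y * y);
    return x >= 0 ? 0.5 * (1 + erf) : 0.5 * (1 - erf);
  };

  return { z, pValue: 2 * (1 - phi(Math.abs(z))) };
}
```

For example, 150 SQLs from 5,000 control clicks against 190 SQLs from 5,000 test clicks gives z ≈ 2.2 and p ≈ 0.03, a real but not overwhelming signal.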
Deliverables:
- Experiment design docs for each test
- Experiment log summarizing configuration and expected lift
Phase 4: Results Analysis & Account Restructuring (Week 4+)
After 4–6 weeks of testing, the results were clear:
- tCPA on core non‑brand campaigns reduced cost per SQL by ~18% while maintaining volume
- High‑intent campaigns showed significantly higher SQL rate and more stable CPA compared to mixed‑intent setups
- The new landing page improved demo request rate by 23% from high‑intent traffic
Based on these findings we:
- Rolled out tCPA bidding to more non‑brand campaigns with guardrails
- Consolidated overlapping campaigns into a smaller set of clearly‑defined structures (brand / competitor / high‑intent / problem‑aware)
- Shifted budget towards high‑intent campaigns and paused consistently weak segments
- Made the new landing page the default for core demo traffic
Deliverables:
- Before/after charts for CPA, lead volume, and SQL volume
- Updated account map showing the simplified structure
Phase 5: Ongoing Optimization & Reporting
With the foundation in place, we moved to a lighter ongoing cadence.
- Weekly:
- Monitor for anomalies and make bid/budget adjustments (see the sketch after this list)
- Review search terms and add negatives where needed
- Monthly:
- Update performance summary by campaign/keyword and by lead stage
- Plan 1–2 new experiments (e.g. new messaging angles, new geos)
- Share a concise report with key insights and agreed actions
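In practice the weekly anomaly check can be as simple as flagging days whose CPA drifts well outside its trailing window. A minimal sketch, where the 28‑day window and 2σ threshold are illustrative assumptions:

```typescript
// Flag days whose CPA sits more than `threshold` standard deviations
// from the trailing `window`-day mean.
function flagCpaAnomalies(dailyCpa: number[], window = 28, threshold = 2): number[] {
  const anomalies: number[] = [];
  for (let i = window; i < dailyCpa.length; i++) {
    const slice = dailyCpa.slice(i - window, i);
    const mean = slice.reduce((s, v) => s + v, 0) / window;
    const sd = Math.sqrt(slice.reduce((s, v) => s + (v - mean) ** 2, 0) / window);
    if (sd > 0 && Math.abs(dailyCpa[i] - mean) > threshold * sd) {
      anomalies.push(i); // index of the anomalous day
    }
  }
  return anomalies;
}
```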
Deliverables:
- Example monthly report and dashboard views
- Rolling experiment roadmap snapshot
Results
Six months after the engagement started:
- Overall Google Ads CPA decreased by 35% (from $420 to $275)
- Qualified leads (SQLs) increased by 1.5×, with a higher demo‑to‑opportunity rate
- Non‑brand campaigns, previously barely breakeven, became a reliable source of pipeline
- The client gained a much clearer view of which campaigns and keywords truly drove revenue
Key Lessons
The outcome ties back to three core principles:
- Measurement first – fixing tracking changed the conversation from form fills to pipeline
- Hypothesis‑driven testing – structured experiments replaced ad‑hoc guesswork
- Continuous optimization – a simple weekly/monthly cadence prevented performance drift
Want Similar Results?
Let's discuss how these strategies can be adapted for your business.