Sales & Conversion

Why Ad Performance Analytics Lies (And What I Track Instead)


Persona: Ecommerce
Time to ROI: Short-term (< 3 months)

Two months ago, I was celebrating what looked like an incredible Facebook Ads success story. The dashboard showed a 9.2 ROAS for my e-commerce client - the kind of numbers that make agency owners screenshot their reports and post them on LinkedIn.

Then I dug deeper into the actual revenue data. The real ROAS? Closer to 2.5. The "incredible performance" was actually Facebook claiming credit for organic SEO traffic I'd been building for months.

This isn't just a Facebook problem - it's an industry-wide issue where ad platforms are designed to make their performance look better than reality. After working with dozens of e-commerce stores and watching them make budget decisions based on misleading data, I developed a completely different approach to ad performance analytics.

Here's what you'll learn from my experiments with attribution modeling across multiple client accounts:

  • Why platform-reported ROAS misleads most businesses into wasting budget

  • The dark funnel reality that attribution models completely miss

  • My 4-layer tracking system that reveals actual ad impact

  • Simple metrics that predict profitability better than ROAS

  • When to kill campaigns despite "good" platform metrics

Let's dive into what I learned from tracking real e-commerce performance beyond the vanity metrics.

Industry Reality

What every marketer has been told about tracking ad performance

Walk into any digital marketing agency or open any advertising course, and you'll hear the same gospel about ad performance analytics. The industry has built an entire mythology around tracking that sounds sophisticated but falls apart in practice.

The Standard Attribution Playbook:

  • Platform ROAS is your north star - Facebook and Google dashboards tell the truth

  • Last-click attribution works - whoever gets the final touch gets the credit

  • UTM parameters solve everything - just tag your campaigns properly

  • Multi-touch attribution is the future - sophisticated models will save us

  • Real-time optimization matters - adjust campaigns based on daily performance

This conventional wisdom exists because it's what ad platforms want you to believe. Facebook, Google, and TikTok all have financial incentives to make their performance look as good as possible. They've built increasingly sophisticated attribution models that claim credit for conversions they had minimal impact on.

The problem? Modern customer journeys are messy. Someone might see your Facebook ad, ignore it, search for your brand three days later, read a blog post, check reviews, and finally buy through an organic Google search. Which channel "converted" them?

According to platform analytics, it was the organic search. According to reality, the Facebook ad started the entire journey. This is where traditional attribution completely breaks down, leading to budget allocation disasters that I've seen destroy profitable campaigns.

The result is what I call "attribution theater" - lots of sophisticated tracking that provides the illusion of precision while making fundamentally wrong decisions about where to spend money.

Who am I

Consider me your business accomplice.

Seven years of freelance experience working with SaaS and e-commerce brands.

The wake-up call came during a project with an e-commerce client selling outdoor gear. They had been running Facebook Ads for eight months with what looked like incredible performance - their Facebook dashboard consistently showed 4-6 ROAS, well above their target of 3.0.

But something felt off when I started working with them. Despite these "amazing" Facebook results, their overall revenue growth was stagnant. The founder was frustrated because he was pumping more money into ads but not seeing proportional business growth.

I decided to run a three-month experiment. Instead of relying on platform analytics, I would track every possible data point to understand what was actually driving conversions.

The Challenge: Attribution Chaos

Within the first week, I discovered the problem. Facebook was claiming credit for conversions that were clearly coming from other sources. A customer would see a Facebook ad on Monday, search "outdoor gear reviews" on Google Tuesday, land on an organic blog post Wednesday, and buy on Thursday. Facebook's attribution window meant they claimed the conversion, even though the real driver was the organic content strategy I'd been building.

The client had been increasing Facebook ad spend based on these inflated metrics, while the actual conversion driver - SEO content - was being completely ignored in budget allocation decisions.

This wasn't just a Facebook problem. When I audited their Google Ads account, I found similar issues. Google was claiming credit for branded searches that were likely triggered by offline conversations, word-of-mouth, or other unmeasurable touchpoints.

The traditional analytics setup was creating a feedback loop of bad decisions: increase spend on platforms that looked good in dashboards while starving the channels that were actually working.

My experiments

Here's my playbook

What I ended up doing and the results.

After seeing attribution fail spectacularly across multiple client accounts, I developed what I call the "Reality Check Framework" - a 4-layer system that cuts through platform BS to show actual ad impact.

Layer 1: Platform Skepticism

Instead of trusting platform metrics, I started treating them as "maximum possible impact." If Facebook claims 6 ROAS, the real impact is probably 2-4 ROAS. This skeptical baseline prevents over-investment in platforms that inflate their own performance.

I implemented a simple rule: never make budget decisions based solely on platform-reported metrics. Every optimization requires validation from at least two other data sources.
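
To make that skepticism concrete, here's a minimal sketch of the kind of first-pass check I mean, with made-up numbers rather than client data: add up the revenue every platform claims for itself, compare it to the new-customer revenue your store actually recorded, and use that ratio to deflate each dashboard's ROAS before it touches a budget decision.

```python
# First-pass deflation of platform-reported ROAS.
# All figures are made up for illustration -- plug in your own numbers.

platform_claims = {          # revenue each dashboard attributes to itself
    "facebook": 46_000,
    "google":   31_000,
    "tiktok":   12_000,
}
ad_spend = {
    "facebook": 5_000,
    "google":   10_000,
    "tiktok":   4_000,
}
actual_new_customer_revenue = 52_000   # from your store's order data, all channels

claimed_total = sum(platform_claims.values())
inflation = claimed_total / actual_new_customer_revenue   # > 1 means over-attribution

for channel, claimed in platform_claims.items():
    reported_roas = claimed / ad_spend[channel]
    deflated_roas = reported_roas / inflation   # crude ceiling, not the real number
    print(f"{channel}: reported ROAS {reported_roas:.1f}, deflated ROAS {deflated_roas:.1f}")
```

The deflated figure is still only a ceiling - the incrementality tests below are what tell you how much of it is real.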

Layer 2: Revenue Cohort Analysis

The breakthrough came when I started tracking revenue by customer acquisition date rather than campaign performance. I created weekly cohorts of new customers and tracked their total lifetime value, regardless of attribution model.

For the outdoor gear client, this revealed something fascinating: weeks with higher Facebook ad spend correlated with increased new customer revenue, but with a 2-3 week delay. The platform attribution was claiming immediate credit, but the real impact was showing up weeks later through "organic" conversions.
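
If you'd rather build the cohorts in code than in a spreadsheet, here's a minimal pandas sketch of the same idea. The orders.csv file and its customer_id, order_date, and revenue columns are assumptions - adjust them to whatever your e-commerce platform exports.

```python
import pandas as pd

# Assumes an orders export with customer_id, order_date and revenue columns.
orders = pd.read_csv("orders.csv", parse_dates=["order_date"])

# A customer's acquisition week is the week of their first-ever order.
first_order = orders.groupby("customer_id")["order_date"].min()
orders["acquisition_week"] = (
    orders["customer_id"].map(first_order).dt.to_period("W").dt.start_time
)

# Total value of each weekly cohort, ignoring attribution entirely.
cohort_value = orders.groupby("acquisition_week").agg(
    new_customers=("customer_id", "nunique"),
    cohort_revenue=("revenue", "sum"),
)
print(cohort_value)
```

Lining cohort_revenue up against weekly ad spend is how the 2-3 week lag showed up for the outdoor gear client.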

Layer 3: Incrementality Testing

I implemented systematic on/off testing for ad campaigns. For two weeks, we'd run campaigns at normal budget. For the next two weeks, we'd pause them completely. This crude but effective method showed real incrementality versus correlation.

The results were eye-opening. Pausing Facebook ads led to a 30% decrease in total revenue, not the 60% decrease that attribution models predicted. The ads were working, but not as dramatically as platforms claimed.
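
The arithmetic behind that read-out is simple enough to sketch. These figures are illustrative, not the client's actuals - the point is that observed lift comes from comparing on-weeks to off-weeks, while claimed lift comes from the dashboard:

```python
# On/off incrementality read-out. Figures are illustrative only.
revenue_ads_on = 40_000        # avg weekly revenue during the two "on" weeks
revenue_ads_off = 28_000       # avg weekly revenue during the two "off" weeks
attributed_revenue = 24_000    # what the platform claimed per "on" week

incremental_revenue = revenue_ads_on - revenue_ads_off
observed_lift = incremental_revenue / revenue_ads_on   # 30% in this example
claimed_lift = attributed_revenue / revenue_ads_on     # 60% per the dashboard

print(f"Observed lift: {observed_lift:.0%}, platform-claimed lift: {claimed_lift:.0%}")
```

It's a crude test - seasonality and promotions can blur the comparison - which is why I repeat the cycle rather than trusting a single two-week window.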

Layer 4: Cross-Channel Impact Tracking

I started tracking how ad campaigns affected other channels. When we increased Facebook spend, what happened to organic search volume? Direct traffic? Email performance? This revealed the true ecosystem impact that traditional attribution completely misses.

For example, Facebook ads drove a 40% increase in branded search volume, which then converted through "organic" Google traffic. The Facebook campaign was incredibly effective, but you'd never know it from looking at platform metrics alone.
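
Here's a minimal sketch of how I look for that spillover, assuming you can export weekly ad spend and weekly branded search volume (from Search Console, for example) into one file. The file name and column names are placeholders:

```python
import pandas as pd

# Weekly ad spend alongside weekly branded search volume.
# File and column names are placeholders -- adapt to your own exports.
df = pd.read_csv("weekly_channels.csv", parse_dates=["week"]).sort_values("week")

# Correlate this week's spend with branded searches 0-3 weeks later
# to spot delayed spillover into "organic" demand.
for lag in range(4):
    corr = df["facebook_spend"].corr(df["branded_searches"].shift(-lag))
    print(f"lag {lag} weeks: correlation {corr:.2f}")
```

Correlation isn't causation, so I treat this layer as a signal worth investigating and let the incrementality tests confirm it.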

The Implementation Process:

Week 1-2: Set up revenue cohort tracking in Google Sheets connected to your e-commerce platform. Track new customer value by acquisition week, ignoring attribution models.

Week 3-4: Implement incrementality testing for your largest ad channel. Run normal campaigns for two weeks, then pause for two weeks while tracking total business impact.

Week 5-8: Add cross-channel impact monitoring. Track how ad spend changes affect organic search volume, direct traffic, and other "unattributed" channels.

Week 9+: Build decision-making frameworks based on business impact rather than platform metrics. A campaign that shows 2 ROAS on Facebook but drives 40% branded search increase might be your most valuable campaign.
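
One way to make that judgment explicit is a rough "business-impact ROAS" that adds an estimate of the spillover revenue to what the platform attributes before dividing by spend. This is a back-of-the-envelope sketch with invented numbers, not a formula from any platform:

```python
# Back-of-the-envelope "business-impact ROAS". All numbers are invented.
spend = 10_000
platform_attributed_revenue = 20_000    # the "2 ROAS" the dashboard shows

extra_branded_searches = 800            # lift observed during the campaign period
branded_conversion_rate = 0.05          # your organic conversion rate
avg_order_value = 120

spillover_revenue = extra_branded_searches * branded_conversion_rate * avg_order_value

platform_roas = platform_attributed_revenue / spend
business_impact_roas = (platform_attributed_revenue + spillover_revenue) / spend

print(f"Platform ROAS: {platform_roas:.1f}")                  # 2.0
print(f"Business-impact ROAS: {business_impact_roas:.1f}")    # ~2.5
```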

Revenue Cohorts

Track new customer value by acquisition date rather than campaign attribution to see real impact patterns

Incrementality Tests

Pause campaigns systematically to measure true business impact versus correlation with other channels

Cross-Channel Tracking

Monitor how ad spend affects organic search, direct traffic, and other channels platforms ignore

Platform Skepticism

Treat platform-reported ROAS as maximum possible impact and require validation from multiple sources

The revenue cohort analysis became the foundation for all budget decisions. Within two months of implementing this system, I could see clear patterns that platform analytics completely missed.

For the outdoor gear client:

  • Facebook ROAS dropped from 6.2 to 2.8 when measured through business impact

  • Google Ads ROAS increased from 3.1 to 4.2 when accounting for cross-channel effects

  • Total ad efficiency improved 40% by reallocating budget based on real impact

  • Branded search volume increased 85% during high Facebook spend periods

The incrementality tests revealed that display campaigns everyone wanted to kill were actually driving 25% of new customer acquisition through subsequent organic channels. Meanwhile, "high-performing" retargeting campaigns were mostly claiming credit for customers who would have bought anyway.

Most importantly, business revenue became predictable. Instead of chasing platform metrics that fluctuated randomly, we could see clear cause-and-effect relationships between ad spend and business growth.

Learnings

What I've learned and the mistakes I've made.

Sharing so you don't make them.

Building this attribution reality-check system taught me that most performance marketing is based on beautiful lies. Here are the key lessons from tracking actual business impact across multiple e-commerce accounts:

  1. Platform metrics optimize for platform revenue, not your business - every ad platform has financial incentives to inflate their apparent performance

  2. Customer journeys are invisible to attribution models - the most valuable touchpoints often happen offline or through unmeasurable channels

  3. Revenue cohorts reveal true patterns - tracking customer value by acquisition date shows real campaign impact over time

  4. Cross-channel effects are usually the biggest impact - ads drive organic search, word-of-mouth, and direct traffic that traditional attribution misses

  5. Incrementality testing beats sophisticated attribution - simple on/off tests show real business impact better than complex models

  6. The best campaigns often look bad in dashboards - brand awareness and top-funnel campaigns get penalized by last-click attribution

  7. Attribution theater destroys profitable campaigns - making decisions based on platform metrics kills channels that actually drive growth

The biggest mistake is trusting any single attribution model. The truth is always somewhere between the data points, and business intuition matters more than sophisticated tracking when the tracking is fundamentally broken.

How you can adapt this to your Business

My playbook, condensed for your use case.

For your SaaS / Startup

  • Track MRR by acquisition cohort rather than campaign attribution

  • Measure trial-to-paid conversion delays across different ad channels

  • Monitor organic search lift during paid campaign periods

  • Test campaign pauses systematically to prove incrementality

For your Ecommerce store

  • Analyze customer lifetime value by acquisition week regardless of attribution

  • Track branded search volume during high ad spend periods

  • Monitor direct traffic patterns when running display campaigns

  • Test geo-holdout groups to measure true incrementality (see the sketch below)
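
For the geo-holdout bullet above, here's a minimal sketch of the read-out, assuming you've split your regions into a test group that keeps ads running and a holdout group where they're paused, and can pull revenue per group before and during the test. The figures are placeholders:

```python
# Geo-holdout read-out: compare revenue growth where ads kept running
# against regions where they were paused. Placeholder figures only.
baseline = {"test": 80_000, "holdout": 78_000}   # revenue before the test
during   = {"test": 96_000, "holdout": 82_000}   # revenue during the test

test_growth = during["test"] / baseline["test"] - 1           # +20%
holdout_growth = during["holdout"] / baseline["holdout"] - 1  # about +5%

incremental_lift = test_growth - holdout_growth
print(f"Incremental lift attributable to ads: {incremental_lift:.0%}")
```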

Get more playbooks like this one in my weekly newsletter