Three months into working with an e-commerce client, I was celebrating what looked like massive success. Facebook's dashboard showed a beautiful 8-9 ROAS. The client was thrilled. I felt like a genius.
Then I dug deeper into the actual data and realized everything was a lie.
The reality? SEO was driving most of the conversions, but Facebook's attribution model was claiming credit for organic wins. We were optimizing for metrics that had nothing to do with our actual distribution success. It was a wake-up call that changed how I measure everything.
Most businesses are drowning in vanity metrics while missing the signals that actually matter for distribution success. They're tracking clicks, impressions, and surface-level engagement while their real growth engine remains invisible.
Here's what you'll learn from my expensive lesson:
Why traditional attribution is broken (and what to track instead)
The 3 distribution metrics that actually predict growth
How to measure the "dark funnel" that most tools miss
A framework for tracking distribution success across channels
The one metric that saved my client $50K in wasted ad spend
Let's dive into what happens when you stop believing the pretty dashboards and start measuring what actually drives sustainable growth. Check out our growth playbooks for more distribution strategies.
Industry Reality
The vanity metrics trap that catches everyone
Walk into any marketing meeting and you'll hear the same metrics repeated like gospel: click-through rates, cost-per-click, impressions, reach, and ROAS. Most distribution strategies are built around optimizing these numbers because they're easy to track and make great PowerPoint slides.
The conventional wisdom goes like this:
Higher CTR means better targeting - If more people click, your message must be resonating
Lower CPC means efficient spending - Cheaper clicks equal better ROI
High ROAS means profitable campaigns - If the platform says 4:1 return, you're winning
Impression volume drives awareness - More eyeballs equal more opportunity
Attribution models tell the truth - Last-click or first-touch data shows what's working
This approach exists because it's measurable, reportable, and feels scientific. Marketing teams can show clear week-over-week improvements. Agencies can demonstrate value with charts that go up and to the right. Everyone feels productive.
But here's where it falls apart: these metrics measure activity, not distribution success. They tell you what happened inside each platform's walled garden, not whether your actual distribution strategy is working. You end up optimizing for the metrics instead of optimizing for growth.
The real problem? Most businesses are playing a game where the score doesn't match the actual outcome. You can have amazing CTRs while your distribution strategy is fundamentally broken.
Consider me your business accomplice.
Seven years of freelance experience working with SaaS and e-commerce brands.
I learned this lesson the hard way with a Shopify e-commerce client who had built their entire growth strategy around Facebook Ads. When I started working with them, they had what looked like solid performance - a 2.5 ROAS that most marketers would call acceptable.
But something felt off. The business was growing slower than those numbers suggested it should. The margins were tight, and the dependency on paid ads felt risky. That's when I decided to dig deeper than the surface metrics.
My first move was implementing a complete SEO overhaul alongside the existing ad campaigns. I wanted to diversify their distribution channels and reduce their Facebook dependency. Within a month of launching the SEO strategy, something strange happened.
Facebook's dashboard started showing a dramatic improvement. The ROAS jumped from 2.5 to 8-9 almost overnight. My client was ecstatic. The agency managing their ads took credit for "optimization improvements." Everyone was celebrating.
But I knew better. I hadn't changed anything about the Facebook campaigns. The only new variable was the SEO traffic starting to flow. That's when it clicked - Facebook's attribution model was claiming credit for conversions that were actually coming from organic search.
Here's what was really happening: People would Google the brand name after seeing a Facebook ad days or weeks earlier. They'd land on the site organically, browse around, maybe even bookmark it. Then they'd return directly or through another Google search to make a purchase. Facebook's tracking pixel would fire and claim credit for the sale.
This revelation was both exciting and terrifying. Exciting because it meant our SEO strategy was working better than expected. Terrifying because it meant we'd been making distribution decisions based on completely false data.
Here's my playbook
What I ended up doing and the results.
Once I realized how broken traditional attribution was, I developed a framework for measuring what actually matters in distribution success. Instead of relying on platform-reported metrics, I started tracking what I call "Distribution Health Indicators" - metrics that actually predict sustainable growth.
The Framework: DHI (Distribution Health Indicators)
I built this around three core measurements that can't be gamed by attribution models:
1. Channel Independence Score
This measures how dependent you are on any single distribution channel. I calculate it as: 100 - (largest channel percentage of total traffic). A score of 70+ is healthy, below 50 is dangerous. My e-commerce client started at 25 (75% Facebook dependent) and we got them to 65 within six months.
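The score above is simple enough to compute from any analytics export. Here's a minimal sketch of that calculation; the function name and the traffic numbers are hypothetical, chosen to reproduce the client's starting score of 25:

```python
def channel_independence_score(traffic_by_channel):
    """Channel Independence Score: 100 minus the percentage share of the
    single largest channel. 70+ is healthy; below 50 is dangerous."""
    total = sum(traffic_by_channel.values())
    if total == 0:
        return 0.0
    largest_share = max(traffic_by_channel.values()) / total
    return round(100 - largest_share * 100, 1)

# Hypothetical starting mix: 75% of traffic from Facebook
start = {"facebook": 7500, "organic": 1500, "direct": 700, "email": 300}
print(channel_independence_score(start))  # 25.0
```

Run it monthly against your sessions-by-channel report, and the trend matters more than any single reading.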
2. Direct Traffic Velocity
This tracks branded search volume and direct traffic growth over time. It's the best indicator of actual brand awareness and distribution effectiveness. While Facebook claimed credit for conversions, I was watching direct traffic patterns to see which channels were actually building awareness.
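To make "velocity" concrete, a sketch like the following turns monthly direct-traffic (or branded-search) counts into month-over-month growth rates; the function name and session counts are illustrative, not from the client's data:

```python
def traffic_velocity(monthly_sessions):
    """Month-over-month growth rates (%) for direct or branded-search
    sessions. A sustained positive trend suggests real awareness building."""
    return [
        round((curr - prev) / prev * 100, 1)
        for prev, curr in zip(monthly_sessions, monthly_sessions[1:])
    ]

# Hypothetical direct-traffic counts across four months
print(traffic_velocity([1000, 1100, 1320, 1650]))  # [10.0, 20.0, 25.0]
```

An accelerating series like this one is the signal: people are searching for you by name, regardless of what the ad platforms claim.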
3. Cross-Channel Conversion Paths
Instead of trusting last-click attribution, I implemented a system to track the real customer journey. Using UTM parameters, Google Analytics goals, and customer surveys, I mapped how people actually discovered and converted. The data was messy but truthful.
The Implementation Process
I set up what I call "Truth Tracking" - a parallel measurement system that doesn't rely on platform attribution:
Week 1-2: Data Audit
I exported data from all channels and looked for inconsistencies. The Facebook ROAS spike coinciding with SEO launch was the smoking gun that led me to question everything.
Week 3-4: Survey Implementation
Added a simple post-purchase survey asking "How did you first hear about us?" The responses were eye-opening - 60% mentioned Google searches, only 20% mentioned Facebook ads.
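Tallying those free-text answers doesn't need a BI tool. A sketch like this, with hypothetical responses shaped to match the 60/20 split above, is enough:

```python
from collections import Counter

# Hypothetical answers to "How did you first hear about us?"
responses = (
    ["google search"] * 60 + ["facebook ad"] * 20
    + ["friend referral"] * 12 + ["other"] * 8
)

def discovery_breakdown(responses):
    """Percentage breakdown of self-reported discovery channels."""
    counts = Counter(responses)
    total = len(responses)
    return {channel: round(n / total * 100, 1) for channel, n in counts.most_common()}

print(discovery_breakdown(responses))
# {'google search': 60.0, 'facebook ad': 20.0, 'friend referral': 12.0, 'other': 8.0}
```

In practice you'd normalize the raw answers first ("Googled you", "found on Google" → "google search"), but even a rough bucketing beats a tracking pixel's version of events.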
Week 5-8: UTM Standardization
Implemented consistent UTM tracking across all campaigns with strict naming conventions. This created a clean data trail that couldn't be polluted by cross-platform attribution fighting.
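The key to a clean UTM trail is making the naming convention impossible to violate. One way to enforce it is to generate every campaign URL through a single validated helper; the allowed values and function name here are placeholders for whatever convention you adopt:

```python
from urllib.parse import urlencode

# Example whitelists - replace with your own naming convention
ALLOWED_SOURCES = {"facebook", "google", "newsletter"}
ALLOWED_MEDIUMS = {"paid-social", "cpc", "email"}

def tagged_url(base_url, source, medium, campaign):
    """Build a campaign URL with strictly validated, lowercase UTM values
    so every channel reports under one consistent naming scheme."""
    source, medium, campaign = source.lower(), medium.lower(), campaign.lower()
    if source not in ALLOWED_SOURCES:
        raise ValueError(f"unknown utm_source: {source}")
    if medium not in ALLOWED_MEDIUMS:
        raise ValueError(f"unknown utm_medium: {medium}")
    params = {"utm_source": source, "utm_medium": medium, "utm_campaign": campaign}
    return f"{base_url}?{urlencode(params)}"

print(tagged_url("https://example.com/shop", "Facebook", "paid-social", "Spring-Sale"))
# https://example.com/shop?utm_source=facebook&utm_medium=paid-social&utm_campaign=spring-sale
```

Because "Facebook", "FB", and "facebook" would otherwise land in analytics as three different sources, forcing lowercase and a whitelist at URL-creation time is what keeps the data trail clean.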
Week 9-12: Cohort Analysis
Started tracking customer cohorts based on true acquisition channel. This revealed that SEO-acquired customers had 40% higher lifetime value than paid social customers.
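The cohort comparison itself is just a grouped average. A minimal sketch, with made-up per-customer revenue figures chosen to illustrate a 40% LTV gap like the one described above:

```python
from statistics import mean

# Hypothetical 12-month revenue per customer, grouped by TRUE acquisition channel
cohorts = {
    "seo":         [180, 220, 210, 150, 220],
    "paid_social": [120, 160, 90, 200, 130],
}

def ltv_by_channel(cohorts):
    """Average customer lifetime value per true acquisition channel."""
    return {channel: round(mean(values), 2) for channel, values in cohorts.items()}

ltv = ltv_by_channel(cohorts)
uplift = (ltv["seo"] / ltv["paid_social"] - 1) * 100
print(ltv)
print(f"SEO cohort LTV is {uplift:.0f}% higher than paid social")
```

The hard part isn't the math - it's making sure the channel label comes from your truth-tracking system (surveys plus clean UTMs), not from the platform's attribution.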
The breakthrough came when I stopped looking at individual campaign performance and started measuring distribution portfolio health. Just like financial investments, diversification reduced risk and improved overall returns.
Channel Health
Track distribution diversification to avoid dangerous dependencies on single channels
Attribution Reality
Implement truth tracking systems that reveal actual customer journey paths
Survey Insights
Use post-purchase surveys to understand real discovery patterns beyond tracking pixels
Portfolio Approach
Measure distribution like investments - diversified channels reduce risk and improve returns
Within three months of implementing this new measurement framework, the results were transformative. The client reduced their Facebook ad spend by 40% while maintaining the same revenue levels. More importantly, they discovered that their SEO strategy was generating 3x more value than Facebook's attribution suggested.
The real breakthrough was understanding their true distribution mix. What Facebook reported as 80% of conversions was actually closer to 30%. SEO was driving 45% of revenue, direct traffic was 15%, and email marketing was contributing 10%. This revelation changed everything about budget allocation.
Their Channel Independence Score improved from 25 to 68, making the business far more resilient. When iOS 14 hit and Facebook tracking became even less reliable, they were prepared. While competitors saw 50%+ drops in reported performance, this client barely noticed because they weren't dependent on Facebook's attribution lies.
The most unexpected outcome? Customer quality improved dramatically. When we stopped optimizing for Facebook's metrics and started optimizing for real distribution health, we attracted customers with higher lifetime value and lower churn rates.
What I've learned and the mistakes I've made.
Sharing so you don't make them.
This experience taught me that attribution is fundamentally broken in the age of privacy changes and multi-device journeys. The biggest lesson: don't trust any single platform's version of the truth. Build your own measurement system.
Here are the key insights that changed my approach:
Platform metrics are marketing, not measurement - Each platform wants to prove its value, so they'll claim credit for anything they can
The dark funnel is bigger than the light funnel - Most customer journeys happen outside trackable touchpoints
Distribution health beats campaign performance - A diversified 3:1 ROAS is better than a concentrated 8:1 ROAS
Surveys beat pixels - Asking customers directly gives more accurate attribution than any tracking technology
Channel independence is a competitive advantage - When algorithm changes hit, diversified businesses survive
Customer quality varies by channel - Some channels bring higher LTV customers than others
True measurement takes patience - Real distribution success shows up in quarterly trends, not daily dashboards
If I had to start over, I'd implement truth tracking from day one instead of relying on platform attribution. The three months spent building our own measurement system paid for itself many times over in better decision-making.
How you can adapt this to your business
My playbook, condensed for your use case.
For your SaaS / Startup
For SaaS companies implementing this framework:
Track trial-to-paid conversion by true acquisition channel - Some channels bring trial users who never convert
Measure Channel Independence Score monthly - Aim for 70+ to reduce platform risk
Survey new customers about discovery path - Build this into your onboarding flow
Track direct traffic growth as brand health indicator - This shows true distribution success
For your e-commerce store
For e-commerce stores implementing this approach:
Add post-purchase survey asking about discovery - This reveals real attribution beyond tracking pixels
Monitor customer lifetime value by true acquisition channel - Some channels bring more valuable customers
Track branded search volume growth - This indicates actual brand awareness building
Diversify beyond paid social - Aim for 60%+ non-paid distribution