Growth & Strategy
Personas: SaaS & Startup
Time to ROI: Medium-term (3-6 months)
Last year, I was brought in by a B2B SaaS client who was drowning in data but starving for real insights. They had every tracking pixel imaginable, attribution models that would make a data scientist weep with joy, and dashboards that looked like mission control at NASA.
The problem? Despite all this sophisticated tracking, their growth had plateaued. They knew exactly which blog post led to which email signup, which email led to which demo booking, and which demo converted to paid. Yet somehow, this perfect attribution system was leading them to make terrible decisions.
This experience taught me something counterintuitive: the obsession with perfect attribution can actually blind you to what's really driving growth. Most businesses are optimizing for the wrong metrics, focusing on last-click attribution when the real magic happens in the messy middle of the customer journey.
Here's what you'll learn from my deep dive into cross-channel measurement:
Why your current attribution model is probably lying to you
The framework I developed to measure true channel effectiveness
How to identify which channels create compound growth effects
The metrics that actually predict sustainable growth
Why some of the "best performing" channels were killing long-term revenue
If you're tired of chasing vanity metrics and want to understand what's really moving the needle, this playbook will show you how to build a measurement system that actually drives better decisions. Let's dig into the real science of growth measurement.
Industry Reality
What every marketer thinks they know about attribution
Walk into any marketing team today and you'll hear the same gospel being preached: "We need better attribution." The industry has convinced us that the path to marketing enlightenment lies in perfectly tracking every touchpoint, from first click to final conversion.
The conventional wisdom goes like this:
Multi-touch attribution is king - Track every interaction across every channel to understand the full customer journey
Last-click is outdated - Give credit to all touchpoints, not just the final one before conversion
Data-driven decisions require perfect data - The more granular your tracking, the better your optimization
Channel performance should be measured in isolation - Calculate ROAS for each channel to determine budget allocation
Attribution models solve the dark funnel problem - Advanced modeling can illuminate the mysterious customer journey
This thinking exists because it feels logical and gives us a sense of control. Marketers love frameworks, and attribution modeling provides a seemingly scientific approach to understanding what's working. Plus, the martech industry has billions of reasons to keep selling us more sophisticated tracking solutions.
The problem? This entire approach is built on a fundamental misunderstanding of how modern customers actually behave. In our rush to track everything, we've lost sight of what really matters: sustainable, profitable growth.
Here's where the conventional wisdom falls apart: customers don't follow linear paths, they research across devices and platforms that don't talk to each other, and the most valuable touchpoints often happen in conversations and contexts that are impossible to track. Meanwhile, we're optimizing based on incomplete data and calling it "scientific."
Consider me your business accomplice.
Seven years of freelance experience working with SaaS and ecommerce brands.
My wake-up call came while working with a B2B SaaS client who had what looked like a perfect measurement setup. They were tracking everything: organic search, paid ads, content marketing, email campaigns, social media, webinars, partnership referrals - you name it.
Their attribution showed that paid search was their star performer. Clean attribution, clear ROAS, easy to optimize. Content marketing, on the other hand, looked terrible in their reports. Long attribution windows, messy customer journeys, hard to tie directly to revenue.
Based on this data, they'd been systematically shifting budget away from content and doubling down on paid search. The logic seemed sound: more money into what's "working," less into what's "not working."
But here's what was actually happening: their content was doing the heavy lifting of building trust and authority, warming up prospects over months of touchpoints. When these warmed prospects finally searched for a solution, they'd click on a paid ad - and paid search got all the credit.
The wake-up call came when I ran a simple experiment. We paused all content marketing for 30 days while keeping paid search running at the same level. Their paid search conversion rates dropped by 60%. The prospects clicking on ads were now cold traffic instead of pre-warmed leads, and they weren't converting.
This is when I realized that their attribution system was optimizing them into a corner. They were feeding the channel that looked good in reports while starving the channel that was actually creating the conditions for conversion.
The challenge became clear: how do you measure channels that work together rather than in isolation? How do you identify compound effects? How do you allocate budget when your attribution model is fundamentally flawed?
Traditional attribution was telling them a story, but it wasn't the right story. I needed to build a completely different framework for understanding what was really driving growth.
Here's my playbook
What I ended up doing and the results.
Instead of trying to perfect attribution, I developed what I call the "Ecosystem Impact Framework" - a way to measure how channels work together rather than compete for credit.
Phase 1: The Compound Effect Audit
First, I mapped every possible customer touchpoint, but instead of trying to track individual journeys, I looked for patterns. I analyzed what I called "channel velocity" - how the presence of one channel affected the performance of others.
The key insight: some channels are amplifiers, not converters. Content marketing, for example, wasn't just generating direct conversions. It was making every other channel perform better by pre-educating prospects.
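Here is a minimal sketch of how that "channel velocity" check can look in practice. It assumes a weekly export with hypothetical columns like content_posts and paid_search_cvr; the file name, columns, and four-week lag window are placeholders for illustration, not the client's actual setup.

import pandas as pd

# Hypothetical weekly export: one row per week with the amplifier's
# activity level (e.g. content pieces published) and a converter
# channel's performance (paid search conversion rate).
weekly = pd.read_csv("weekly_channel_metrics.csv", parse_dates=["week"])

# "Channel velocity": does amplifier activity in week t move converter
# performance in weeks t+1..t+4? Look at lagged correlations instead of
# same-week attribution.
for lag in range(1, 5):
    corr = weekly["content_posts"].corr(weekly["paid_search_cvr"].shift(-lag))
    print(f"content activity vs paid search CVR, {lag}-week lag: {corr:.2f}")

A positive lagged correlation is not proof of causation, but it tells you which channel pairs deserve the incrementality tests described in Phase 3.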
Phase 2: The Cohort Analysis Deep Dive
I started tracking cohorts based on their first touchpoint and their conversion path. This revealed something fascinating: customers who first discovered the company through content had 3x higher lifetime value than those who came through paid ads, even when paid ads got credit for the "conversion."
More importantly, content-first customers had much shorter sales cycles once they entered the funnel. They weren't just converting better - they were converting faster because they were already educated.
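If you want to run the same comparison, a simple first-touch cohort roll-up is enough to surface it. This sketch assumes a customer-level export with illustrative columns (first_touch_channel, revenue_to_date, lifecycle dates); swap in whatever your CRM or warehouse actually exposes.

import pandas as pd

# Hypothetical customer-level export: one row per customer with the channel
# of their first recorded touchpoint, revenue to date, and lifecycle dates.
# Column names are illustrative, not a specific CRM schema.
customers = pd.read_csv(
    "customers.csv", parse_dates=["first_touch_date", "converted_date"]
)

customers["sales_cycle_days"] = (
    customers["converted_date"] - customers["first_touch_date"]
).dt.days

cohorts = customers.groupby("first_touch_channel").agg(
    customers=("customer_id", "size"),
    avg_ltv=("revenue_to_date", "mean"),
    median_sales_cycle_days=("sales_cycle_days", "median"),
)
print(cohorts.sort_values("avg_ltv", ascending=False))

Grouping by first touchpoint instead of last click is what makes the lifetime-value and sales-cycle gaps visible at all.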
Phase 3: The Incrementality Test
This is where I got tactical. Instead of trying to measure everything at once, I ran systematic pause-and-resume tests on different channels (a minimal version of the before/during comparison is sketched after the list below). The results were eye-opening:
Pausing LinkedIn content for 2 weeks led to a 40% drop in "direct" traffic quality
Stopping the weekly newsletter caused email engagement across all campaigns to decline
Reducing webinar frequency made their sales team's job significantly harder
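The analysis behind a pause test can stay very simple. The sketch below assumes a daily metrics export; the dates, file name, and metric column names are placeholders.

import pandas as pd

# Hypothetical daily metrics export covering a baseline window and a pause
# window for one channel. Dates, file name, and metric columns are placeholders.
daily = pd.read_csv("daily_metrics.csv", parse_dates=["date"])

baseline = daily[(daily["date"] >= "2024-03-01") & (daily["date"] < "2024-03-15")]
pause = daily[(daily["date"] >= "2024-03-15") & (daily["date"] < "2024-03-29")]

def relative_change(metric: str) -> float:
    """How much a downstream metric moved during the pause vs the baseline."""
    return (pause[metric].mean() - baseline[metric].mean()) / baseline[metric].mean()

for metric in ["direct_traffic_conversions", "paid_search_cvr", "email_open_rate"]:
    print(f"{metric}: {relative_change(metric):+.0%} vs baseline")

In practice you would also control for seasonality, for example by comparing against the same weeks a year earlier or a matched holdout segment, but the core logic is just this before/during comparison.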
Phase 4: Building the New Metrics Framework
Based on these insights, I created three new measurement categories:
Converter Channels: Direct revenue attribution (paid search, retargeting, direct sales outreach)
Amplifier Channels: Channels that make other channels perform better (content, PR, thought leadership)
Sustainer Channels: Channels that maintain momentum and prevent decay (email, community, customer success content)
Instead of measuring each channel's individual ROAS, I started measuring the ecosystem ROAS - how much revenue the entire system generated when all channels worked together versus when key amplifiers were removed.
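To make the difference concrete, here is a worked sketch with illustrative numbers only. The "system without the amplifier" figure comes from a pause test like the one above, not from an attribution model; the channel names and amounts are hypothetical.

# Illustrative numbers only: spend and attributed revenue per channel, plus
# total revenue observed with and without the amplifier running (the
# "without" figure comes from a pause test, not from attribution).
spend = {"paid_search": 40_000, "content": 15_000, "email": 5_000}
attributed_revenue = {"paid_search": 180_000, "content": 20_000, "email": 30_000}

total_spend = sum(spend.values())
revenue_full_system = 230_000        # all channels running
revenue_without_amplifier = 150_000  # content paused (from the pause test)

channel_roas = {ch: attributed_revenue[ch] / spend[ch] for ch in spend}
ecosystem_roas = revenue_full_system / total_spend
amplifier_incremental_roas = (
    revenue_full_system - revenue_without_amplifier
) / spend["content"]

print(channel_roas)                          # content looks "bad" in isolation
print(round(ecosystem_roas, 2))              # the system-level number to optimize
print(round(amplifier_incremental_roas, 2))  # content's true incremental return

Judged on attributed ROAS alone, content looks like the weakest line item; judged on what disappears when you switch it off, it is one of the strongest.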
Channel Velocity
Measure how each channel affects the performance of others, not just its individual contribution
Ecosystem ROAS
Calculate return on investment for the entire channel ecosystem rather than individual channels
Incrementality Testing
Use systematic pause-and-resume experiments to identify true channel impact
Compound Metrics
Track metrics that reveal long-term compound effects rather than immediate conversions
The results were transformative. Within 90 days of implementing the new framework, we made several key discoveries that completely changed their channel strategy.
The Attribution Blindness Discovery: What looked like their "worst" performing channel (content marketing) was actually responsible for 70% of their highest-value customers. The content wasn't getting credit in attribution, but it was creating the trust necessary for premium plan conversions.
The Compound Effect Revelation: When we optimized for ecosystem ROAS instead of individual channel ROAS, overall revenue increased by 40% without increasing total marketing spend. We just redistributed budget based on compound effects rather than last-click attribution.
The Quality Multiplier Effect: Customers who had multiple touchpoints with educational content before converting had:
60% higher lifetime value
30% lower churn rates
50% higher expansion revenue
These weren't just better customers - they were customers who required less support, upgraded more frequently, and became advocates who brought in referrals.
The most surprising result? When we stopped optimizing individual channels and started optimizing the ecosystem, every single channel performed better. It turns out that when channels support each other instead of competing for attribution credit, the whole system becomes more efficient.
What I've learned and the mistakes I've made.
Sharing so you don't make them.
Looking back, this experience taught me seven crucial lessons about measurement that every growth team needs to understand:
Attribution is not measurement - Just because you can track something doesn't mean you understand its true impact. Attribution tells you what happened, not why it happened or how to optimize it.
Customer journeys are ecosystems, not funnels - Customers don't follow linear paths. They research across multiple touchpoints, devices, and timeframes. Your measurement needs to account for this complexity.
Some channels are invisible multipliers - The most valuable channels often don't get credit in traditional attribution. They make everything else work better rather than driving direct conversions.
Quality compounds over time - Focusing on conversion rate optimization can lead you to optimize for quantity over quality. Higher-quality leads take longer to convert but are worth exponentially more.
Channel cannibalization is a myth - Channels don't steal from each other when properly orchestrated. They amplify each other's effectiveness.
Incrementality beats attribution - Instead of trying to perfectly track customer journeys, focus on understanding what happens when you remove or add channels. This reveals true causal relationships.
Ecosystem thinking requires different metrics - Traditional marketing metrics weren't designed for complex, multi-touch customer journeys. You need new frameworks for the modern marketing reality.
The biggest lesson? Stop trying to solve the attribution problem and start building measurement systems that account for complexity rather than trying to simplify it away. The goal isn't perfect tracking - it's better decision-making.
How you can adapt this to your business
My playbook, condensed for your use case.
For your SaaS / Startup
For SaaS startups, focus on:
Measuring customer lifetime value by first touchpoint to identify your highest-quality acquisition channels
Tracking time-to-value metrics across different acquisition sources (see the sketch after this list)
Running incrementality tests on content channels that "don't convert" but may be driving all your best customers
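A time-to-value roll-up by acquisition source can be as simple as the sketch below. It assumes a signup export with hypothetical signup and activation timestamp columns; adapt the names to your own analytics warehouse.

import pandas as pd

# Hypothetical SaaS signup export: acquisition source plus signup and
# activation timestamps. Column names are placeholders.
signups = pd.read_csv("signups.csv", parse_dates=["signup_date", "activated_date"])

signups["days_to_value"] = (
    signups["activated_date"] - signups["signup_date"]
).dt.days

by_source = signups.groupby("acquisition_source").agg(
    signups=("user_id", "size"),
    median_days_to_value=("days_to_value", "median"),
    activation_rate=("activated_date", lambda s: s.notna().mean()),
)
print(by_source.sort_values("median_days_to_value"))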
For your Ecommerce store
For ecommerce stores, prioritize:
Analyzing repeat purchase rates by acquisition channel rather than just first-purchase ROAS (see the sketch after this list)
Testing how brand awareness channels impact your direct and paid search performance
Measuring average order value and customer lifetime value by acquisition source
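For the ecommerce version, an order-level export is enough to compare channels on repeat rate, average order value, and lifetime value rather than first-purchase ROAS. The columns below are illustrative, not a specific platform's schema.

import pandas as pd

# Hypothetical order-level export: one row per order with the customer's
# acquisition channel already joined on. Column names are illustrative.
orders = pd.read_csv("orders.csv", parse_dates=["order_date"])

per_customer = orders.groupby(["acquisition_channel", "customer_id"]).agg(
    orders=("order_id", "size"),
    revenue=("order_value", "sum"),
)

by_channel = per_customer.groupby("acquisition_channel").agg(
    customers=("orders", "size"),
    repeat_rate=("orders", lambda s: (s > 1).mean()),
    avg_ltv=("revenue", "mean"),
)
by_channel["aov"] = orders.groupby("acquisition_channel")["order_value"].mean()
print(by_channel.sort_values("avg_ltv", ascending=False))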