Growth & Strategy
Personas: SaaS & Startup
Time to ROI: Medium-term (3-6 months)
Last year, I watched a B2B startup burn through $50K in ad spend while their attribution platform proudly claimed they were "profitable." The reality? They couldn't track 70% of their actual customer journey, and their "winning" campaigns were just getting credit for organic conversions.
Here's the uncomfortable truth: traditional cross-channel tracking is broken. Not because the technology is bad, but because we're trying to solve a 2025 problem with 2019 thinking. The dark funnel—where customers research across multiple touchpoints without leaving trackable footprints—has become the norm, not the exception.
After working with dozens of SaaS and ecommerce clients, I've stopped chasing perfect attribution. Instead, I've developed what I call a "dark funnel strategy" that embraces the chaos of modern customer journeys and actually drives better results.
In this playbook, you'll discover:
Why cross-channel attribution tools often lie to you (and cost you money)
The product-channel fit framework that matters more than tracking
My 3-layer dark funnel strategy that increases conversion without perfect tracking
Real metrics from clients who stopped obsessing over attribution
When to trust your data (and when to trust your gut)
Reality Check
What tracking gurus won't tell you
If you've researched cross-channel tracking, you've probably heard the same advice everywhere:
"Implement multi-touch attribution to see your full customer journey." Set up UTM parameters, install Facebook Pixel, connect Google Analytics, integrate your CRM, and voilà—you'll finally understand which channels drive conversions.
The typical recommended setup includes (a quick UTM-tagging sketch follows this list):
First-touch attribution to see initial awareness drivers
Last-touch attribution for conversion credit
Multi-touch models that distribute credit across touchpoints
Cross-device tracking to follow users across platforms
Unified dashboards that aggregate all your data
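To make that concrete, here's what the UTM piece usually looks like in practice. This is a minimal sketch; the source, medium, and campaign values are hypothetical examples, not a naming convention I'm prescribing.

```python
# Minimal UTM-tagging sketch: the kind of setup the conventional advice asks for.
# The source/medium/campaign values below are hypothetical examples.
from urllib.parse import urlencode

def utm_url(base_url: str, source: str, medium: str, campaign: str) -> str:
    """Append standard UTM parameters to a landing-page URL."""
    params = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    return f"{base_url}?{params}"

print(utm_url("https://example.com/pricing", "facebook", "paid_social", "q3_awareness"))
# https://example.com/pricing?utm_source=facebook&utm_medium=paid_social&utm_campaign=q3_awareness
```

The tagging itself is trivial; the problem is everything the tags can't see.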
This advice exists because marketers desperately want control. We want to believe we can track every click, attribute every conversion, and optimize every dollar spent. It makes us feel professional, scientific, data-driven.
But here's what happens in practice: you spend months setting up tracking, only to discover that 60-80% of your conversions show up as "direct traffic" or get misattributed to the wrong channels. iOS privacy updates kill your Facebook tracking. Customers clear cookies, switch devices, and research for weeks before buying.
The conventional wisdom falls short because it's based on a linear customer journey that no longer exists. Modern buyers don't click an ad and immediately convert—they see your LinkedIn post, Google your company later, read reviews on a different device, discuss with colleagues, then maybe visit your website directly weeks later.
So what do you do when perfect tracking is impossible? You stop chasing ghosts and start building systems that work regardless of attribution accuracy.
Consider me your business accomplice.
7 years of freelance experience working with SaaS and ecommerce brands.
Two years ago, I was working with a B2B SaaS client who was convinced their Facebook ads weren't working. Their attribution platform showed a 2.5 ROAS, which seemed mediocre for their industry. They wanted to cut Facebook spend and double down on Google Ads, which showed much better attribution numbers.
Something felt wrong. Their overall revenue was growing, but none of their individual channels looked like clear winners. I suggested we dig deeper before making any major budget shifts.
The client had a complex B2B sales cycle—prospects typically researched for 2-3 months, involved multiple decision-makers, and switched between devices constantly. They sold project management software to teams, so the buying process involved demos, trials, stakeholder discussions, and budget approvals.
Here's what their attribution dashboard showed:
Facebook Ads: 2.5 ROAS
Google Ads: 4.2 ROAS
Organic Search: 8.1 ROAS
Direct Traffic: 12.3 ROAS
The obvious conclusion? Facebook was the worst performer, and direct traffic was their golden goose. But I suspected Facebook was actually feeding the other channels—creating awareness that led to branded searches and direct visits later in the journey.
I convinced them to run a test: we paused Facebook ads for 30 days to see what happened to their "high-performing" channels. The client was nervous about losing their "mediocre" traffic source, but agreed to the experiment.
Within two weeks, we saw a 40% drop in branded Google searches. By week three, direct traffic had decreased by 35%. The organic search and direct conversions that attribution tools credited to those channels were actually being driven by Facebook awareness campaigns.
When we turned Facebook back on, branded searches and direct traffic gradually recovered. The attribution platform still showed the same misleading numbers, but we now understood the real relationship between channels.
This experience taught me that attribution tools often lie—not intentionally, but because they can't capture the full complexity of modern customer journeys. The solution isn't better tracking; it's building distribution systems that work regardless of perfect attribution.
Here's my playbook
What I ended up doing and the results.
After that Facebook holdout experiment, I stopped trying to solve attribution and started building what I call "dark funnel strategies." Instead of chasing perfect tracking, I focus on creating multiple touchpoints that work together, even when I can't measure exactly how.
Layer 1: Coverage Strategy
Instead of picking "winning" channels based on attribution, I help clients achieve broad coverage across their target audience's research habits. For B2B SaaS, this typically means:
LinkedIn content for professional visibility
SEO content for problem-aware searches
Paid ads for retargeting and awareness
Email nurturing for relationship building
The goal isn't to track which touchpoint "wins"—it's to be present wherever prospects might encounter you during their lengthy research process.
Layer 2: Signal Amplification
Rather than trusting attribution platforms, I track leading indicators that reveal channel health:
Branded search volume (a sign awareness campaigns are working)
Direct traffic quality (a proxy for repeat engagement)
Email list growth from different sources
Sales conversation quality and source variety
For my SaaS client, I started tracking these signals instead of relying solely on last-click attribution. We noticed that when LinkedIn content performed well, trial-to-paid conversion rates improved across all channels—even "direct" signups.
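You don't need a fancy dashboard for this. A rough sketch, assuming you can export weekly values for each signal (the numbers below are made up), is enough to flag when something needs a closer look:

```python
# Rough sketch: watch leading indicators week over week instead of trusting
# attribution numbers. Weekly values below are hypothetical placeholders.

signals = {
    "branded_search_volume": [1150, 1210, 1290, 1180],
    "email_list_growth":     [240, 255, 230, 265],
    "direct_visits":         [3300, 3420, 3510, 3475],
}

ALERT_DROP = -0.15  # flag any week-over-week drop sharper than 15%

for name, weekly in signals.items():
    prev, latest = weekly[-2], weekly[-1]
    change = (latest - prev) / prev
    status = "investigate" if change <= ALERT_DROP else "ok"
    print(f"{name}: {change:+.1%} week over week [{status}]")
```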
Layer 3: Holdout Testing
This is the most powerful technique I use: systematic channel pausing to understand true impact. Here's my framework:
Week 1-2: Establish baseline metrics across all channels
Week 3-4: Pause one channel completely
Week 5-6: Monitor secondary effects on other channels
Week 7-8: Restart the channel and measure recovery
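Quantifying the holdout is just arithmetic: compare the paused weeks against the baseline weeks for the downstream metrics you care about. Here's a minimal sketch, assuming weekly exports of branded search and direct traffic; the figures are illustrative, not client data.

```python
# Minimal holdout-effect sketch: compare baseline weeks against pause weeks
# for downstream metrics. All numbers are illustrative placeholders.

def pct_change(baseline: list[float], test: list[float]) -> float:
    """Percent change between the averages of two periods."""
    base_avg = sum(baseline) / len(baseline)
    test_avg = sum(test) / len(test)
    return (test_avg - base_avg) / base_avg * 100

# Weekly values: weeks 1-2 = baseline, weeks 3-4 = channel paused
branded_search = {"baseline": [1200, 1180], "paused": [890, 760]}
direct_traffic = {"baseline": [3400, 3550], "paused": [2600, 2450]}

for name, series in [("Branded search", branded_search),
                     ("Direct traffic", direct_traffic)]:
    delta = pct_change(series["baseline"], series["paused"])
    print(f"{name}: {delta:+.1f}% vs. baseline while the channel was off")
```

If those downstream numbers barely move while a channel is off, that channel probably wasn't doing the invisible work you hoped it was.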
I've used this approach with six different clients now. Every time, we discover that high-attribution channels are stealing credit from awareness-driving activities. Facebook ads rarely get credit for the branded Google searches they generate. LinkedIn content doesn't track to the direct traffic it creates.
The most successful clients stopped optimizing individual channels and started optimizing their entire distribution ecosystem. We measure success by the following (a blended-CAC sketch follows this list):
Total qualified leads regardless of attribution
Customer acquisition cost across the entire funnel
Revenue growth period over period
Pipeline velocity and close rates
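The blended math is deliberately simple. A minimal sketch with hypothetical figures:

```python
# Ecosystem-level metrics instead of per-channel attribution.
# All spend and revenue figures are hypothetical placeholders.

total_marketing_spend = 48_000       # every channel, one quarter
new_customers = 60                   # closed-won, any source
revenue_this_period = 310_000
revenue_last_period = 262_000

blended_cac = total_marketing_spend / new_customers
revenue_growth = (revenue_this_period - revenue_last_period) / revenue_last_period * 100

print(f"Blended CAC: ${blended_cac:,.0f} per customer")
print(f"Revenue growth, period over period: {revenue_growth:.1f}%")
```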
This approach requires patience—you're building a system, not optimizing individual campaigns. But it creates more sustainable growth because you're not constantly shifting budgets based on misleading attribution data.
Distribution Coverage
Focus on ecosystem presence rather than individual channel optimization
Holdout Testing
Systematically pause channels to understand true cross-channel impact
Signal Tracking
Monitor leading indicators instead of relying on attribution platforms
Systems Thinking
Optimize for total growth rather than individual channel performance
The results from this dark funnel approach have been consistently better than traditional attribution-based optimization:
For the original SaaS client, after implementing the full 3-layer strategy:
Overall CAC decreased by 28% when we stopped budget shuffling based on attribution
Pipeline quality improved significantly—fewer unqualified leads, higher close rates
Branded search volume increased 3x over six months
Sales team reported better lead quality across all sources
With an ecommerce client, the impact was even clearer. After stopping their constant Facebook ad optimization based on attribution:
Total revenue grew 45% while maintaining similar ad spend
Customer lifetime value increased as we attracted more intentional buyers
Return customer rate improved when we focused on brand building vs. conversion campaigns
The most interesting outcome? Their attribution platform still showed the same "winning" and "losing" channels, but overall business performance was dramatically better. This confirmed my hypothesis that attribution tools measure correlation, not causation.
What surprised me most was how much mental energy this approach freed up. Instead of obsessing over tracking setup and data discrepancies, we could focus on creating better content, improving customer experience, and building genuine relationships with prospects.
What I've learned and the mistakes I've made.
Sharing so you don't make them.
After implementing this dark funnel strategy with multiple clients, here are the most important lessons:
Attribution accuracy decreases as buyer sophistication increases. B2B customers with long research cycles are nearly impossible to track accurately. Accept this reality instead of fighting it.
"Direct" traffic is rarely actually direct. It's usually the result of brand awareness campaigns you can't track. Treat high direct traffic as a positive signal, not a mystery to solve.
Holdout testing beats attribution modeling. Temporarily pausing channels reveals true impact better than any tracking platform. Build this into your quarterly planning.
Leading indicators are more reliable than lagging ones. Brand search volume, email list growth, and content engagement predict future sales better than last-click attribution.
Cross-channel effects are stronger than single-channel optimization. Facebook ads improve Google Ads performance. SEO content increases email conversion rates. Optimize for synergy, not isolation.
Customer research behaviors trump tracking capabilities. Build your strategy around how customers actually discover and evaluate solutions, not around what you can measure.
Patience outperforms optimization. Consistent presence across multiple channels for 6+ months works better than constantly shifting budgets based on weekly attribution reports.
The biggest mindset shift? Stop treating marketing like a science experiment and start treating it like ecosystem building. Your goal isn't to find the perfect channel—it's to create multiple pathways for customers to discover and trust you over time.
This approach works best when you have a complex sales cycle, sophisticated buyers, or products that require consideration. It's less effective for impulse purchases or simple B2C transactions where attribution might actually be accurate.
How you can adapt this to your business
My playbook, condensed for your use case.
For your SaaS / Startup
For SaaS startups implementing dark funnel strategy:
Focus on organic acquisition channels that build long-term brand equity
Track trial-to-paid conversion rates by true source, not just attribution source (see the sketch after this list)
Build content that addresses every stage of your lengthy B2B sales cycle
Use customer interviews to understand actual discovery and evaluation processes
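One way to approximate "true source" is a required "How did you hear about us?" field at signup. The sketch below assumes that kind of self-reported data; the field names and sample records are hypothetical.

```python
# Sketch: trial-to-paid conversion by self-reported source rather than by
# whatever the attribution platform claims. Field names and records are hypothetical.
from collections import defaultdict

trials = [
    {"self_reported_source": "linkedin", "converted_to_paid": True},
    {"self_reported_source": "linkedin", "converted_to_paid": False},
    {"self_reported_source": "podcast",  "converted_to_paid": True},
    {"self_reported_source": "google",   "converted_to_paid": False},
]

counts = defaultdict(lambda: {"trials": 0, "paid": 0})
for t in trials:
    bucket = counts[t["self_reported_source"]]
    bucket["trials"] += 1
    bucket["paid"] += int(t["converted_to_paid"])

for source, c in counts.items():
    rate = c["paid"] / c["trials"] * 100
    print(f"{source}: {c['paid']}/{c['trials']} trials converted ({rate:.0f}%)")
```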
For your Ecommerce store
For ecommerce stores building cross-channel presence:
Monitor branded search trends as a leading indicator of awareness campaign success
Focus on customer lifetime value rather than first-purchase attribution
Build retargeting audiences across multiple platforms for maximum coverage
Test channel pause experiments during low-impact periods to understand true relationships