Growth & Strategy
Personas: SaaS & Startup
Time to ROI: Medium-term (3-6 months)
When I started working with a B2B SaaS client as a freelance consultant, their acquisition strategy looked solid on paper. Multiple channels, decent traffic, trial signups coming in. But something was broken in their conversion funnel.
My first move? Diving deep into their analytics. What I found was a classic case of misleading data—tons of "direct" conversions with no clear attribution. Most companies would have started throwing money at paid ads or doubling down on SEO. Instead, I dug deeper.
Here's what I discovered: a significant portion of quality leads were actually coming from the founder's personal branding on LinkedIn. The direct conversions weren't really "direct"—they were people who had been following the founder's content, building trust over time, then typing the URL directly when they were ready to buy.
This case study walks you through how I used the Bullseye Framework to identify their hidden growth engine and systematically test new channels. You'll learn:
Why most companies fail at the Bullseye method (and spread themselves too thin)
How to identify your actual top-performing channel when attribution is broken
The exact testing framework I used to validate 3 new channels in 90 days
Why "Do Things That Don't Scale" became our competitive advantage
How we built a sustainable distribution system around personal branding
Industry Reality
What most SaaS founders think they know about the Bullseye method
The Bullseye Framework, popularized in "Traction" by Gabriel Weinberg and Justin Mares, is gospel in the startup world. Every growth consultant preaches the same three-step process: brainstorm all 19 traction channels, test the most promising ones, then focus on the winner.
Most SaaS founders I meet think they're following this framework correctly. They'll tell me about their "omnichannel approach"—running Google Ads, posting on LinkedIn, building an SEO blog, trying cold email, maybe even testing podcast sponsorships. They've got spreadsheets tracking metrics across multiple channels.
Here's what the industry typically recommends:
Test everything: Try 5-7 channels simultaneously to "diversify risk"
Follow the data: Let attribution tools tell you what's working
Scale the winners: Pour more budget into whatever shows the best ROI
Automate quickly: Build systems to manage multiple channels efficiently
Hire specialists: Get experts for each channel to maximize performance
This conventional wisdom exists because it feels smart and diversified. VCs love seeing multiple growth levers. It looks professional in board decks. And yes, some successful companies do run omnichannel strategies.
But here's where it falls short in practice: most early-stage SaaS companies don't have the resources to properly test multiple channels simultaneously. They end up doing everything mediocrely instead of one thing exceptionally well. Worse, broken attribution makes them double down on the wrong channels while missing their actual growth engine.
The real Bullseye Framework isn't about testing everything—it's about finding the one channel that can sustainably deliver the growth you need right now.
Consider me your business accomplice: 7 years of freelance experience working with SaaS and ecommerce brands.
The B2B SaaS client I mentioned was a perfect case study for this problem. They were a workforce management platform targeting small businesses, with a freemium model and around $15K MRR when I started working with them.
Their growth team was running the classic "spray and pray" approach:
Google Ads targeting competitor keywords ($2K/month spend)
Facebook ads to business owners ($1.5K/month)
SEO blog with 2-3 posts per week
Cold email campaigns to HR managers
The founder posting occasionally on LinkedIn
On paper, it looked diversified. In practice, they were burning $4K+ monthly on ads with mediocre results, while their content marketing wasn't generating meaningful traffic.
The real problem? Their attribution was completely broken. Google Analytics showed 60% "direct" traffic, which is startup-speak for "we have no idea where these people come from." The paid channels were getting credit for conversions, but the quality was terrible: plenty of trial signups, almost zero paid conversions.
When I dug into their customer interviews, a pattern emerged. Their best customers—the ones with highest LTV and lowest churn—kept mentioning the same thing: they'd been following the founder's content on LinkedIn for months before signing up.
But here's the kicker: these customers would see a LinkedIn post, visit the website directly (not clicking the link), explore the product, and convert. Attribution gave LinkedIn zero credit, while "direct" traffic got all the glory.
This is exactly the kind of situation where most growth teams make the wrong decision. They see direct traffic converting well and paid ads bringing in signups, so they pour more money into ads while treating LinkedIn as a nice-to-have side activity.
I knew we needed to test this hypothesis properly using the real Bullseye method—not the broken version most companies follow.
Here's my playbook
What I ended up doing and the results.
Instead of continuing the scatter-shot approach, I convinced them to run a proper Bullseye experiment. Here's exactly how we did it:
Step 1: The Attribution Detective Work
First, I built a simple system to track the customer journey more accurately. We added UTM parameters to all LinkedIn posts, created unique landing pages for different channels, and most importantly, added a "How did you hear about us?" field to the trial signup form with specific options.
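Tagging every LinkedIn link with UTM parameters is a one-liner once you have a small helper. Here's a minimal sketch; the URL and campaign values are illustrative, not the client's actual ones:

```python
# Sketch: append UTM parameters to a link so analytics can attribute the click.
from urllib.parse import urlencode, urlparse, parse_qsl, urlunparse

def add_utm(url: str, source: str, medium: str, campaign: str) -> str:
    """Return the URL with utm_source/utm_medium/utm_campaign appended,
    preserving any query parameters already present."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query.update({"utm_source": source, "utm_medium": medium,
                  "utm_campaign": campaign})
    return urlunparse(parts._replace(query=urlencode(query)))

print(add_utm("https://example.com/trial", "linkedin", "social", "founder-posts"))
# https://example.com/trial?utm_source=linkedin&utm_medium=social&utm_campaign=founder-posts
```

The same tagged links double as a sanity check against the "How did you hear about us?" answers: when the two disagree, the self-reported answer is usually closer to the truth.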
We also implemented a basic lead scoring system based on engagement behavior. Users who spent more than 5 minutes on the site, visited the pricing page, and engaged with the product demo got flagged as high-intent leads.
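The scoring rule itself was deliberately simple. A sketch of the flag, with hypothetical field names (session_minutes, visited_pricing, engaged_demo), not the client's actual schema:

```python
# Sketch of the basic lead-scoring flag. Thresholds mirror the rule in the
# text: >5 minutes on site, a pricing-page visit, and demo engagement.

def is_high_intent(session_minutes: float,
                   visited_pricing: bool,
                   engaged_demo: bool) -> bool:
    return session_minutes > 5 and visited_pricing and engaged_demo

# Illustrative trial signups:
leads = [
    {"email": "a@example.com", "session_minutes": 8.5,
     "visited_pricing": True, "engaged_demo": True},
    {"email": "b@example.com", "session_minutes": 2.0,
     "visited_pricing": False, "engaged_demo": False},
]
high_intent = [lead["email"] for lead in leads
               if is_high_intent(lead["session_minutes"],
                                 lead["visited_pricing"],
                                 lead["engaged_demo"])]
print(high_intent)  # ['a@example.com']
```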
Step 2: The Real Bullseye Test
Instead of testing 5-7 channels simultaneously, we focused on properly testing LinkedIn personal branding against our current "best performing" channel (Google Ads). For 30 days:
The founder committed to posting valuable content on LinkedIn daily
We paused Facebook ads entirely and reduced Google Ads spend by 50%
All other activities remained constant as a control
The content strategy was simple: share one insight from running a small business every day. No product pitches, no "follow me for more" nonsense. Just genuine, helpful content about workforce management challenges.
Step 3: The "Do Things That Don't Scale" Approach
Here's where most LinkedIn strategies fail—they try to automate too quickly. Instead, the founder spent 30 minutes daily:
Personally responding to every comment on his posts
Engaging meaningfully with posts from his target audience
Sending personal messages (not sales pitches) to people who engaged with his content
Joining relevant conversations in workforce management groups
This is exactly the kind of manual, unscalable work that most growth teams avoid. But it's also what creates genuine relationships that convert into high-value customers.
Step 4: The Measurement Framework
We tracked three key metrics:
Lead Quality Score: Based on trial behavior and engagement
Trial-to-Paid Conversion Rate: The ultimate measure of channel effectiveness
Customer Acquisition Cost: Time investment vs. paid channel costs
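Two of these metrics reduce to simple arithmetic. A sketch with made-up numbers (not the client's actual data), where founder time is converted to cost via an assumed hourly rate:

```python
# Sketch of the measurement arithmetic. All figures below are illustrative.

def trial_to_paid_rate(paid: int, trials: int) -> float:
    """Share of trials that converted to paid plans."""
    return paid / trials if trials else 0.0

def cac(total_cost: float, new_customers: int) -> float:
    """Customer Acquisition Cost: total channel cost per new customer.
    For founder time, cost = hours spent * an assumed hourly rate."""
    return total_cost / new_customers if new_customers else float("inf")

ads_cac = cac(2000.0, 10)           # $2K/month ad spend, 10 customers
linkedin_cac = cac(15 * 100.0, 12)  # 15 hrs at an assumed $100/hr, 12 customers
print(ads_cac, linkedin_cac)        # 200.0 125.0
print(trial_to_paid_rate(19, 100))  # 0.19
```

Comparing channels this way only works if the time investment is costed honestly; a founder's hour is rarely free.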
After 30 days, we had our answer. LinkedIn-sourced trials had a 40% higher conversion rate to paid plans compared to Google Ads, and the LTV of these customers was 60% higher. The founder's time investment (about 15 hours/month) was generating better ROI than our $2K/month ad spend.
That's when we made the decisive Bullseye move: we went all-in on LinkedIn personal branding while systematically testing one new channel at a time.
Testing Framework
We used a 30-day sprint approach to test each channel with clear success metrics and decision criteria
Quality Metrics
Lead quality from LinkedIn was 40% higher than paid ads—they actually used the product and converted to paid plans
Time Investment
The founder spent just 30 minutes daily on LinkedIn engagement versus managing multiple paid ad campaigns
Attribution Fix
We tracked the real customer journey with UTM codes and direct customer feedback instead of relying on broken analytics
The results were frankly better than I expected. Within 90 days of implementing the focused Bullseye approach:
MRR grew from $15K to $23K (53% increase)
Trial-to-paid conversion rate improved from 12% to 19%
Customer Acquisition Cost dropped by 60% when comparing time investment vs. ad spend
Average customer LTV increased by 40% due to better product-market fit
But the most surprising result was indirect: the improved attribution revealed that our SEO content was actually performing much better than we thought. People would find the founder on LinkedIn, read his content, then Google the company name and come through organic search. This "dark funnel" was invisible in our previous attribution model.
The LinkedIn content also created a compound effect. High-quality leads would often mention the founder's posts during sales calls, making the closing process much smoother. Our sales cycle shortened from an average of 45 days to 28 days.
Six months later, we tested our second channel using the same methodology: targeted email newsletters to HR professionals. This became our second-best performing channel, but only because we had the bandwidth to focus on it properly after LinkedIn was running smoothly.
What I've learned and the mistakes I've made.
Sharing so you don't make them.
Here are the key lessons I learned from this Bullseye implementation:
Attribution is broken for most B2B SaaS companies—invest in qualitative research and customer interviews before trusting your analytics
"Do Things That Don't Scale" isn't just for early-stage companies—manual, personal touches often outperform automated systems at any stage
Channel quality matters more than channel quantity—one channel delivering high-LTV customers beats five channels bringing in tire-kickers
Personal branding is a distribution channel, not just a vanity metric—treat it like any other growth lever
The real Bullseye method requires patience—most companies give up on channels before they have enough data to make informed decisions
Dark funnel attribution is real—customers often touch multiple channels before converting, but only the last one gets credit
Founder involvement accelerates channel performance—personal credibility can't be outsourced or automated
What I'd do differently: I would have implemented proper attribution tracking earlier in the process. We lost probably 2-3 months of valuable data because we were relying on Google Analytics' default attribution model.
This approach works best for B2B SaaS companies with complex sales cycles where trust and credibility matter. It's less effective for pure self-service products or highly transactional businesses.
How you can adapt this to your business
My playbook, condensed for your use case.
For your SaaS / Startup
Fix attribution tracking before testing new channels
Focus founder time on one channel that builds long-term trust
Measure lead quality, not just lead quantity
Test channels for minimum 30 days with consistent effort
For your Ecommerce store
Identify which founder content resonates with your ideal customers
Build email capture around high-value content rather than discounts
Track customer journey from first touch to purchase manually
Focus on channels that allow personal connection at scale