Sales & Conversion

How I Stopped Wasting Ad Spend by Testing Creatives Instead of Audiences

Personas: Ecommerce
Time to ROI: Short-term (< 3 months)

OK, so here's the thing about Facebook ads that nobody wants to admit: we're all still stuck in 2018 strategies while the platform completely changed around us. You know what I'm talking about - spending hours crafting the "perfect" audience, testing different demographics, interests, and behaviors, then wondering why your CPM keeps climbing and conversions keep dropping.

I learned this the hard way when I was managing Facebook ads for a B2C Shopify store. We had decent traffic, the ads were getting clicks, but something was fundamentally broken. The traditional approach of testing audiences wasn't moving the needle. That's when I discovered something that changed everything: creatives are the new targeting.

In this playbook, you'll discover:

  • Why audience targeting is mostly dead (and what Facebook's algorithm actually optimizes for)

  • My framework for testing 3 new creatives every single week

  • How to structure campaigns that let the algorithm do the heavy lifting

  • The simple testing rhythm that prevents creative fatigue

  • Real examples of creative variations that actually convert

This isn't another generic "Facebook ads guide" - it's what actually works when ecommerce stores need to scale without burning cash on outdated strategies.

Strategy Shift

Why everyone's still stuck in the past

Walk into any marketing agency or scroll through Facebook ads courses, and you'll hear the same tired advice: "It's all about audience research. Find your perfect customer avatar. Test lookalike audiences. Narrow down interests." The gurus are still teaching strategies from 2018 like iOS 14 never happened.

Here's what the industry typically recommends for Facebook ad testing:

  1. Detailed audience research - Spend weeks mapping out demographics, interests, and behaviors

  2. Complex audience testing - Create multiple ad sets with different targeting parameters

  3. Lookalike audience optimization - Test 1%, 3%, 5% lookalikes from different source audiences

  4. Interest stacking - Layer multiple interests to "find the perfect audience"

  5. Exclusion tactics - Exclude audiences to avoid overlap and "improve targeting"

This advice exists because it used to work. Pre-iOS 14, Facebook had detailed tracking data and could effectively target based on interests and behaviors. Agencies built their entire methodology around audience sophistication because the algorithm needed that level of direction.

But here's the uncomfortable truth: privacy changes killed detailed targeting. Apple's iOS 14 tracking restrictions, GDPR, and similar shifts stripped away the data that made interest- and behavior-based audiences effective. Yet most marketers are still optimizing for a world that doesn't exist anymore.

The result? You're fighting the algorithm instead of working with it, burning budget on audience experiments while your competitors focus on what actually moves the needle: creative content that the algorithm can optimize around.

Who am I

Consider me your business accomplice.

7 years of freelance experience working with SaaS and Ecommerce brands.

When I started managing Facebook ads for this B2C Shopify store, I fell into the same trap everyone does. The client had over 1,000 products in their catalog, decent website traffic, but Facebook ads weren't delivering the ROAS they needed. Classic situation, right?

My first instinct was to do what every "expert" recommends: dive deep into audience research. I spent weeks analyzing their customer data, creating detailed buyer personas, mapping out interest categories. We're talking spreadsheets of demographic breakdowns, behavioral patterns, the whole nine yards.

Then I launched the traditional testing approach: multiple ad sets with different audience segments. Lookalike audiences based on purchasers, website visitors, email subscribers. Interest-based audiences around their product categories. Broad audiences with age and gender restrictions. I was convinced this methodical approach would crack the code.

The results? Mediocre at best. We were getting a 2.5 ROAS, which looked decent on paper, but with their margins, it wasn't sustainable. More importantly, I noticed something strange in the data - the "winning" audiences kept changing week to week. An audience that performed great one week would tank the next.

That's when I realized we were fighting the wrong battle. Facebook's algorithm had evolved way past our targeting sophistication. While I was spending hours crafting audience parameters, the platform was getting fed bland, generic ad creatives that didn't give the algorithm any real signal to work with.

The breakthrough came when I started looking at this backwards. Instead of asking "who should see this ad?" I started asking "what content makes people stop scrolling?" That shift in thinking changed everything about how I approached Facebook advertising.

My experiments

Here's my playbook

What I ended up doing and the results.

Here's the framework I developed after realizing that creative testing beats audience testing every single time. I call it the "Creative-First Campaign Structure" because it flips the traditional approach on its head.

Step 1: Simplify Your Campaign Structure

Instead of multiple campaigns with complex audience targeting, I restructured everything into one primary campaign with a single broad audience. No detailed interests, no complex lookalikes, no demographic restrictions beyond basic age ranges. Just let Facebook's algorithm find the right people.

The magic happens in the ad set structure. Instead of testing audiences, each ad set holds a different creative angle - there's a quick config sketch right after this list. So you might have:

  • Ad Set 1: Problem-focused creatives

  • Ad Set 2: Solution-focused creatives

  • Ad Set 3: Social proof creatives

  • Ad Set 4: Lifestyle-focused creatives
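If it helps to see it laid out, here's a rough sketch of that structure as a plain config. This is just an illustration - the names and values are placeholders, not actual Meta Ads Manager settings or API calls:

```python
# Illustrative sketch of the creative-first structure as a plain Python config.
# Names and values are placeholders, not Meta Marketing API objects.
campaign = {
    "name": "Prospecting | Creative-First",
    "objective": "conversions",
    "audience": {
        "geo": ["US"],       # keep it broad
        "age_min": 18,
        "age_max": 65,
        # no interests, no lookalikes, no gender splits
    },
    "ad_sets": [
        {"angle": "problem",      "creatives": []},  # pain-point hooks
        {"angle": "solution",     "creatives": []},  # product-as-fix hooks
        {"angle": "social_proof", "creatives": []},  # reviews, UGC, testimonials
        {"angle": "lifestyle",    "creatives": []},  # aspirational context
    ],
}
```

One campaign, one broad audience, and the only variable you're testing is the creative angle inside each ad set.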

Step 2: Implement the "3 New Creatives Per Week" Rule

This was the game-changer. Every single week, without fail, we produced and launched 3 new creative variations. Not minor tweaks - completely different angles, hooks, or visual approaches. This constant feed of fresh creative content gave Facebook's algorithm new signals to optimize around.

The creative variations included:

  • Different hooks in the first 3 seconds of video ads

  • Static images vs. video vs. carousel formats

  • Customer testimonials vs. product demonstrations

  • Problem-agitation-solution vs. direct product showcase

Step 3: Track Creative Performance, Not Audience Performance

I shifted all reporting and optimization focus from audience metrics to creative metrics. Instead of asking "which audience converts best?" we asked "which creative angle resonates most?" This changed how we analyzed data and made optimization decisions.

Each creative got tracked on a handful of metrics (a quick calculation sketch follows this list):

  • Hook rate (3-second video views as a share of impressions)

  • Engagement rate and quality of comments

  • Click-through rate to landing page

  • Cost per acquisition at the creative level
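None of this needs fancy tooling - the numbers fall straight out of a raw export. Here's a minimal sketch of the math, assuming hypothetical column names (impressions, video_3s_views, clicks, purchases, spend) from whatever report you pull; engagement quality stays a manual, qualitative check:

```python
# Minimal sketch: creative-level metrics from raw counts.
# Column names are assumptions about your export, not official field names.

def creative_metrics(row):
    impressions = row["impressions"]
    purchases = row["purchases"]
    return {
        "hook_rate": row["video_3s_views"] / impressions if impressions else 0.0,
        "ctr": row["clicks"] / impressions if impressions else 0.0,
        "cpa": row["spend"] / purchases if purchases else float("inf"),
    }

example = {"impressions": 12000, "video_3s_views": 3600,
           "clicks": 240, "purchases": 12, "spend": 300.0}
print(creative_metrics(example))
# {'hook_rate': 0.3, 'ctr': 0.02, 'cpa': 25.0}
```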

Step 4: Scale Winners, Kill Losers Fast

With 3 new creatives launching every week, we needed a systematic approach to identify winners and losers quickly. Any creative that didn't hit our CPA benchmark within 48-72 hours got paused immediately. Winners got increased budget allocation and became the foundation for new creative variations.

This approach meant we were constantly feeding the algorithm fresh, high-performing creative content while cutting off poor performers before they could drain budget.
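The rule itself is simple enough to write down. Here's a sketch of the decision logic - the target CPA and review window are illustrative numbers you'd swap for your own benchmarks:

```python
# Sketch of the kill/scale rule. TARGET_CPA and the review window are
# illustrative assumptions; plug in your own breakeven numbers.

TARGET_CPA = 30.0            # breakeven cost per acquisition (assumption)
REVIEW_WINDOW_HOURS = 48     # review each creative at the 48-72 hour mark

def decide(creative):
    """Return 'wait', 'pause', or 'scale' for a creative's running totals."""
    if creative["hours_live"] < REVIEW_WINDOW_HOURS:
        return "wait"                          # not enough data yet
    purchases = creative["purchases"]
    if purchases == 0 or creative["spend"] / purchases > TARGET_CPA:
        return "pause"                         # cut losers before they drain budget
    return "scale"                             # winners get budget + spawn new variations

print(decide({"hours_live": 60, "spend": 180.0, "purchases": 3}))  # CPA 60 -> 'pause'
print(decide({"hours_live": 72, "spend": 150.0, "purchases": 6}))  # CPA 25 -> 'scale'
```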

Testing Rhythm

Launch 3 new creative variations every week without fail. This constant refresh gives Facebook's algorithm new optimization signals and prevents creative fatigue before it kills performance.

Creative Angles

Focus on different emotional hooks: problem-focused, solution-focused, social proof, and lifestyle angles. Each angle appeals to different psychological triggers in your audience.

Algorithm Partnership

Work with Facebook's machine learning instead of against it. Broad audiences let the algorithm find the right people while strong creatives give it clear signals about what works.

Performance Tracking

Track creative-level metrics: hook rates, engagement quality, CTR, and CPA per creative. This data reveals what resonates with your actual audience versus theoretical targeting.

The transformation was immediate and measurable. Within the first month of implementing this creative-first approach, we saw significant improvements across all key metrics.

Our ROAS jumped from 2.5 to 4.2 within 30 days. More importantly, the performance became consistent week over week, instead of the roller coaster we experienced with audience-based testing. The algorithm had clear, strong creative signals to optimize around.

But here's what really surprised me: our best-performing audiences weren't the ones I would have chosen. The broad audience approach let Facebook find pockets of high-intent users that our manual targeting would have missed completely.

The creative testing revealed insights we never would have discovered through audience research. For example, our highest-converting creatives focused on convenience and time-saving, not the product features we thought were most important. User-generated content consistently outperformed polished brand content.

Within 3 months, this approach became our standard framework, and we scaled it across multiple product categories within their catalog. The key was maintaining that discipline of 3 new creatives every week - that's what kept the algorithm fed with fresh optimization signals.

Learnings

What I've learned and the mistakes I've made.

Sharing so you don't make them.

Looking back at this experience, here are the key lessons that completely changed how I approach Facebook advertising:

  1. Creative fatigue kills campaigns faster than bad targeting - Even your best-performing creative will stop working eventually. The 3-per-week rhythm prevents this.

  2. Broad audiences work better than detailed targeting - Let Facebook's algorithm do what it's designed to do: find people likely to convert based on your creative signals.

  3. The algorithm optimizes around creative quality, not audience sophistication - Strong creatives tell Facebook exactly who to target better than any manual audience research.

  4. Test angles, not demographics - Different emotional hooks and messaging angles reveal more about your audience than age/gender/interest data ever will.

  5. Speed beats perfection in creative testing - It's better to launch 3 "good enough" creatives than spend a week perfecting one. The market will tell you what works.

  6. User-generated content often outperforms brand content - Real customers talking about your product carries more weight than polished marketing messages.

  7. Track leading indicators, not just conversion metrics - Hook rates and engagement quality predict conversion performance better than impressions or reach.

The biggest mistake I made initially was treating Facebook ads like Google Ads - trying to control every aspect of who sees what. Facebook's strength is machine learning optimization, but it needs high-quality creative inputs to work effectively.

How you can adapt this to your Business

My playbook, condensed for your use case.

For your SaaS / Startup

For SaaS companies implementing this approach:

  • Focus on problem-solution creative angles rather than feature lists

  • Test customer success stories and case study formats

  • Create demo-style videos showing actual product usage

  • Use broad "business owner" or "startup founder" audiences

For your Ecommerce store

For e-commerce stores using this framework:

  • Test lifestyle vs product-focused creative angles

  • Create user-generated content campaigns for authentic social proof

  • Focus on benefit-driven hooks rather than feature descriptions

  • Use seasonal and trending content to maintain relevance

Get more playbooks like this one in my weekly newsletter