Growth & Strategy

Why I Turned Down a $XX,XXX Platform Project (And What I Told the Client Instead)


Personas: SaaS & Startup

Time to ROI: Short-term (< 3 months)

Last year, a potential client approached me with an exciting opportunity: build a two-sided marketplace platform for a substantial budget. The technical challenge was interesting, and it would have been one of my biggest projects to date.

I said no.

Here's why, and what this taught me about the real purpose of gathering user feedback in 2025.

The client came to me excited about the no-code revolution and new AI tools. They'd heard these tools could build anything quickly and cheaply. They weren't wrong: technically, you can build a complex platform with these tools.

But their core statement revealed the problem: "We want to see if our idea is worth pursuing." They had no existing audience, no validated customer base, no proof of demand. Just an idea and enthusiasm.

Most founders think gathering user feedback means building first, then asking. But after years of working with SaaS startups and watching countless product failures, I've learned the opposite is true. Here's what you'll discover:

  • Why your first MVP should take one day to build, not three months

  • The difference between feedback that validates and feedback that misleads

  • How to gather meaningful user insights before writing a single line of code

  • Why distribution comes before development (and how to test both)

  • The framework that saved my client $50,000+ in unnecessary development

Reality Check

The advice everyone gives about MVP feedback

The startup world is full of advice about gathering user feedback on your MVP. Here's what you'll typically hear from accelerators, blog posts, and "growth experts":

  1. Build a minimal viable product — Strip down your idea to core features

  2. Launch quickly — Get something out there in 30-90 days

  3. Gather user feedback — Use surveys, interviews, and analytics

  4. Iterate based on data — Improve the product based on what users tell you

  5. Find product-market fit — Keep tweaking until users love it

This conventional wisdom exists because it sounds logical and follows the lean startup methodology. It's what everyone teaches in business school and what successful companies talk about in retrospective blog posts.

The problem? This approach assumes your biggest risk is building the wrong features. But for most startups today, especially in the age of AI and no-code tools, building isn't the constraint anymore. The real constraints are knowing what to build and for whom.

Most founders spend 90% of their time building the product and 10% on marketing and audience building. It should be reversed. Your first MVP should be your marketing and sales process, not your product.

When you follow the traditional approach, you end up with a beautifully crafted solution to a problem nobody cares about, launched to an audience that doesn't exist. The feedback you gather is from the wrong people, asking the wrong questions, at the wrong time.

Who am I

Consider me your business accomplice.

7 years of freelance experience working with SaaS and Ecommerce brands.

When that client approached me about building their two-sided marketplace, everything seemed perfect on paper. They had a clear vision, adequate budget, and enthusiasm. But something felt off when they said: "We want to test if our idea works."

This wasn't my first rodeo with ambitious platform projects. I'd seen too many founders burn through budgets building sophisticated solutions that nobody wanted. The signs were all there:

  • No existing audience or community around the problem

  • No validated customer base or early adopters

  • No proof that people were currently solving this problem in a painful, manual way

  • Just market research and competitor analysis

Instead of taking the project, I challenged them with a hard truth: "If you're truly testing market demand, your MVP should take one day to build — not three months."

Yes, even with AI and no-code tools, building a functional two-sided platform takes significant time. But here's what most founders miss: your first MVP shouldn't be a product at all. Your MVP should be your distribution and validation process.

I've worked with enough SaaS startups to know that the technology has made building easier than ever. The challenge is knowing what to build and for whom. Distribution and validation come before development, always.

This mindset shift changed everything for how I approach client projects. Instead of jumping straight into development, I now start every project by asking: "How are you going to get your first 100 users?" If they can't answer that question, we're not ready to build anything.

My experiments

Here's my playbook

What I ended up doing and the results.

Rather than building their platform, I walked my client through what I call the "Day 1 MVP" framework — a systematic approach to gathering real user feedback before writing any code:

Week 1: Create the Simplest Possible Test
We started with a simple landing page explaining the value proposition. Not a product demo or an interactive prototype, just a clear explanation of what problem we were solving and for whom. It took 4 hours to build with a basic landing page tool.

Week 2-3: Manual Validation Process
Instead of building matching algorithms, we manually connected supply and demand. We reached out to potential users on both sides of the marketplace via LinkedIn, email, and industry forums. When someone expressed interest, we manually facilitated the connection via email and WhatsApp.

Week 4: Measure Real Demand
We tracked every interaction: How many people clicked through? How many filled out the interest form? How many actually followed through when we made a manual introduction? Most importantly — did they see enough value to ask when the "real platform" would be ready?

The Results Were Eye-Opening:
Out of 200 people who visited the landing page, 45 signed up for early access. We manually facilitated 12 connections. Only 3 of those connections resulted in actual business. When we asked those 3 people whether they'd pay for a platform to automate the process, 2 said no; they preferred the personal touch.
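To put those drop-offs side by side, here's a minimal sketch of the funnel math using the numbers above. The stage names are mine and the script is purely illustrative; a spreadsheet does the same job.

```python
# Conversion funnel from the four-week validation test described above.
funnel = [
    ("Landing page visitors", 200),
    ("Early-access signups", 45),
    ("Manual connections facilitated", 12),
    ("Connections that led to real business", 3),
]

visitors = funnel[0][1]
for (stage, count), (_, previous) in zip(funnel[1:], funnel):
    print(f"{stage}: {count} "
          f"({count / previous:.0%} of the previous stage, {count / visitors:.1%} of visitors)")
```

The point isn't the code: tracking every stage of the manual funnel is all the analytics a Day 1 MVP needs.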

That's when we realized the real insight: People didn't want a platform. They wanted a service. The manual process wasn't a bug — it was a feature. This completely changed our understanding of the opportunity.

Instead of building a $50,000 platform, my client pivoted to launching a boutique service business. They started making revenue in month 2 instead of month 12. The manual validation process became their actual business model.

Validation Timeline
Most meaningful feedback comes within the first 2 weeks of manual testing, not months of product development.

Pre-Product Research
Interview 10-15 people in your target market before building anything. Ask about current solutions, not your idea.

Manual Process First
Manually deliver your core value proposition before automating. This reveals what users actually value vs. what you think they need.

Real Money Test
The ultimate validation is when people pay for your manual process, not when they say they'd use your eventual product.

The framework delivered exactly what we needed — real user feedback without the development risk. Here's what happened:

Timeline Results:
Instead of spending 3-4 months building and then discovering the market, we validated (and pivoted) the concept in 4 weeks. My client started generating revenue 8 months earlier than if we'd followed the traditional MVP approach.

Financial Impact:
We saved approximately $50,000 in development costs that would have been wasted on the wrong product. More importantly, my client started earning revenue immediately instead of burning through their budget for months.

Quality of Feedback:
The manual process revealed insights that surveys and interviews never could. We learned that users valued the personal curation and relationship aspect more than the efficiency of automation. This insight completely reshaped the business model.

Unexpected Discovery:
The biggest surprise was that our "temporary" manual process became the actual product. What we thought was just a validation method turned into a sustainable, profitable business model. The platform we almost built would have eliminated the very value that customers were willing to pay for.

Learnings

What I've learned and the mistakes I've made.

Sharing so you don't make them.

This experience reinforced several principles I now share with every startup client:

  1. Distribution beats features every time — A mediocre product with great distribution will outperform a great product with no distribution

  2. Manual validation reveals truth — People lie in surveys and interviews, but behavior during manual processes tells the real story

  3. Your constraint isn't building — In the age of AI and no-code, the constraint is knowing what to build and for whom

  4. Revenue validates faster than feedback — When people pay for your manual process, you've found something worth building

  5. The "temporary" solution often becomes the product — Don't be so eager to automate that you eliminate the value customers actually want

  6. Time-to-revenue matters more than time-to-product — Getting paid validates your idea faster than getting the product built

  7. Start with the end in mind — If you can't explain how you'll get your first 100 customers, you're not ready to build anything

The most important lesson? Effective user feedback comes from observing behavior, not asking opinions. When you manually deliver value, you see exactly what users do, not what they say they'd do.

How you can adapt this to your business

My playbook, condensed for your use case.

For your SaaS / Startup

  • Start with landing page validation before building features

  • Manually onboard your first 10-20 users to understand their real workflow

  • Track activation metrics during the manual process to identify automation priorities (see the sketch after this list)

  • Use feedback from paying pilot customers, not free users
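To make that third point concrete, here's a minimal sketch of what tracking activation metrics during a manual process can look like. The users, step names, and timings below are hypothetical; the idea is simply to log every manual action per user so you can see which steps are the most frequent and the most expensive to keep doing by hand.

```python
from collections import defaultdict

# Hypothetical log of manual onboarding work: (user, step, minutes spent).
manual_log = [
    ("user_01", "import data", 40),
    ("user_01", "configure workspace", 15),
    ("user_02", "import data", 55),
    ("user_02", "walkthrough call", 30),
    ("user_03", "import data", 35),
]

# Tally how often each manual step happens and how much time it eats in total.
totals = defaultdict(lambda: {"count": 0, "minutes": 0})
for _, step, minutes in manual_log:
    totals[step]["count"] += 1
    totals[step]["minutes"] += minutes

# Steps that are both frequent and slow are the first candidates for automation.
for step, stats in sorted(totals.items(), key=lambda kv: kv[1]["minutes"], reverse=True):
    print(f"{step}: {stats['count']}x, {stats['minutes']} min total")
```

Whatever sits at the top of that list is your first automation candidate.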

For your Ecommerce store

  • Test product-market fit with manual fulfillment before building inventory systems

  • Gather feedback from customers who complete purchases, not just browsers

  • Manually curate product recommendations before building AI algorithms

  • Focus on repeat purchase behavior as the ultimate validation metric

Get more playbooks like this one in my weekly newsletter