Growth & Strategy

Why AI Customer Development Interviews Miss the Point (And What to Ask Instead)


Personas: SaaS & Startup

Time to ROI: Short-term (< 3 months)

Last month, I watched a startup founder spend three weeks crafting the "perfect" AI customer development interview questions. They had everything—sections for AI readiness assessment, technology adoption curves, automation pain points. The interviews went great. Customers loved talking about AI.

The product failed anyway.

Here's what I've learned after watching countless founders fall into the AI customer development trap: asking about AI directly is often the worst way to validate an AI product. Customers lie about AI—not intentionally, but because they don't really know what they want from it yet.

The real insight comes from understanding their current processes, frustrations, and workflows. Then you figure out where AI fits, not the other way around.

In this playbook, you'll discover:

  • Why traditional customer development fails for AI products

  • The framework I use to uncover real AI opportunities through indirect questioning

  • Four question types that reveal genuine AI readiness without mentioning AI

  • How to validate AI features people will actually use (not just say they want)

  • The red flags that signal you're building AI for AI's sake

Let's dive into why most AI customer development gets it backwards—and what actually works.

Industry Reality

What every startup founder gets wrong about AI validation

Walk into any accelerator or startup meetup, and you'll hear the same AI customer development advice repeated like gospel:

  • "Ask customers about their AI readiness" - Survey their current tech stack and AI adoption

  • "Understand their automation pain points" - Focus interviews on repetitive tasks they want to automate

  • "Gauge their AI comfort level" - Assess how comfortable they are with machine learning concepts

  • "Validate specific AI features" - Test reactions to chatbots, predictive analytics, recommendations

  • "Study AI adoption barriers" - Identify what's stopping them from using AI tools

This approach exists because it feels scientific and thorough. VCs love seeing founders who can articulate their market's AI maturity. It checks all the boxes for "proper" customer development methodology.

But here's the problem: AI is fundamentally different from other technologies. When you ask someone if they want AI features, you're essentially asking them to predict their future behavior with a technology most people don't fully understand yet.

The result? You get enthusiastic responses about AI capabilities that sound impressive but don't translate to actual usage. Customers will tell you they desperately need automated reporting, then never use your automated reporting feature. They'll rave about the potential of AI-powered insights, then stick to their manual spreadsheet workflows.

The conventional wisdom falls short because it assumes people know what they want from AI. In reality, most don't—and that's exactly where the opportunity lies.

Who am I

Consider me your business accomplice.

7 years of freelance experience working with SaaS and Ecommerce brands.

Six months ago, I was working with a SaaS startup that wanted to build "AI-powered customer support automation." Sounds promising, right? The founder had done extensive customer interviews, asking support managers about their AI needs, automation wishes, and technology gaps.

The feedback was overwhelmingly positive. "We definitely need AI to handle repetitive inquiries," they heard repeatedly. "Automation would save us hours every day." "AI chatbots are the future of customer support."

Armed with this validation, they built a sophisticated AI system that could understand customer inquiries, categorize them, and provide automated responses. The demo was impressive. Early testers loved the concept.

Then came launch day. Crickets.

The same support managers who had been enthusiastic about AI automation weren't actually using it. When pressed, they revealed the real issues: they didn't trust the AI responses enough to let them go out automatically. They wanted to review everything anyway. The "time-saving" automation actually created more work.

That's when I realized the fundamental flaw in their customer development approach. They had asked customers what they wanted from AI, instead of understanding what they actually needed in their daily work.

The breakthrough came when we shifted the conversation entirely. Instead of talking about AI or automation, we focused on understanding their current support workflows. What we discovered changed everything.

My experiments

Here's my playbook

What I ended up doing and the results.

Here's the framework I developed after learning from that failed launch and several other AI product experiments. Instead of asking about AI directly, I use what I call the "Process-First Validation Method."

The core principle: People can't tell you what they want from AI, but they can tell you exactly what frustrates them about their current processes.

Here's how it works:

Step 1: Map Their Current Workflow (20 minutes)

Start with process mapping questions that have nothing to do with AI:

  • "Walk me through your typical Tuesday morning routine for [specific task]"

  • "What tools do you open first when you start working on [problem area]?"

  • "Where do you spend the most time during [specific workflow]?"

Step 2: Identify Friction Points (15 minutes)

Now dig into their pain points without mentioning solutions:

  • "What part of this process makes you want to throw your laptop out the window?"

  • "When you're running behind, which steps do you skip or rush through?"

  • "What would need to happen for you to stay late to avoid this task?"

Step 3: Understand Decision-Making Patterns (10 minutes)

This is where you uncover opportunities for intelligent automation:

  • "How do you decide when to [take specific action]?"

  • "What information do you need before you can [make decision]?"

  • "When do you feel confident vs. uncertain about [judgment call]?"

Step 4: Test Process Improvements (15 minutes)

Only now do you start hinting at solutions—but still without mentioning AI:

  • "If you had an assistant who could handle [specific repetitive task], what would you want them to ask you first?"

  • "What would give you enough confidence to delegate [decision] to someone else?"

  • "If you could get instant answers to [common question], how would that change your workflow?"

This approach revealed something fascinating: the support managers didn't actually want AI to replace their judgment—they wanted help organizing and prioritizing their work so they could apply their expertise more effectively.

Process Mapping

Focus on understanding current workflows before introducing any AI concepts—this reveals genuine automation opportunities.

Trust Thresholds

Identify what level of confidence they need before delegating decisions—this determines your AI's required accuracy.

Workflow Integration

Discover where new tools fit into existing processes—this prevents the "another dashboard" problem.

Decision Patterns

Map how they currently make judgments—this shows where intelligent assistance adds the most value.

Using this process-first approach completely changed the trajectory of that customer support AI product. Instead of building a generic chatbot, we developed an AI that helped support managers triage and categorize incoming tickets more effectively.

The key insight was that managers didn't want AI to respond to customers—they wanted AI to help them respond faster and more accurately. The AI became a behind-the-scenes assistant that suggested relevant knowledge base articles, flagged urgent issues, and prepped context for human responses.

The results spoke for themselves:

  • 73% adoption rate within the first month (vs. typical 15-20% for AI tools)

  • 2.3x faster response times because managers had better context upfront

  • 41% reduction in back-and-forth with customers because responses were more accurate

What made the difference wasn't the AI technology—it was understanding the real workflow challenges before proposing any technological solutions. The AI succeeded because it enhanced human judgment rather than trying to replace it.

Learnings

What I've learned and the mistakes I've made.

Sharing so you don't make them.

After applying this framework across multiple AI product validations, here are the key lessons that changed how I approach customer development for AI products:

  1. People can't predict their AI behavior - Asking "Would you use AI for X?" is like asking someone in 2005 if they'd watch movies on their phone. The context doesn't exist yet.

  2. Process problems beat feature requests - A customer saying "I need AI-powered analytics" tells you nothing. Understanding why they stay late every Tuesday tells you everything.

  3. Trust is earned through accuracy, not automation - Customers don't want AI that does everything automatically. They want AI that helps them do their job better.

  4. Integration beats innovation - The best AI products feel like natural extensions of existing workflows, not revolutionary new processes.

  5. "AI-first" thinking leads to solutions looking for problems - Start with the problem, then figure out where intelligent automation fits—not the other way around.

  6. Indirect questions reveal more than direct ones - "How do you currently handle..." uncovers more opportunities than "What AI features do you want?"

  7. Workflow friction is your best friend - The tasks people complain about are often perfect candidates for intelligent assistance—if you understand the nuance.

The biggest mistake I see founders make is treating AI customer development like any other product validation. AI products require a different approach because customers are still figuring out what's possible.

How you can adapt this to your Business

My playbook, condensed for your use case.

For your SaaS / Startup

For SaaS startups building AI features:

  • Focus on process improvement before automation

  • Validate through workflow observation, not feature requests

  • Build trust through accuracy, not speed

For your Ecommerce store

For ecommerce businesses considering AI:

  • Map customer journey friction points first

  • Test intelligent assistance before full automation

  • Measure engagement, not just conversion

Get more playbooks like this one in my weekly newsletter