
MVP User Interviews in 2026: Customer Discovery That Prevents Expensive Mistakes

Most founders do user interviews wrong. A practical guide to customer discovery that uncovers real needs, not polite validation.

15 min · January 15, 2026 · Updated January 27, 2026

TL;DR

  • Most founders “do interviews wrong” without realizing it—they pitch instead of listen, ask about intentions instead of behaviors.
  • Create an interview guide before starting: define research objectives, core questions, and what you’re testing.
  • Focus on past behaviors and current pain, not future intentions—“Would you pay for X?” is nearly worthless.
  • The Mom Test: talk about their life, not your idea. Ask about specifics, not hypotheticals.
  • Classify your idea: Vitamin (nice-to-have), Painkiller (solves immediate pain), or Antibiotic (prevents critical failure). Only painkillers and antibiotics are fundable.
  • Conduct 20–30 interviews minimum before making product decisions—patterns emerge around interview 15.
  • Debrief after every interview: what surprised you? What contradicted your assumptions?

Why User Interviews Matter More Than Ever

In the capital-constrained environment of 2026, building the wrong thing is unaffordable. The era of “build it and they will come” is definitively over. Rigorous customer discovery isn’t optional—it’s the difference between startups that survive and those that burn through runway on products nobody wants.

Yet most founders conduct interviews poorly. They talk to dozens of potential customers but extract little substance due to lack of process, confirmation bias, and the human tendency to be polite rather than honest.

The uncomfortable truth: casual validation with friends and family doesn’t count as research. Neither does a survey with leading questions. Neither does interpreting silence as agreement.

The Problem-First Framework

Before diving into interview techniques, understand what you’re validating:

Vitamin vs. Painkiller vs. Antibiotic

| Type | Definition | Example | Fundability |
| --- | --- | --- | --- |
| Vitamin | Nice to have, makes life slightly better | Another note-taking app | Difficult in 2026 |
| Painkiller | Solves an immediate, felt pain | Tool that automates a tedious weekly task | Fundable with traction |
| Antibiotic | Prevents critical failure or loss | Security tool that prevents breaches | Highly fundable |

Your interviews should reveal which category your idea falls into—not confirm the category you hope it is.

Defining Your Niche of One

The “Ideal Customer Profile” is often too broad. Instead, find your “niche of one”:

  • Not “small business owners” → “Solo consultants billing $150K+ who struggle with client contracts”
  • Not “developers” → “Backend engineers at Series B startups who maintain critical payment systems”
  • Not “busy professionals” → “Working parents at tech companies with 2+ kids under 10”

The narrower your initial focus, the more actionable your interview insights.

The Four Laws of Effective Interviews

Law 1: Create an Interview Guide

Never wing it. Before your first interview, document:

Research objectives (what you’re testing):

  • Is [problem X] actually painful?
  • How do people currently solve this?
  • What would make a solution worth paying for?
  • What are the constraints (budget, time, integrations)?

Core questions (your standard question set):

  • Tell me about your role and typical day
  • What’s the most frustrating part of [activity]?
  • Walk me through the last time you encountered [problem]
  • What did you try? What worked? What didn’t?
  • If this problem disappeared tomorrow, what would change?

Segments to cover (ensure diversity):

  • 5+ interviews per distinct user segment
  • Include both potential power users and casual users
  • Include people who tried competitors

Law 2: Focus on Behaviors, Not Intentions

Never ask:

  • “Would you use this?”
  • “Would you pay for this?”
  • “Do you think this is a good idea?”

These questions get polite affirmation, not truth.

Instead ask:

  • “Tell me about the last time you faced this problem”
  • “What did you actually do?”
  • “How much did you spend on solutions?”
  • “How much time did you lose?”

Past behavior predicts future behavior. Intentions are aspirational and unreliable.

Law 3: Listen More Than You Talk (The Mom Test)

The Mom Test framework: talk about their life, not your idea.

Bad interview (fails the Mom Test):

“We’re building an app that helps busy professionals schedule their day using AI. What do you think?”

This invites validation. Even your mom would say “sounds interesting.”

Good interview (passes the Mom Test):

“What’s the first thing you do when you start your work day?”
“How do you decide what to work on first?”
“Tell me about a time when your day got derailed. What happened?”

This surfaces real problems without anchoring to your solution.

The 80/20 rule: The interviewee should talk 80% of the time. You talk 20% max.

Law 4: Debrief and Reflect

After every interview, capture:

  1. Top 3 surprises: What didn’t match your expectations?
  2. Key quotes: Exact words that captured insights
  3. Contradictions: What conflicted with previous interviews?
  4. Gaps: What didn’t you ask that you should have?
  5. Hypothesis updates: How should your assumptions change?

Don’t just collect data—synthesize meaning.

The Interview Structure

Before the Interview (5 minutes)

  • Review the interview guide
  • Note what you want to learn from this specific person
  • Clear your mind of assumptions
  • Prepare recording (with permission)

Opening (3–5 minutes)

Build rapport, set context:

  • Thank them for their time
  • Explain the purpose (researching the problem space, not selling)
  • Confirm recording permission
  • Set expectations (30–45 minutes, focused conversation)

Core Questions (25–35 minutes)

Follow your guide, but adapt based on responses:

Context questions (5 minutes):

  • “Tell me about your role”
  • “What does a typical week look like?”

Problem exploration (15 minutes):

  • “What’s the most frustrating part of [area]?”
  • “Walk me through the last time [problem occurred]”
  • “What did you try? What worked?”

Current solutions (10 minutes):

  • “How do you handle this today?”
  • “What tools/processes do you use?”
  • “What’s missing from current solutions?”

Value questions (5 minutes):

  • “If this problem disappeared, what would change?”
  • “How much time/money does this cost you?”
  • “What would a solution need to have?”

Closing (5 minutes)

  • “Is there anything I should have asked but didn’t?”
  • “Who else should I talk to about this?”
  • “Can I follow up if I have more questions?”
  • Thank them, offer to share findings

Analyzing Interview Data

Pattern Recognition

After 15–20 interviews, patterns emerge:

| Signal | Interpretation |
| --- | --- |
| Same problem mentioned by 10+ people | Real pain, worth solving |
| Varied solutions being used | Market exists, opportunity for a better approach |
| High emotional language (“hate,” “nightmare,” “obsessed”) | Strong motivation to switch |
| Specific dollar/time costs | Quantifiable value proposition |
| Requests for specific features | Roadmap guidance |
| “I would pay X for this” | Weak signal (verify with behavior) |
| Already paying competitors | Strong signal (willingness to pay proven) |
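The “same problem mentioned by 10+ people” signal is easy to make concrete by tagging each interview’s pains and tallying them across the batch. A minimal sketch (the tags, sample data, and the half-the-sample threshold are illustrative assumptions, not a prescribed method):

```python
from collections import Counter

# Each interview's notes reduced to short problem tags (illustrative data)
interviews = [
    {"id": 1, "pains": ["manual reporting", "tool sprawl"]},
    {"id": 2, "pains": ["manual reporting"]},
    {"id": 3, "pains": ["manual reporting", "slow approvals"]},
]

def pain_frequency(interviews):
    """Count how many interviews mention each pain (at most once per interview)."""
    counts = Counter()
    for interview in interviews:
        counts.update(set(interview["pains"]))  # dedupe within one interview
    return counts

counts = pain_frequency(interviews)

# Flag pains mentioned by at least half the sample as candidate real pains
threshold = len(interviews) / 2
real_pains = [pain for pain, n in counts.items() if n >= threshold]
print(real_pains)  # → ['manual reporting']
```

Counting each pain at most once per interview matters: one person ranting about the same issue five times is one data point, not five.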

Creating Synthesis Documents

After your interview batch:

Problem Statement:

[User type] struggles with [problem] because [root cause], resulting in [consequence]. Current solutions fail because [gap].

Evidence:

  • Quote 1 (source)
  • Quote 2 (source)
  • Quote 3 (source)

Implications for product:

  • Must have: [feature]
  • Nice to have: [feature]
  • Avoid: [anti-pattern]

Common Interview Mistakes

Mistake 1: Pitching Your Idea

Symptoms: You’re talking more than listening, explaining how your product works.
Fix: Don’t mention your product until the last 5 minutes (if at all).

Mistake 2: Leading Questions

Symptoms: Questions that contain the answer you want (“Don’t you think X is frustrating?”).
Fix: Use open-ended questions; let them define the problem.

Mistake 3: Accepting Compliments

Symptoms: “That’s a great idea!” → you feel validated.
Fix: Compliments aren’t data. Ask about behavior: “Have you tried to solve this? What happened?”

Mistake 4: Talking to Friends

Symptoms: All interviewees know you and want to be supportive.
Fix: 80%+ of interviews should be with strangers who have no loyalty to you.

Mistake 5: Stopping Too Early

Symptoms: After 5 interviews it feels like patterns have emerged, so you stop.
Fix: Do a minimum of 20 interviews. Patterns are clearer at 15+, and edge cases emerge at 20+.

Mistake 6: Ignoring Contradictions

Symptoms: One person says X, another says not-X, and you believe the one you wanted to hear.
Fix: Contradictions are information. Dig deeper: what’s different about these users?

Finding Interview Candidates

Cold Outreach (LinkedIn, Twitter)

Template:

Hi [Name], I’m researching how [role type] handle [problem area]. Not selling anything—just learning. Would you share 25 minutes about your experience? Happy to share findings after.

Expect 10–20% response rate. Send 100 messages to get 15 interviews.
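The outreach math above can be sketched as a quick planning calculation. The reply rate is an assumption you should replace with your own data, and for simplicity this treats every positive reply as a booked interview:

```python
import math

def messages_needed(target_interviews, reply_rate_pct=15):
    """Estimate cold messages required to land a target number of interviews.

    reply_rate_pct is an assumed percent reply rate (10-20 is typical for
    well-targeted, non-salesy outreach); every reply is assumed to convert
    to an interview, so treat the result as a lower bound.
    """
    return math.ceil(target_interviews * 100 / reply_rate_pct)

# 15 interviews at an assumed 15% reply rate
print(messages_needed(15))  # → 100

# The same target at the pessimistic end of the range
print(messages_needed(15, reply_rate_pct=10))  # → 150
```

In practice some replies go cold before a call is booked, so padding the result by 20–30% is a reasonable hedge.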

Community Mining

  • Reddit threads about your problem
  • Slack/Discord communities in your space
  • Quora answers about related topics
  • Twitter discussions about pain points

Customer of Competitors

  • Review sites (G2, Capterra, TrustRadius)
  • App store reviews (look for complaints)
  • LinkedIn searches for competitor users

Referral Chains

After each interview: “Who else should I talk to?” One interview often leads to 2–3 more.

Interview Artifacts

The Interview Repository

Create a searchable database:

| Field | Example |
| --- | --- |
| Interviewee | Jane D., Product Manager, FinTech |
| Date | 2026-01-15 |
| Segment | Mid-market B2B SaaS |
| Top pain | Manual reporting takes 4 hours/week |
| Current solution | Excel + Google Sheets |
| Competitor tried | Looked at Tableau, too expensive |
| WTP signal | “Would pay $200/mo for automation” |
| Notable quote | “I dread Monday mornings because of reporting” |
| Follow-up? | Yes, interested in beta |
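A spreadsheet works fine for this, but if you prefer code, one lightweight way to make the repository searchable is a list of records plus a keyword filter. This is a sketch only; the field names mirror the table above and the sample record is illustrative:

```python
from dataclasses import dataclass, asdict

@dataclass
class InterviewRecord:
    interviewee: str
    date: str
    segment: str
    top_pain: str
    current_solution: str
    competitor_tried: str
    wtp_signal: str
    notable_quote: str
    follow_up: bool

def search(records, keyword):
    """Return records where the keyword appears in any field, case-insensitively."""
    kw = keyword.lower()
    return [r for r in records
            if any(kw in str(value).lower() for value in asdict(r).values())]

repo = [
    InterviewRecord("Jane D., Product Manager, FinTech", "2026-01-15",
                    "Mid-market B2B SaaS", "Manual reporting takes 4 hours/week",
                    "Excel + Google Sheets", "Looked at Tableau, too expensive",
                    "Would pay $200/mo for automation",
                    "I dread Monday mornings because of reporting", True),
]

print([r.interviewee for r in search(repo, "reporting")])
```

Searching across all fields rather than one column is the point: a pain like “reporting” may show up in the quote for one interviewee and the top-pain field for another.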

Synthesis Outputs

  • Problem hierarchy: Ranked list of pains by frequency and intensity
  • User personas: Based on interview patterns, not imagination
  • Journey maps: How users actually navigate the problem space
  • Opportunity map: Where current solutions fail

FAQ

How many interviews is enough?

Minimum 20 for a new product, 10 for a feature. You’ll feel like patterns emerge at 5—they’re often wrong. Real patterns crystallize around 15.

Should I record interviews?

Yes, with permission. Recordings let you focus on the conversation and review later for exact quotes. Use automated transcription (Otter, Grain).

What if no one has my problem?

That’s a valid finding. Either you’re targeting the wrong segment, or the problem isn’t as painful as you thought. Pivot your focus or your solution, not your research methodology.

How do I ask about pricing?

Don’t ask “What would you pay?” Ask “How much does this problem cost you in time/money?” Ask “What do you pay for similar tools?” Ask “Tell me about a purchase decision you made recently for a tool.”

Should interviews inform MVP features?

Yes, but carefully. Interviews reveal problems and constraints. They don’t reveal optimal solutions—that’s your job as the builder. Users can tell you what hurts; they often can’t tell you what will heal.

When do I stop interviewing?

When new interviews stop revealing new insights. When you can predict what someone will say based on their profile. When contradictions are explainable by segment differences.

