The biggest mistake founders make is building something nobody wants. We were determined not to be those founders. Before writing a single line of code for Reel Reviews—our fishing spot discovery app—we spent two months validating that the problem was real, the solution was desirable, and anglers would actually pay for it. This is the complete playbook of how we did it, the techniques we used, and the lessons we learned along the way.
Why Idea Validation Matters
According to CB Insights, 35% of startups fail because there's no market need for their product, putting lack of demand ahead of competition and product quality among the most commonly cited causes of failure. The harsh reality is that most startups don't fail because they couldn't build something—they fail because they built something nobody wanted to buy.
Idea validation is the process of testing your assumptions about a business idea before investing significant time, money, and emotional energy into building it. It's about de-risking your venture by gathering evidence that your target customers actually have the problem you think they have, that they're actively looking for solutions, and that they'd be willing to pay for your particular approach to solving it.
🎯 The Validation Mindset
Your goal isn't to prove your idea is great—it's to try to kill it. If you can't kill it after rigorous testing, you might be onto something. Fall in love with the problem, not the solution.
Customer Development: The Foundation of Validation
Customer development, popularized by Steve Blank in his book "The Four Steps to the Epiphany," is the methodology we used to validate Reel Reviews. It's a scientific approach to discovering and validating who your customers are, what problems they have, and whether your solution actually solves those problems.
The core principle is simple: get out of the building and talk to real people. Not your friends, not your family, not people who will tell you what you want to hear. You need to talk to strangers who represent your target market and have honest conversations about their pain points, behaviors, and desires.
Step 1: Problem Discovery Interviews
We started with problem discovery interviews. Our goal wasn't to pitch Reel Reviews—it was to understand how anglers currently find fishing spots, what frustrates them about existing solutions, and whether they even perceived this as a problem worth solving.
Over two weeks, we interviewed 50 anglers: in person at boat ramps and fishing shops, and remotely through online forums. We used open-ended questions to encourage storytelling:
- "Walk me through the last time you planned a fishing trip. What did you do first?"
- "Tell me about a time you went to a spot and it was a complete waste of time. What happened?"
- "How do you currently decide where to fish when you're visiting a new area?"
- "What tools, apps, or resources do you use for fishing trip planning? What do you love and hate about each?"
- "Have you ever paid for fishing information? Tell me about that experience."
The patterns that emerged were striking. We heard the same frustrations repeatedly: existing review platforms were flooded with outdated information, 5-star ratings were meaningless without context, and anglers craved details about conditions, timing, and what actually worked—not just generic "great spot!" comments.
Step 2: Solution Validation
Once we understood the problem deeply, we moved to solution validation. We created simple mockups and described our proposed solution: an app with verified, recent reviews that included specific details about weather, season, bait used, and actual photos from the day of fishing.
We showed these concepts to a new batch of 30 anglers and asked for their reactions. Crucially, we didn't ask "Would you use this?"—people are notoriously bad at predicting their future behavior. Instead, we asked:
- "Thinking back to your last trip, how would this have changed what you did, if at all?"
- "What would make you choose this over [current solution they mentioned]?"
- "Is there anything missing that would be a dealbreaker?"
- "How much would you expect something like this to cost?"
The Landing Page Test: Validating Demand at Scale
Customer interviews gave us qualitative confidence, but we needed quantitative validation. We needed to know if our value proposition would resonate at scale and whether we could actually attract customers cost-effectively.
Building the Smoke Test
We built a simple landing page using Carrd that described Reel Reviews as if it already existed. It featured compelling copy, a few mockup screenshots, and a clear call-to-action: "Get early access—join the waitlist."
The key was that we were transparent. The page clearly stated we were in development and collecting interest. We weren't pretending to have a finished product—we were validating that people wanted what we were planning to build.
Running Paid Traffic
We created Facebook ad campaigns targeting fishing enthusiasts in New Zealand, aged 25-65, interested in brands like Shimano, Daiwa, and local fishing publications. We ran three ad variants with different value propositions:
- Ad A: Focused on finding hidden gem fishing spots
- Ad B: Focused on avoiding wasted trips with verified reviews
- Ad C: Focused on connecting with a community of serious anglers
Over two weeks, we spent $500 on ads and achieved a 12% click-through rate—exceptionally high for cold traffic. The "avoiding wasted trips" angle performed best, validating that the pain of bad information was stronger than the promise of discovery.
Most importantly, 8% of landing page visitors signed up for the waitlist. In the world of landing page testing, anything above 2-3% is considered promising. At 8%, we had strong evidence that our value proposition resonated.
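Smoke-test conversion rates come from small samples, so it helps to attach a confidence interval before trusting them. A minimal sketch in Python using the Wilson score interval (the visitor and signup counts here are hypothetical—the post reports rates, not raw numbers):

```python
import math

def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score confidence interval for a conversion rate (z=1.96 gives ~95%)."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    margin = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - margin, center + margin

# Hypothetical: 48 waitlist signups from 600 landing-page visitors (8%).
low, high = wilson_interval(48, 600)
print(f"conversion: 8.0%, 95% CI: {low:.1%} to {high:.1%}")
```

Even with only a few hundred visitors, the interval's lower bound sits comfortably above the 2-3% benchmark, which is what makes a signal like this credible rather than lucky.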
The Concierge MVP: Testing Without Building
Here's where most founders go wrong: they interpret validation signals as permission to build. But there's a crucial intermediate step—the Concierge Minimum Viable Product (MVP). Popularized by Eric Ries in "The Lean Startup," a concierge MVP delivers your product's value proposition manually to a small group of customers.
For Reel Reviews, we manually matched 20 beta testers with fishing spots based on their preferences. No app, no algorithm, no automated matching—just emails, phone calls, and manual research. We asked each participant about their fishing preferences, target species, location constraints, and skill level. Then we personally curated spot recommendations and followed up after their trips to gather feedback.
What We Learned From the Concierge MVP
The concierge MVP taught us lessons we never would have learned through surveys or landing pages:
- People don't trust 5-star reviews—they want to know what went wrong, not just what went right. Negative reviews with context were actually more valuable than glowing praise.
- Location matters more than anything else—anglers would drive past a 5-star spot to get to a 3-star spot that was closer to home. Proximity and convenience trumped quality in many cases.
- Photos are non-negotiable—every single participant asked for recent photos. Stock images or photos from different seasons were worse than no photos at all.
- Timing is critical information—knowing a spot produces fish is useless without knowing when. Season, time of day, tide, and weather conditions were essential context.
- Community features were less important than we thought—we assumed social features would drive engagement, but anglers primarily wanted reliable information, not networking.
These insights fundamentally shaped our product roadmap. We deprioritized community features in favor of robust filtering, recent photo verification, and detailed condition reporting.
Advanced Validation Techniques
Beyond the core methods above, we employed several advanced validation techniques that provided additional confidence:
Fake Door Testing
On our landing page, we included buttons for premium features that didn't exist yet—like "Download Premium Spot Guide" and "Book a Local Guide." When users clicked, they saw a message explaining the feature was coming soon. We tracked these clicks to understand which premium features generated the most interest.
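If you instrument fake-door buttons yourself, the analysis is just counting click events per feature—ideally deduplicated by user so repeat clicks don't inflate interest. A sketch with made-up event records (in practice these would come from your analytics tool):

```python
from collections import Counter

# Hypothetical fake-door click log: one record per button click.
events = [
    {"user": "u1", "button": "premium_spot_guide"},
    {"user": "u2", "button": "premium_spot_guide"},
    {"user": "u2", "button": "book_local_guide"},
    {"user": "u3", "button": "premium_spot_guide"},
]

# Total clicks per fake-door feature.
clicks = Counter(e["button"] for e in events)

# Unique users per feature, so one enthusiastic clicker counts once.
unique_users = {
    btn: len({e["user"] for e in events if e["button"] == btn})
    for btn in clicks
}

print(clicks.most_common())  # most-clicked features first
print(unique_users)
```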
Pricing Experiments
We ran A/B tests on our landing page showing different price points: $4.99/month, $9.99/month, and $49.99/year. We weren't actually charging anyone—we just wanted to gauge price sensitivity. Surprisingly, the annual plan had the highest click-through rate, suggesting anglers preferred a single yearly payment over an ongoing monthly charge.
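Before declaring a winning price variant, it's worth checking whether the difference could be noise. A minimal two-proportion z-test in pure Python (the click counts are hypothetical—the post doesn't report raw numbers):

```python
import math

def two_proportion_z(x1: int, n1: int, x2: int, n2: int) -> float:
    """z-statistic for the difference between two conversion rates.
    |z| > 1.96 is significant at the 5% level (two-sided)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical: annual plan clicked 30/200 views vs monthly 18/200.
z = two_proportion_z(30, 200, 18, 200)
print(f"z = {z:.2f}")  # ~1.85: suggestive, but not significant at 5%
```

With smoke-test traffic, samples are usually too small for significance; treat the winner as a direction to probe in follow-up interviews, not as proof of willingness to pay.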
Competitor Analysis Interviews
We interviewed users of competing apps like Fishbrain and Navionics. We asked why they signed up, what they use regularly, what frustrates them, and what would make them switch. This helped us identify differentiation opportunities and potential partnership angles.
The Result: Validated and Ready to Build
After two months of validation work, we had:
- 80+ customer interviews with detailed notes and patterns
- A landing page with an 8% conversion rate and 200 people on the waitlist
- 20 concierge MVP participants with 90% satisfaction scores
- Clear validation of our core value proposition
- A refined product roadmap based on real user feedback
- Confidence that anglers would pay for a premium solution
When we finally launched Reel Reviews, we had 200 people on the waitlist who had already validated the concept. Our first month retention was 65%—well above the industry average of 20-25% for similar apps. Our net promoter score was 72, indicating strong product-market fit.
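For reference, NPS is computed from 0-10 "how likely are you to recommend" responses: the share of promoters (9-10) minus the share of detractors (0-6), with passives (7-8) counted in the total but in neither group. A quick sketch with made-up survey responses:

```python
def nps(scores: list[int]) -> int:
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Hypothetical 50-response survey: 38 promoters, 10 passives, 2 detractors.
responses = [10] * 25 + [9] * 13 + [8] * 10 + [5] * 2
print(nps(responses))  # → 72
```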
Key Takeaways for Your Validation Journey
If you're thinking about building a product, here's what we learned:
- Start with conversations, not code. Talk to 50 potential customers before you write a single line. The insights you gain will shape everything that follows.
- Test value propositions, not features. People don't buy features—they buy solutions to problems. Test whether your solution resonates before worrying about how you'll build it.
- Use landing pages for scale validation. Interviews give you depth; landing pages give you breadth. Both are necessary for complete validation.
- Try the concierge approach. Deliver your value manually before automating it. You'll learn 10x more for 1/10th the cost of building software.
- Fall in love with the problem, not your solution. Be willing to pivot if validation reveals a better opportunity. Your ego is less important than building something people actually want.
- Look for pattern density, not outliers. One person saying they'd pay doesn't validate your idea. Ten people saying the same thing in different words does.
The lesson is clear: Talk to customers before you build. It's uncomfortable, time-consuming, and sometimes humbling. But it's the difference between success and failure. In a world where most startups fail because nobody wants their product, validation isn't optional—it's essential.
References and Further Reading
- Blank, S. (2005). The Four Steps to the Epiphany: Successful Strategies for Products that Win. K&S Ranch.
- Ries, E. (2011). The Lean Startup: How Today's Entrepreneurs Use Continuous Innovation to Create Radically Successful Businesses. Crown Business.
- CB Insights (2023). The Top 20 Reasons Startups Fail. CB Insights Research.
- Maurya, A. (2012). Running Lean: Iterate from Plan A to a Plan That Works. O'Reilly Media.
- Fitzpatrick, R. (2013). The Mom Test: How to Talk to Customers & Learn If Your Business Is a Good Idea When Everyone Is Lying to You. CreateSpace.
- Torres, T. (2016). The What & Why of Continuous Discovery. Product Talk.
- Cagan, M. (2018). Inspired: How to Create Tech Products Customers Love. Wiley.
- Perelman, D. (2021). The Product Manager Class No One Asked For. First Round Review.