Evidence-Based Strategy
Most roadmaps I see are treated like contracts. Teams build them in Q4, present them to leadership, get approval, and then spend the next year trying to deliver what's written down.
When reality diverges from the plan — and it always does — teams either bend the truth to fit the roadmap or quietly ignore it while pretending they're still on track.
This isn't strategy. It's theater.
Real strategy adapts when evidence changes. It's not a document you write once; it's a habit you practice constantly.
The best product teams I know don't have better long-term vision than everyone else. They have tighter feedback loops.
They're constantly testing their assumptions at the smallest possible scale. Before they commit to building anything substantial, they run cheap experiments: a conversation with five customers, a landing page, a prototype that takes two days instead of two months.
They're not trying to predict the future. They're trying to learn faster than their competitors.
The key is knowing what question you're trying to answer. Not something as vague as "Will users like this feature?" but "Will users who currently do X manually pay $50/month to automate it?" That question is testable: you can design an experiment that gives you a clear answer in a week.
You just need to define your thresholds upfront: What result would make you confident enough to invest more? What result would make you kill the idea?
If you can't answer those questions before you run the test, you're not really testing; you're just gathering evidence to confirm what you already believe.
I've seen this repeatedly across teams I've worked with: they run dozens of experiments and learn nothing, because they never decided what would count as success or failure — or, most importantly, how the answer would shape future decisions. Every result just becomes "interesting" and "worth exploring further."
That's not learning. That's procrastination with data.
In these organizations, the hardest part is actually changing course when the evidence contradicts your plan. Especially when you've already told people what you're building. Especially when you've got designs done and engineering allocated.
The sunk cost fallacy is powerful. It's hard to justify changing course up the chain because organizations are never built for the agility they claim to have. So teams end up spending six months building something they already have evidence won't work, just because they said they would build it in January.
The most mature product teams I've worked with have a simple discipline: they start with a product vision, review their roadmap regularly against what they've learned, and change it when the evidence demands it.
Not because they're wishy-washy or lack conviction. Because they value learning over being right.