A friend and I had lunch recently. I was telling him about all the conversations I have with people who are looking to build but struggling to find product-market fit. They feel like they have a solution to a real problem, but can’t find people willing to buy.
He joked that, as a long-time Reddit user, I should already know this: Reddit is a huge database of people describing their challenges in detail, followed by other users suggesting solutions. Countless threads end with the consensus that there is no good solution for this. He said, “If you want to know if there's a real market, find the places where people are already complaining about the problem. Reddit. Facebook groups. Anywhere people talk unfiltered.”
The real question isn't "why isn't my GTM working." The real question is: does the problem I'm solving actually hurt enough people, in enough of the same way, that a product can get traction quickly?
That's a different question. And it doesn't get answered by tweaking subject lines.
I started thinking about what it would look like to actually scan the internet for that signal systematically. Not just Google things. Not just read a few subreddits. But build something that could ingest hundreds of posts across many communities, find the recurring friction patterns, score them against real venture criteria, and hand me a ranked list of opportunities with proposed solutions attached.
So I built it. I called it Signal.
Signal is a three-pass AI pipeline that runs on demand from the command line.
You give it a set of topics and a time window. It asks an AI layer to identify the right Reddit communities for those topics automatically, which means you never have to maintain a hardcoded list of subreddits. You just say "cars, parenting, small business operations" and it figures out where to look.
Then it collects hundreds of posts and comments. It filters by engagement so it's reading things people actually responded to, not low-signal noise.
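The engagement filter is simple in principle. A minimal sketch in TypeScript, assuming an illustrative post shape and thresholds (the field names and cutoffs here are mine, not necessarily what the Signal source uses):

```typescript
// Minimal post shape for the engagement filter.
// Field names and thresholds are illustrative assumptions.
interface RedditPost {
  title: string;
  selftext: string;
  score: number;        // net upvotes
  numComments: number;  // replies the post actually got
}

// Keep only posts people actually responded to,
// dropping low-signal noise before the AI passes run.
function filterByEngagement(
  posts: RedditPost[],
  minScore = 5,
  minComments = 3,
): RedditPost[] {
  return posts.filter(
    (p) => p.score >= minScore && p.numComments >= minComments,
  );
}
```

The exact cutoffs matter less than having them at all: a post with zero comments tells you what one person thought, not whether anyone else shares the pain.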
Pass one runs the collected posts through a friction extraction prompt. It's looking for a specific thing: posts where someone describes a problem they can't solve, a workaround that isn't working, or an existing solution that falls short. Not entertainment. Not opinions. Actual friction.
Pass two clusters those signals. It finds the patterns across hundreds of individual posts and collapses them into named problem categories. "People are overwhelmed by their task list and close the app" becomes a cluster. "Financial institutions failing customers with no escalation path" becomes a cluster. It gives each cluster a frequency count and supporting examples.
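The shapes flowing between passes one and two might look like this. This is a sketch of the structure the pipeline implies, not the repo's actual schema:

```typescript
// Illustrative shapes for the pass-one and pass-two outputs.
// The real schema may differ; this just mirrors the structure
// described: extracted signals collapsed into named clusters
// with a frequency count and supporting examples.
interface FrictionSignal {
  summary: string;   // one-line description of the friction
  sourceUrl: string; // permalink back to the post
  cluster?: string;  // cluster name assigned in pass two
}

interface ProblemCluster {
  name: string;               // e.g. "Task overwhelm freeze"
  frequency: number;          // how many signals landed here
  examples: FrictionSignal[]; // supporting evidence
}

// Collapse labeled signals into clusters with frequency counts,
// most common first.
function buildClusters(signals: FrictionSignal[]): ProblemCluster[] {
  const byName = new Map<string, ProblemCluster>();
  for (const s of signals) {
    const name = s.cluster ?? "Uncategorized";
    const c = byName.get(name) ?? { name, frequency: 0, examples: [] };
    c.frequency += 1;
    c.examples.push(s);
    byName.set(name, c);
  }
  return [...byName.values()].sort((a, b) => b.frequency - a.frequency);
}
```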
Pass three scores each cluster against a five-point filter: reach, urgency, market gap, buildability, and monetization potential. Anything scoring 35 out of 50 or higher is flagged as high-signal. The output is a structured Notion report with a ranked table, deep dives on the top opportunities, and a proposed app concept for each.
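The threshold logic is the only deterministic part of pass three; the scoring itself comes from the model. A sketch, assuming each criterion is scored 0–10 (the criterion names come from the filter above; the per-criterion scale is my assumption):

```typescript
// The five-point filter: five criteria, assumed 0-10 each,
// flagged as high-signal at 35 of 50 or above.
interface ClusterScore {
  reach: number;
  urgency: number;
  marketGap: number;
  buildability: number;
  monetization: number;
}

function totalScore(s: ClusterScore): number {
  return s.reach + s.urgency + s.marketGap + s.buildability + s.monetization;
}

function isHighSignal(s: ClusterScore, threshold = 35): boolean {
  return totalScore(s) >= threshold;
}
```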
The whole thing runs in about 20 minutes on a Raspberry Pi. No Reddit credentials required. The public JSON API is enough.
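The no-credentials part works because Reddit serves its listings as JSON when you append `.json` to a listing URL. A sketch of what that fetch looks like; the helper names are mine, not Signal's:

```typescript
// Build a public listing URL for a subreddit's top posts.
// No OAuth needed: Reddit's listing endpoints serve JSON directly.
function listingUrl(
  subreddit: string,
  window: "day" | "week" | "month" = "month",
  limit = 100,
): string {
  return `https://www.reddit.com/r/${subreddit}/top.json?t=${window}&limit=${limit}`;
}

async function fetchTopPosts(subreddit: string): Promise<unknown[]> {
  const res = await fetch(listingUrl(subreddit), {
    // Reddit expects a descriptive User-Agent on unauthenticated requests.
    headers: { "User-Agent": "signal-sketch/0.1" },
  });
  if (!res.ok) throw new Error(`Reddit returned ${res.status}`);
  const body = await res.json();
  // Listing responses nest posts under data.children[].data
  return body.data.children.map((c: { data: unknown }) => c.data);
}
```

Unauthenticated access is rate-limited, which is part of why the full run takes a while; for on-demand scans that's a fine trade.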
I ran it across three subreddits with a 30-day window as a test. 155 posts. 59 friction signals. 12 clusters.
Two came back high-signal.
The first was financial system failures: banks, payment processors, and loan servicers that leave people stranded when something goes wrong. Real examples from the data: a bank closing an account with $50k in it, an ATM dispute reversed after initial resolution, a mortgage company refusing payoff timing on a house sale closing. People with nowhere to go. The proposed concept was an escalation tool that converts helpless situations into documented, tracked processes using regulatory complaint templates and consumer protection resources.
The second was task overwhelm freeze: people who open their to-do list, see everything at once, close the app, and scroll instead. The insight Signal surfaced was precise: current task management apps make this worse by design. The gap is a tool that shows one or two tasks at a time based on context and energy, hiding the full scope until items are completed.
These are real patterns. Recurring, emotionally intense, underserved by what's currently on the market.
Signal doesn't replace judgment. It surfaces raw material faster than any manual research process I've tried.
The most useful thing it does is separate "people talk about this" from "people are stuck and nothing helps." That distinction matters more than almost anything else at the earliest stage of building something. A lot of founders, myself included, can mistake conversation volume for pain intensity. Signal is tuned specifically to find the pain.
I'm using it to evaluate whether Rebuilt's core premise has the density of signal I need to justify continued investment. I'm also using it to scan entirely new territory. The topics don't have to be things I already care about. The tool is open-ended by design.
The other thing I find valuable: it forces rigor on the scoring. Every cluster goes through the same five criteria. Reach doesn't matter if monetization is impossible. Urgency doesn't matter if the problem isn't buildable in a reasonable timeframe. The filter is opinionated, and that's the point.
Signal lives inside the Emmett repo, which is Meridian's autonomous venture engine. The signal module is self-contained: eight files, no proprietary dependencies beyond the Anthropic and Notion APIs. You need Node.js, an Anthropic key, and a Notion integration.
I'm putting it out there because the methodology is more interesting than any particular application of it. If you're building something and you're not sure the problem is real enough, run this against your topic area. The data will tell you something your assumptions won't.
The full source is at github.com/everettsteele/meridian-emmett. MIT licensed. Fork it, adapt it, ship what you find.
If you improve the extraction or scoring prompts in a meaningful way, open a PR. That's the part where the real value is.