The internet is full of rabbit holes—plenty of shallow ones, a few deep ones, and the occasional tunnel that opens into the future. That last kind is what tech investor Sumeet Singh is hunting for when he sits down at Butler, a cafe in Brooklyn, with a cup of coffee and his iPad.
This is his Friday and Saturday morning ritual. The iPad is his designated deep-reading tool, a respite from the barrage of pings and notifications that flood his phone and laptop. It’s been loaded with research papers and articles about AI that he has bookmarked throughout the week. He unlocks the screen and opens the first page.
Singh’s job as a venture capitalist is to predict the future and fold those predictions back into how he chooses the companies he invests in through his firm Worldbuild. And while not every article or book he reads will lead to a breakthrough, the process has paid off in the past.
In 2021, while others chased the crypto boom, Singh, who was then a partner at Andreessen Horowitz, noticed a second-order effect: A booming financial ecosystem was emerging, but it was riddled with fraud, creating a need for infrastructure and tooling to detect it. That insight underpinned his bet on Sardine, which at the time primarily served fintechs such as neobanks and crypto exchanges. Even after cryptocurrency prices collapsed into 2023, Sardine’s fraud detection technology remained valuable to banks, online retailers, and fintech companies operating far outside crypto. Earlier this year, the company raised a Series C at 10 times Singh’s entry valuation.
Those weekend readings helped inspire his Thesis essay, in which he argued that the startups that will succeed in this new era of AI are those that build either:
- The infrastructure layer—the stuff that keeps models scaling: compute, data, energy, and security
- A new generation of apps that are built around what models make possible (as opposed to bolting AI onto existing workflows)
Let’s take a closer look at Singh’s research process.
How Sumeet Singh thinks things through
Step 1: Design your own information ecosystem
Our world is bursting at the seams with information; getting your hands on the right kind, though, remains a challenge. Singh has found a few ways to tilt the odds in his favor:
An algorithm that serves you. “Algorithm hacking” is Singh’s pet research method when he’s trying to learn more about a topic. For example, when he was looking for information about the power markets in Texas, he ran the same search on X enough times for the algorithm to take the hint—his timeline was soon flooded with posts, threads, and debates on exactly that. “You’re still fishing,” he says, “but you’ve set up a trap, and the information starts coming to you.”
Get as close to the source as you can. When Singh studies the history of past technological shifts—like the internet buildout, when investments in networking helped the world move from dial-up to broadband—he makes it a point to go straight to the source. To learn about the early days of the internet era, he digs up news articles from the 1990s on archive sites like the Wayback Machine, and talks to people who were around at the time (and when he worked at a16z, that included chatting with Marc Andreessen on Slack).
Draw from multiple minds. When doing research with LLMs, Singh sets up different models—like ChatGPT and Claude—to play off each other. Lately, he’s been giving different LLMs the same source material and having them role-play as investors he respects—like Peter Fenton from Benchmark and East Rock Capital cofounder Graham Duncan—each poking at a thesis from a different angle. It’s a way to break out of what he calls the “recursive” loop that can form when working with just one model.
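If you want to try a version of this yourself, here is a minimal sketch of what a two-model, two-persona setup could look like, using the OpenAI and Anthropic Python SDKs. The model names, personas, and prompts are illustrative assumptions, not a description of Singh’s actual workflow.

```python
# Sketch of a "multiple minds" research loop: the same source material goes
# to two different models, each prompted to critique a thesis in the voice
# of a different investor persona. Everything here (models, personas,
# prompts) is an assumption for illustration.
from openai import OpenAI
from anthropic import Anthropic

SOURCE_MATERIAL = "..."  # e.g. an excerpt from a research paper you're reading
THESIS = "Value in AI will pool in the infrastructure layer."


def gpt_critique(persona: str) -> str:
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-4o",  # assumed model name
        messages=[
            {"role": "system", "content": f"Role-play as {persona}. Poke holes in the thesis."},
            {"role": "user", "content": f"Source material:\n{SOURCE_MATERIAL}\n\nThesis: {THESIS}"},
        ],
    )
    return response.choices[0].message.content


def claude_critique(persona: str) -> str:
    client = Anthropic()  # reads ANTHROPIC_API_KEY from the environment
    response = client.messages.create(
        model="claude-sonnet-4-20250514",  # assumed model name
        max_tokens=1024,
        system=f"Role-play as {persona}. Poke holes in the thesis.",
        messages=[
            {"role": "user", "content": f"Source material:\n{SOURCE_MATERIAL}\n\nThesis: {THESIS}"},
        ],
    )
    return response.content[0].text


if __name__ == "__main__":
    # Two models, two personas, one thesis: the disagreements are the signal.
    print(gpt_critique("a growth-stage venture investor"))
    print(claude_critique("a contrarian seed-stage investor"))
```

The point of the exercise is not to get one polished answer but to surface where the critiques diverge, which is where a single-model conversation tends to collapse into agreement.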
All this curated information—screenshots of his DMs, interesting excerpts from research papers—lives inside a single Notion doc, in the hope that when he comes back to it, a more coherent picture of future trends will begin to form.
Step 2: Hunt for the hinge that swings open the door
Once the information is in place, the next step is understanding what matters. Singh calls this spotting the “catalyst,” or identifying the subtle forces that make an idea, product, or experiment worth paying attention to. When he comes across something interesting—a founder, an experiment, a paper—he asks himself: Is this really as far out as people think, or is there some nuanced catalyst that makes it possible right now, even though it seems so implausible to others?
The catalyst could be a technological advancement, a change in regulations, or a cultural factor, but to Singh’s mind, the strongest ones are almost always technological. He’s not talking about a “massive top-down catalyst like AI,” but something far more granular, like Apple’s Mac mini gaining enough memory to make running local models practically feasible.
Step 3: Turn research into conviction
With the information sorted and the catalysts identified, Singh moves on to hypothesis generation. He starts asking himself: What products become possible because of these shifts? How do distribution dynamics change? What new value stacks start to form? While doing this, Singh puts his founder hat on, using the analysis to earn credibility with the founders worth building alongside, instead of focusing on being “right” all the time. This exercise helps him turn his research into a sharper point of view—especially important in venture investing, when “everything is up and to the right.”
The two sides of his brain
Singh’s twin hypotheses about which businesses will be successful in an AI-shaped world map neatly onto the two sides of his own mind.
The first—value pooling in the infrastructure layer—appeals to the analytical, financially driven part of him. It satisfies the side that likes to understand how things work under the hood: the business models; the economics of compute; the messy realities of scaling, securing, and powering these systems.
The second—value emerging from a new generation of apps built around what AI uniquely makes possible—speaks to his creative side. Ideas about the new social interactions, content formats, and consumer experiences that LLMs enable tap into the part of him that’s always been drawn to creativity.
He once ran a custom clothing business where he did all the graphic design work, and these days, he finds his creative outlet in DJing. According to him, curating music for a DJ set scratches the same itch as going down rabbit holes of information, looking for the interesting bits. Whether he’s digging for new tracks or technological catalysts, he’s ultimately doing the same thing: searching for the signal in the noise and assembling it into something that makes sense. (By the way, for all the music lovers reading this, Singh’s current favorite album is alternative artist Tame Impala’s latest.)
Rhea Purohit is a contributing writer for Every focused on research-driven storytelling in tech. You can follow her on X at @RheaPurohit1 and on LinkedIn, and Every on X at @every and on LinkedIn.