This is Curated Intelligence, a new series where I extract patterns from the best AI thinking that others miss. It's an experiment within my irreplaceable newsletter, so let me know in the comments if you want more like it.
I've been consuming AI content like it's my job (because it kind of is) and I noticed something. Everyone's covering WHAT's happening with AI, but nobody's showing you HOW to think about what's happening.
Last week, I found three completely different sources talking about AI, and they're all secretly talking about the same thing. They just don't know it yet.
The 25-Minute Conversation That Changed Everything
Let me start with this video from Nate Jones that absolutely blew my mind. If you don't know his Substack, you're missing out (he's brilliant).
[Video: "Most of Us Are Using AI Backwards—Here's Why"]
Nate spends 25 minutes talking to AI about his book, and everyone's focused on his point about compression versus expansion:
"We are using AI primarily for information compression... But something that I heard that's been really sticking with me is the idea that the brain doesn't process compressed information in the same way... A lot of the learning that you get when you read a large book, a deep book on a big subject, it comes from your brain forming new connections as it spends extended time in the subject."
Smart insight, right? Use AI to think deeper, not just summarize. But then Nate says something that made me pause and replay three times:
"It's like this crossover between the way a therapist listens to you and the way a colleague listens to you. And you'd never expect a human to do that, but it's super helpful for your thinking."
Did you catch that? Nate just described a relationship. Not a tool. A RELATIONSHIP.
And listen to how he describes what made it special:
"It was there when I needed to talk out loud. It would let me talk out loud for a while. It actually listened. It actually took notes and it actually responded with just enough interest, engagement, and riffing to keep my brain flowing."
"It actually listened." When was the last time someone said that about a tool?
Now, I know what you might be thinking: "Isn't that just good UX design? AI that listens well?" That's what I thought at first, too. But why did Nate try THREE different AI models? Why spend 25 minutes on something he called "vanilla insights"?
If it was just about the information, he would have stopped after five minutes.
No... Nate was shopping for the right conversational partner.
As Nate himself says later:
"It's about knowing when to sort of pull the thinking button and say, 'I want to use this to actually expand my time thinking about this because this is really important. This is work that needs and deserves the cream of my brain.'"
The cream of his brain. When did we start hoarding our best thinking?
How OpenAI's CEO Is Programming Our Acceptance of AI Relationships
So, while I'm processing Nate's insight, I stumble upon Sam Altman's latest post, "The Gentle Singularity." And buried in his AGI predictions is this line:
"We are past the event horizon; the takeoff has started. Humanity is close to building digital superintelligence, and at least so far it's much less weird than it seems like it should be."
"Much less weird than it should be." That's not an observation. That's a STRATEGY. He's normalizing something that should feel earth-shattering. Watch how he does it:
"In the most important ways, the 2030s may not be wildly different. People will still love their families, express their creativity, play games, and swim in lakes."
This is an existential change wrapped in mundane activities. But here's the real tell from Sam:
"A subsistence farmer from a thousand years ago would look at what many of us do and say we have fake jobs, and think that we are just playing games to entertain ourselves since we have plenty of food and unimaginable luxuries."
He's not just predicting the future. He's programming our acceptance of it and making us comfortable with a complete redefinition of meaningful work.
And suddenly it clicked... Both Nate and Sam are revealing the same hidden need. We're intellectually STARVED. Nate's 25-minute conversation wasn't about the book. It was about having someone (something) that would think WITH him.
Not for him. WITH him.
Sam Altman knows this. He's preparing us for a world where AI relationships are normal. Because he knows we're hungry for them.
And he even warns us about misalignment in the same post:
"Social media feeds are an example of misaligned AI; the algorithms that power those are incredible at getting you to keep scrolling and clearly understand your short-term preferences, but they do so by exploiting something in your brain that overrides your long-term preference."
The difference is those algorithms exploit our need for connection. The AI Nate spent 25 minutes with? It fulfills it.
The Entrepreneur's Tips That Accidentally Reveal Our Intellectual Starvation
Then (and this is where it gets wild) I read an email from Jack Roberts. I'm in his community, Automations by Jack, and he'd just written up his conversation with Dan Martell, a $100M entrepreneur. Dan shared seventeen business tips. But as I'm reading them, I realize they're not business tips. They're a journey.
First tip:
"If you didn't ask AI first, you're already behind. Someone else is already moving faster because they did."
But here's MY take on what Dan's really saying:
"When in doubt, ask AI... it helps you explore your own thinking. You know the answer, you just need somebody to help think through the process."
"You just need SOMEBODY." Not something. Somebody.
Every single tip maps to a stage of overcoming intellectual isolation (at least that's what I'm seeing). Listen to Dan's tip about sharing:
"Share what you learn (That's where fulfilment lives)."
And here's what hit me about this:
"Give away your best stuff... at the end of the day, people still want you to do the stuff for them."
Do you see it? The journey from intellectual loneliness to intellectual leadership:
Private awakening ("ask AI first")
Internal resistance ("don't let emotions block you")
Finding your tribe ("sell to mindset, not category")
Full circle ("share what you learn")
The $100M secret isn't tactics. It's that Dan figured out how to cure his intellectual loneliness with AI, then built a business helping others do the same.
The Coming Divide: Why Some Minds Will Leap Ahead While Others Stagnate
So let me paint you a picture of what's coming...
We're about to have massive cognitive inequality. Not based on access to information (we all have that). Based on willingness to engage in deep thinking partnerships with AI.
While some people debate whether to use AI, others are having 25-minute thinking sessions. Daily. They're becoming different types of thinkers. More connected thoughts. Richer mental models. Deeper insights.
In two years, the gap between those who feed their intellectual hunger and those who suppress it? Exponential.
Nate's thinking changed in those 25 minutes.
Sam Altman is preparing us to accept that as normal.
Dan Martell built a $100M empire on it.
This isn't about AI tools. It's about AI relationships. And most people don't even know they're hungry.
As Sam puts it:
"We are climbing the long arc of exponential technological progress; it always looks vertical looking forward and flat going backwards, but it's one smooth curve."
We're living through the vertical part right now. But it feels normal. That's the genius (and the danger).
Three Unrelated Sources, One Uncomfortable Truth About Human Connection
Here's what's wild. I showed you three completely unrelated sources:
Nate Jones' brilliant video about using AI backwards
Sam Altman's blog post about our AI future
Business tips from Dan Martell via Jack Roberts
But they're all talking about the same thing.
We've been intellectually alone, and AI just offered us a way out.
The person spending 25 minutes with AI seeking connection. The CEO normalizing that connection. The entrepreneur who turned that connection into $100M.
We're all curators now. The question is: are you curating noise... or intelligence?
And maybe I'm wrong. Maybe I'm seeing patterns that aren't there.
But what if I'm not?
This was Curated Intelligence, a new series I'm testing within my irreplaceable newsletter. Comment below if you liked this style (I'll create more if it resonates).