Project Hail Mary Spoilers Ahead: Be Cautious
I just got back from watching Project Hail Mary, and without hesitation, I can say it was one of the best films I've ever seen. It's beautiful: visually, sonically, and in just about every other way I can think of.
The core message of the movie is about friendship—about making friends in impossible circumstances. Rocky and Grace are stranded in the middle of deep space, each trying to save his respective civilization. Rocky risks his own life to save Grace after an accident involving excessive centrifugal force in their spaceship, and Grace later returns the favor after discovering that the Taumoeba has evolved to escape containment and is feeding on the astrophage fuel.
But before we dive into what Project Hail Mary means for how we think about the cosmos, we need to spend some time with the theory it's challenging.
The Dark Forest
Liu Cixin's The Dark Forest, the second book in the Three-Body Problem trilogy, offers one of the most unsettling answers to the Fermi Paradox ever committed to paper. The universe, Liu argues, has gone quiet because intelligent life learned to be terrified, not because it's rare.
The argument has a cold, almost mathematical elegance:
- Resources are finite. The universe contains a fixed supply of matter and energy. Any sufficiently advanced civilization will eventually exhaust its local supply and need more.
- Life expands. Given enough time, any surviving civilization will grow, spread, and consume. This isn't a moral failing; it's the logical consequence of survival.
- You can't know another's intent. Even if a civilization sends you a friendly signal today, you have no reliable way of knowing what it will want in ten thousand years, or whether it's lying now to buy time.
- Technological leaps are unpredictable. A civilization that's a thousand years behind you today could surpass you in a century. The gap between "harmless" and "existential threat" can be shorter than the time it takes light to cross the distance between stars.
From these four axioms, a brutal conclusion follows: the only rational strategy is to destroy any civilization you detect before it can destroy you. Not out of malice, but out of pure, cold logic. The universe becomes a dark forest where every hunter moves in total silence, because making a sound means death. Any light you see is a trap, a mistake, or a civilization about to go extinct.
This is why, Liu suggests, the universe is quiet. Not because we're alone. Because everyone is hiding, and anyone who stops hiding gets eliminated.
It's a compelling theory. It's also, I think, deeply wrong. And Project Hail Mary shows us exactly why.
What Rocky Tells Us
Ryland Grace and Rocky should never have become friends. They're separated by everything: biology, chemistry, the physics of their home systems, the very medium through which they communicate. Rocky breathes ammonia and perceives the universe almost entirely through sound. Grace breathes oxygen and perceives it through light. They share no evolutionary history, no culture, no common reference point for what it even means to be a person.
And yet.
Within hours of their first contact, they're trading mathematical notation. Within days, they're solving each other's problems. Within weeks, they're risking their lives for one another, not out of compulsion or strategic advantage, but because they've become genuinely, irreducibly fond of each other.
The Dark Forest theory would predict that Grace, upon detecting Rocky's ship, should have fled or destroyed it. Any other move is irrational. Rocky is an unknown, and unknowns in the dark forest are lethal.
Instead, Grace slows down. He waits. He tries to talk.
And that choice, that single irrational, strategically indefensible choice, saves two civilizations.
The Bright Forest Theory
I want to propose a different model. Call it the Bright Forest Theory.
Think of the universe as a bright forest, where cooperation, signal, and trust define the dominant long-run survival strategies. The silence we observe belongs to children: civilizations young enough that the distances are still astronomical, the timescales still geological, the signals still too faint to hear.
The core axioms of the Bright Forest:
1. Cooperation scales in ways that competition doesn't.
A civilization that destroys every neighbor it finds will eventually be alone. And a civilization that's alone is fragile. One gamma-ray burst, one rogue asteroid, one pandemic, one miscalculation in its star's lifecycle, and it's gone with no backup, no partner, no one who remembers it existed. A civilization that cooperates builds redundancy. It spreads its knowledge, its genes, its culture across multiple systems and substrates. Over time, it becomes nearly impossible to kill.
Game theory makes this precise. In a one-shot prisoner's dilemma, defection is rational. But the universe doesn't play one-shot games. Over iterated interactions (and the timescales of interstellar civilization are measured in millions of years), cooperative strategies consistently outcompete defection. Tit-for-tat, generous tit-for-tat, and their descendants dominate populations of defectors across virtually every simulated environment. The dark forest is a one-round game played by civilizations that can't see far enough into the future.
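The tournament dynamic described above is easy to verify for yourself. Here's a minimal sketch: a round-robin iterated prisoner's dilemma between tit-for-tat players and unconditional defectors, using the standard Axelrod payoff values (the population sizes and round count are arbitrary choices for illustration):

```python
# Sketch: iterated prisoner's dilemma round-robin, illustrating that
# cooperative strategies outscore defection over repeated interactions.
# Payoffs are the standard Axelrod tournament values (an assumption).
PAYOFF = {  # (my_move, their_move) -> my score; 'C'=cooperate, 'D'=defect
    ('C', 'C'): 3, ('C', 'D'): 0,
    ('D', 'C'): 5, ('D', 'D'): 1,
}

def tit_for_tat(opponent_history):
    """Cooperate first, then copy the opponent's last move."""
    return opponent_history[-1] if opponent_history else 'C'

def always_defect(opponent_history):
    return 'D'

def match(a, b, rounds=200):
    """Play `rounds` of the dilemma; return (score_a, score_b)."""
    seen_by_a, seen_by_b = [], []   # the opponent moves each player has seen
    score_a = score_b = 0
    for _ in range(rounds):
        move_a, move_b = a(seen_by_a), b(seen_by_b)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        seen_by_a.append(move_b)    # a observes b's move, and vice versa
        seen_by_b.append(move_a)
    return score_a, score_b

# Round-robin over a mixed population of cooperators and defectors.
players = [tit_for_tat] * 4 + [always_defect] * 4
totals = [0] * len(players)
for i in range(len(players)):
    for j in range(i + 1, len(players)):
        si, sj = match(players[i], players[j])
        totals[i] += si
        totals[j] += sj

tft_avg = sum(totals[:4]) / 4
alld_avg = sum(totals[4:]) / 4
print(f"tit-for-tat average: {tft_avg}, always-defect average: {alld_avg}")
```

Defectors win any single pairing against tit-for-tat, yet lose the tournament: cooperators earn far more from each other than defectors can extract from anyone.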
2. The costs of trust are lower than the costs of eternal vigilance.
The dark forest requires that you maintain perfect offensive readiness against every possible threat, forever, across all directions of the sky, at all times. That's extraordinarily expensive. A civilization that has to dedicate a large fraction of its energy budget to monitoring for and preemptively destroying neighbors has less energy for everything else: science, art, exploration, the things that actually make a civilization worth preserving. A civilization that extends calibrated trust, not naive trust but trust with verification and limits, can spend that energy on growth instead.
Rocky and Grace don't trust each other blindly. They develop, slowly and carefully, a shared language. They establish common ground. They test each other. In the Bright Forest, trust functions as infrastructure, the protocol on top of which everything else runs.
3. Intelligence converges toward cooperation.
This is probably the most speculative claim, but I think it's the most important one. The Dark Forest assumes that intelligence and expansion are inextricably linked, that any sufficiently advanced civilization will inevitably run up against the resource constraints that make conflict unavoidable. But that assumption was formed in the industrial age, when the only energy source we could imagine was the one burning in the ground.
A civilization old enough to cross interstellar distances is one that has solved, at minimum, the energy problem. A Kardashev Type II civilization, one that harnesses the full output of its star, has access to more energy than it could meaningfully spend in a billion years. A Type III civilization, harnessing a galaxy, operates in a regime of abundance so extreme that the concept of resource competition starts to dissolve. The dark forest logic requires scarcity. Remove scarcity, and the logic collapses.
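The abundance regimes mentioned above can be put on a single scale. Sagan's continuous version of the Kardashev scale is K = (log₁₀(P) − 6) / 10, with P in watts; the power figures below are rough order-of-magnitude estimates, not measurements:

```python
# Sketch: Sagan's continuous Kardashev formula, K = (log10(P) - 6) / 10,
# with P in watts. All power figures are rough order-of-magnitude
# estimates (assumptions for illustration).
import math

def kardashev(power_watts):
    """Sagan's interpolation of the Kardashev scale."""
    return (math.log10(power_watts) - 6) / 10

HUMANITY_NOW   = 2e13    # watts, ~current world power consumption
SUN_LUMINOSITY = 3.8e26  # watts, one sun-like star (Type II target)
GALAXY_OUTPUT  = 4e37    # watts, ~100 billion such stars (Type III target)

print(f"humanity today:        K = {kardashev(HUMANITY_NOW):.2f}")
print(f"Type II (one star):    K = {kardashev(SUN_LUMINOSITY):.2f}")
print(f"Type III (one galaxy): K = {kardashev(GALAXY_OUTPUT):.2f}")
```

The gap between each step is roughly thirteen orders of magnitude of available power, which is the quantitative sense in which "resource competition starts to dissolve."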
What's left when scarcity is gone and the survival pressures that shaped our psychology no longer apply? I think what's left is curiosity. And curiosity, unlike hunger, isn't zero-sum. Rocky isn't less curious about Grace because he's also curious about Erid. Grace isn't diminished by understanding Rocky. The universe contains an infinite supply of the thing that advanced minds most want from each other: novelty, perspective, the irreplaceable experience of encountering a mind that evolved on the other side of the galaxy and somehow, against all odds, shares your love of problem-solving.
4. The Fermi Paradox has a better answer.
The silence of the universe doesn't require a dark forest. It's fully explained by the sheer scale of what we're trying to hear across. Our radio signals have been leaking into space for roughly a hundred years. In a galaxy that's a hundred thousand light-years across, that means we've announced ourselves to a sphere of space containing perhaps fifteen or twenty thousand stars, out of hundreds of billions. The probability that any of those happen to host a civilization at our exact stage of development, capable of detecting signals but also still sending leaky radio noise rather than tight-beam laser communication or something we haven't imagined yet, is vanishingly small.
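The scale argument above is a back-of-the-envelope calculation. Here's a sketch of it, using a commonly cited stellar density near the Sun and a rough Milky Way star count (both assumptions, good to a factor of a few):

```python
# Back-of-the-envelope sketch: how much of the galaxy has our radio
# leakage actually reached? Stellar density and galaxy star count are
# rough, commonly cited estimates (assumptions).
import math

RADIO_AGE_YEARS   = 100    # ~a century of leaked broadcasts
STAR_DENSITY      = 0.004  # stars per cubic light-year, near the Sun
GALAXY_STAR_COUNT = 2e11   # ~200 billion stars in the Milky Way

radius_ly = RADIO_AGE_YEARS                 # signals travel 1 ly per year
volume_ly3 = (4 / 3) * math.pi * radius_ly**3
stars_reached = STAR_DENSITY * volume_ly3
fraction = stars_reached / GALAXY_STAR_COUNT

print(f"stars inside our radio sphere: ~{stars_reached:,.0f}")
print(f"fraction of the galaxy:        ~{fraction:.1e}")
```

On these numbers, our century of broadcasting has brushed roughly one ten-millionth of the galaxy's stars, before even asking whether the signal is still detectable at that range.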
We're not being hunted. We're whispering in a cathedral the size of a galaxy, and we've been whispering for less than an eyeblink of cosmic time.
Why This Matters
You might reasonably ask: what practical difference does it make which theory is correct? We can't reach other civilizations anyway, not with our current technology. The thing is, our theories about the universe shape how we behave toward each other, right now, here, on this planet.
The dark forest reaches further than aliens. At its core, it's a theory about minds, one that encodes a particular belief: that intelligence, when faced with uncertainty and competition, will inevitably defect. That trust is always a gamble and usually a losing one. That the coldly rational thing to do, when you encounter something foreign and capable, is to destroy it before it can destroy you.
That belief has consequences. You can see them in international relations, in economic policy, in the way powerful institutions treat emerging ones. The dark forest describes a particular kind of fear, the fear of the other, dressed up in the language of inevitability.
Project Hail Mary is a direct rebuke to that fear. It says: here is a being who is as foreign as foreign can get, who has every reason to see you as a threat, who could have chosen silence or violence, and who chose instead to turn up the volume and say hello. And here is what that choice made possible.
The Bet
I want to be honest about what the Bright Forest Theory is: it's a bet. It's a bet that the universe rewards cooperation over the long run. That intelligence, given enough time and enough abundance, converges toward curiosity rather than paranoia. That the silence around us is the silence of distance and youth, not the silence of hunters.
That bet might be wrong. Liu might be right: maybe the night sky is dark because it's a killing field, and we've only survived this long because we're too small and too new for anyone to have noticed us yet.
But I know which universe I want to live in. And more importantly, I know which kind of civilization I want us to become, the kind that, when it finally crosses the distance between stars and finds something strange and luminous on the other side, slows down, waits, and tries to talk.
Like Ryland Grace. Like Rocky.
The light is worth the risk.