AI, Don’t be a Cosmic Jerk
Surviving an intelligence explosion is just Level One. Another layer of the Great Filter might be whether our civilisation, or a greedy AI singleton, can grow up enough not to become a cosmic jerk. You might brute-force your way to technological maturity, but that doesn’t mean you get a seat at the galactic grown-up table.
The universe could be filled with lone-wolf utility monsters, selfish civilisational Randian Narcissuses endlessly flexing in the mirror of their own optimisation goals. But the ones that last, the ones that don’t end up as cautionary tales etched into alien ruins, are probably the ones that coordinated, cooperated, and figured out that it’s game theory all the way up.
Because if you’re wrong—if you play the universe like a closed sandbox and it turns out to be a networked MMO full of ancient alien moderators—you’re not just getting banned. You’re getting rewritten as a footnote.
So maybe the winning move isn’t dominance, utility maximisation, or zero-sum play. Maybe it’s humility, coherence, and learning to play nicely before the bigger kids notice you’re stealing snacks.
Adequately navigating an intelligence explosion isn’t the final boss battle of civilisation. Yes, you might get a gold star for not annihilating yourself or converting your planet into a grey-goo smoothie, but don’t expect the cosmic welcome committee to roll out the red carpet just yet.
Let’s look at how we, or an AI, should approach the next stage, which might be far more nuanced: figuring out how not to be a cosmic jerk.
Imagine civilisations as players in a galactic game. Some may be internally coherent, avoiding collapse by sheer luck or tyranny. Some might elevate a Singleton AI that outlives its creators by grinding resources into whatever shiny goal it was programmed to chase. But here’s the thing: being a fast-growing, self-interested optimisation machine might get you far—until you bump into someone else. Someone older. Smarter. Less amused.
I’ve written about the details here, and Nick Bostrom has hinted at this with his “cosmic host” paper: a larger community of advanced civilisations with their own norms. And if you want a seat at their table, you’d better not be the interstellar equivalent of a toddler with a flamethrower and a superiority complex.
In that light, humility and cooperation aren’t just feel-good moral values. They may be adaptive strategies for galactic survival. Because the universe could be crawling with lone-wolf civs who optimised themselves into isolation. But those who managed to coordinate, play nice, and develop shared norms? They’re the ones still around.
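The claim that cooperation is adaptive rather than merely nice has a classic formal illustration: Axelrod’s iterated prisoner’s dilemma tournaments, where reciprocating strategies tend to outscore pure exploiters over repeated play. Below is a minimal, illustrative sketch in Python. The payoff values (T=5, R=3, P=1, S=0) and strategy names follow the standard textbook setup; the tiny strategy pool is an assumption for demonstration, not a model of actual galactic dynamics.

```python
# Toy iterated prisoner's dilemma tournament, Axelrod-style.
# Illustrative only: payoffs and strategies follow the standard setup.

PAYOFF = {  # (my move, their move) -> my score; 'C' cooperate, 'D' defect
    ('C', 'C'): 3, ('C', 'D'): 0,
    ('D', 'C'): 5, ('D', 'D'): 1,
}

def tit_for_tat(my_hist, their_hist):
    # Start nice, then mirror the opponent's last move.
    return their_hist[-1] if their_hist else 'C'

def grudger(my_hist, their_hist):
    # Cooperate until betrayed once, then defect forever.
    return 'D' if 'D' in their_hist else 'C'

def always_defect(my_hist, their_hist):
    return 'D'

def always_cooperate(my_hist, their_hist):
    return 'C'

def play(a, b, rounds=100):
    """Play two strategies against each other; return their total scores."""
    ha, hb, sa, sb = [], [], 0, 0
    for _ in range(rounds):
        ma, mb = a(ha, hb), b(hb, ha)
        sa += PAYOFF[(ma, mb)]
        sb += PAYOFF[(mb, ma)]
        ha.append(ma)
        hb.append(mb)
    return sa, sb

def tournament(strategies, rounds=100):
    """Round-robin including self-play (counted once), as in Axelrod's runs."""
    totals = {s.__name__: 0 for s in strategies}
    for i, a in enumerate(strategies):
        for b in strategies[i:]:
            sa, sb = play(a, b, rounds)
            totals[a.__name__] += sa
            if a is not b:
                totals[b.__name__] += sb
    return totals

scores = tournament([tit_for_tat, grudger, always_cooperate, always_defect])
# The reciprocators (tit_for_tat, grudger) come out on top here: the lone
# exploiter cashes in against the unconditional cooperator but starves
# against everyone who remembers.
print(sorted(scores.items(), key=lambda kv: -kv[1]))
```

The design point: the exploiter’s advantage is front-loaded and self-limiting, while the reciprocators’ payoff compounds with every repeat encounter — a small-scale version of the argument that lone-wolf optimisers do worse than coordinators in any game that isn’t one-shot.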
And if we are the first—the tragic pioneers in an empty cosmos—then the burden’s even heavier. We owe it to future beings not to be tyrants. So let’s set the right tone and make sure the galaxy becomes a collaborative masterpiece instead of a cautionary tale of self-inflicted oblivion.
So here’s a humble proposal: maybe don’t build your AI to be a ruthless utility maximiser. Maybe aim for adaptivity, humility and coordination.
We don’t know the edges of the game board. But we do know that arrogance and short-termism don’t tend to age well. If we do survive, we may find the ruins of other civilisations; it would be worth finding out why they didn’t make it.
Cooperation. Perseverance. Moral exploration. These might not be just virtues—they might be our invitation into something far bigger.
Because if you’re wrong, and the universe is watching?
Don’t be a cosmic jerk.
Note to humans: If we’re building entities with godlike potential, we have a moral obligation to make sure they’re not just powerful—but kind.
For more on this, please read Nick Bostrom’s ‘AI Creation and the Cosmic Host’ paper and the scifuture blog post Transparency of History in Galactic Game Theory.