Sunday, September 10, 2017

A Cosmos of Fear

Not Your Father's Star Trek Federation: Cixin Liu's dark vision in "The Dark Forest". (Warning: Spoiler alert)

After slacking off for most of the book, Luo Ji gets down to the business of saving humanity in the last couple of chapters, even as his beneficiaries spit on him as a false prophet. It has not been easy for the Wallfacer Project, which Earth developed after learning not only that aliens exist, but that they are intent on taking over Earth and are listening in on everything humans do. The idea was to nominate a few super-people to think and scheme in the privacy of their own minds, where the Trisolarans (aka Alpha Centaurians) couldn't listen in. Sure, this involves a total lack of accountability. But on the other hand, it is humanity's only hope for secure strategic innovation. Sadly, Luo Ji's three colleagues have either committed suicide or disgraced themselves with schemes bordering on insanity.

Luo Ji's strategic insight is born of a shockingly negative view of how intelligent civilizations would operate and relate in the greater cosmos. Incidentally, it offers a fascinating and neat hypothesis for why we have never detected, let alone been visited by, aliens. The hypothesis is that natural selection operates in a particularly brutal way among rapidly developing, far-distant civilizations. Communication is essentially impossible due to the distances involved. Yet the technological scope of civilizations with a billion or two years on us is still remarkable: missiles made of strong-force matter, vastly harder than regular matter and capable of destroying whole stars; star-based amplification of messages that can easily be heard across the entire galaxy; and manipulation of sub-atomic / string-theory dimensions to create protons with special computational and communication properties. The upshot is that a civilization can grow to highly threatening capability before others even know of its existence. Past some level of advancement, all civilizations are mortal threats, in principle, to others they know about, and given that some have fewer scruples, some will simply extinguish any local threats they learn of before asking too many questions.

Thus the cosmos becomes a very dark place, where announcing your existence is tantamount to a death sentence, there being far, far more advanced civilizations always listening and on the lookout for either room to grow, or just threats to their own comfort and security. Nothing could be farther from the positive vision of Gene Roddenberry, which, admittedly, was developed out of a social agenda rather than a study of galactic sociology and biological imperatives.

What is one to think of all this? Firstly, much of the science underpinning this vision is fictional, such as star-killing technologies and matter manipulation at theoretical or inconceivable scales. Secondly, insofar as the model is biological evolution, it takes far too dim a view of the possibilities and benefits of cooperation. Evolution has shown countless times that there is room for cooperation even amidst the struggle to survive, and that cooperation is the only way to gain truly vast benefits, such as those of multicellularity, eukaryotic organelle specialization, and human sociality. Thirdly, we also know from our own strategic games with existentially destructive weapons, such as the one presently playing out with North Korea, that deterrence rather than offense becomes the principal goal among those who have achieved such powers. Would deterrence work in cosmic game theory? That is a problem: given a star-killing weapon that takes many years to reach its target and presumably cannot be detected at launch, or perhaps even at close proximity, offense might proceed with impunity. (The Milky Way is about 100,000 light years across.)
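The failure of deterrence described above can be sketched as a toy expected-value calculation. This is my own illustrative model, not anything from the novel, and all the payoff numbers are arbitrary assumptions; the point is only to show how an undetectable first strike collapses the case for waiting.

```python
# Toy expected-value sketch of the first-strike logic (illustrative numbers).
SURVIVE = 1.0        # value of continued existence
DESTROYED = -10.0    # value of annihilation
STRIKE_COST = -0.1   # marginal cost of launching a star-killer

def ev_wait(q: float) -> float:
    """Expected value of waiting, if the other side strikes with probability q.
    A strike is undetectable in flight, so waiting offers no defense."""
    return q * DESTROYED + (1 - q) * SURVIVE

def ev_strike() -> float:
    """Striking first removes the threat entirely (the other side never sees
    the missile coming), leaving only the launch cost."""
    return SURVIVE + STRIKE_COST

# Suspicion threshold above which striking has the higher expected value:
# ev_strike() > ev_wait(q)  <=>  q > STRIKE_COST / (DESTROYED - SURVIVE)
q_star = STRIKE_COST / (DESTROYED - SURVIVE)
print(f"striking pays once the suspicion level q exceeds {q_star:.3f}")
```

With these numbers, even a roughly 1% suspicion that the other side will shoot makes shooting first the better bet, which is the engine of the book's bleak equilibrium. Deterrence would require the waiter to retaliate, but an undetected strike leaves no one to retaliate.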

But perhaps the most persuasive argument is one of simple morality: that despite the cold logic Cixin Liu develops, intelligent beings of such vast sophistication are probably more likely to wish to learn about other civilizations than to destroy them, whether out of boredom or out of a Prime-Directive kind of respect and interest. It is sort of unimaginable, to me at least, that the explicit goal of any advanced civilization would be to sterilize its environment so completely, even if unadvanced human history does offer that as a common, blood-drenched theme.


  • Toles on the maelstrom.
  • Noonan, defending the indefensible.
  • Do we want open borders? Not if we want a welfare state and other public goods.