The 3 Body Problem Problem

Disclaimer: This isn’t really about 3 Body Problem because I haven’t seen the Netflix series or read the book. This is, if anything, about The Dark Forest (the second book in the trilogy), which I haven’t read either. So, it’s not really about that either. I’m not into criticising something I haven’t read. Oh, and this is a bit of a long rant. Feel free to not read it. 🙂

Given what I wrote above, I should clarify. What I have a problem with is the so-called Dark Forest Theory as a solution to the Fermi Paradox. I won’t assume that means anything to anyone, though if you read my books I suspect you have at least a clue about the Fermi Paradox. The Fermi Paradox is fairly simple: if alien life is common in the universe, why haven’t we seen it? Various solutions exist, almost all of them horribly depressing. The Dark Forest Theory adds a bit of existential horror to the mix. It says that any civilisation which pokes its head out (sends signals from its planet which can be detected in another system) is immediately destroyed. That means that any civilisation which does exist is either keeping very quiet or has been nuked into oblivion by now. Thus, the last thing we should be doing is making noise. If we do, we’re toast. The name comes from Liu Cixin’s second novel, where this theory is explored in some detail (I believe), but the basics behind it have been around a lot longer. Fred Saberhagen’s Berserker books, for example, have ancient war machines roaming the galaxy looking for developing civilisations to destroy.

The Dark Forest Theory is based on game theory. Take two worlds, A and B. Assume that both have the means to destroy the other using some sort of high-tech, planet-busting weapon. A makes noise and B detects it. Now B has three options: make contact with A, stay silent, or destroy A. If they make contact, A may decide to destroy B: B has lost (everything). If they stay silent, A may eventually find them and then may destroy them, so they lose. The only sensible option is to destroy A immediately, removing the threat. A cold, hard assessment of how things could play out suggests that any reasonable civilisation should stay silent and blast anyone it discovers. Hence, the silent universe we see. It all sounds perfectly reasonable, doesn’t it?
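
If you want to see that cold, hard assessment spelled out, here’s a minimal sketch of B’s decision as a maximin calculation. The payoff numbers are my own illustration, not anything from the novel; only their ordering matters.

```python
# Toy version of B's decision after detecting A's signal.
# Each move maps to (payoff if A turns out friendly, payoff if A turns out hostile).
PAYOFFS_FOR_B = {
    "contact": (+1, -1_000_000),  # mutual gain, or B is annihilated
    "silent":  (0,  -1_000_000),  # nothing happens, or A finds B eventually
    "destroy": (0,   0),          # the threat is gone either way
}

def worst_case(move: str) -> int:
    """Assume the other side does the worst thing it possibly can."""
    return min(PAYOFFS_FOR_B[move])

best_move = max(PAYOFFS_FOR_B, key=worst_case)  # maximin: pick the least-bad worst case
print(best_move)  # -> 'destroy'
```

‘destroy’ wins not because it is ever the best outcome, but because annihilation dominates every other branch. That is the whole argument.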

Except…

Except that this is basically Pascal’s Wager for sci-fi. Blaise Pascal came up with his wager to explain why you should believe in God. It goes like this (well, not quite, but here’s the basic idea): if God exists, not believing in him results in infinite punishment (an eternity in Hell), whereas believing in him grants infinite reward (Heaven). If God doesn’t exist, believing in him costs very little (a bit of time and money). You can’t compare infinity with any finite number, no matter how big, so it makes logical sense to believe in God just in case he does exist. If you don’t and he’s real, you’re in for an eternity of pain. I hope you can see where this is reflected in Dark Forest Theory. It’s the ‘you lose everything’, infinite-punishment element, and it’s common to arguments of this kind.
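
You can watch the infinity break the arithmetic with a naive expected-value calculation. This is only a sketch, with a probability I have picked out of thin air:

```python
# Pascal's Wager as naive expected value. Any non-zero probability that God
# exists, however tiny, lets the infinite terms swamp every finite cost.
INF = float("inf")

def expected_value(p: float, if_god_exists: float, if_not: float) -> float:
    return p * if_god_exists + (1 - p) * if_not

p = 1e-12                                  # an absurdly small probability
believe = expected_value(p, INF, -1.0)     # infinite reward vs a small finite cost
disbelieve = expected_value(p, -INF, 0.0)  # infinite punishment vs nothing
print(believe, disbelieve)                 # -> inf -inf
```

The finite numbers never get a say, which is the whole trick.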

There’s also Roko’s Basilisk, an existential horror thing which came out of the LessWrong forum. Suppose there will, at some point, come to pass a super-AI which will be of great benefit to mankind. It is of such benefit, in fact, that it reasons that anyone who fails to help bring it into existence should be punished, forever. (Then we add the existential horror: it won’t punish people who don’t know about it, so it’s only now that you’ve heard about it that you’re in danger. You’re welcome.) Of course, this AI is only a hypothetical, so you’re free not to believe in it, but can you risk it? No, you just have to start working to make it exist for fear of infinite punishment.

All of these fall foul of what’s called the black-and-white fallacy, or false dilemma: they all assume the situation has only two possible outcomes. They ignore third parties. Pascal ignores other gods. Roko ignores other AIs. Dark Forest assumes two players in the game. In reality, if you believe in God, you are ignoring other gods who may exist and may not be pleased with you for worshipping that one tribal war god. (Also, Pascal assumes God is a superficial idiot. If I were God, people persuaded by this argument would be the first ones in the Pit.) Roko ignores all the other potential super-AIs who also want to be created. (Roko’s Basilisk is an obvious Pascal clone, and not even a very good one. How it caused such a stir on LessWrong, I do not know. If you want to know more, Wikipedia is your friend.) Dark Forest ignores other civilisations…

If someone else is watching, and B attacks A, it’s likely that B just gave themselves away to other, sneakier aliens. B is toast. B loses. This is where the Berserker idea makes way more sense. If you really want this to work, you send out probes to find potential threats and destroy them. Do it quietly enough, keep your head down, and you can stay alive while everyone else perishes. But, if you’re being really sensible, you can’t leave it at that because someone might find you. In the original Dark Forest idea and the Berserker one, the only way to really win is to colonise as much of the universe as you can. Go fast and go far. Make sure that the destruction of any one planet or system can’t destroy your species. Killing anyone else you find is just a side benefit. But, of course, that doesn’t solve Fermi’s paradox. In fact, instead of being a solution to the Fermi Paradox, Dark Forest suggests that we should already have been visited by aliens. Maybe they come and kick our butts for the habitable planet, maybe they come in the form of a mega-warship which destroys us. Either way, given how long the galaxy has been around, someone should have turned up to wipe us out by now.

I suspect the real solution to the Fermi Paradox is that it’s way harder to evolve intelligent life than we suspect, or that most intelligent species blow themselves up before they become noisy enough for us to notice. And we can leave the existential horror to fiction.

PS. If anyone has watched or read 3 Body Problem, let me know what you thought, please. It strikes me as somewhat Lovecraftian, which has put me off. The basic idea behind Lovecraftian horror is that intelligent people are really stupid (i.e. they stay in the haunted mansion and let the monsters get them when someone dumber would’ve run away). I got a bit frustrated with Lovecraft’s ‘heroes’ when I was a lot younger.

17 responses to “The 3 Body Problem Problem”

  1. Cecil_Montague

    Time is a factor too. Humans haven’t really been around and in a position to notice anyone else for very long. When/if we get out into the galaxy at large we may find it full of the remains of civilizations that peaked then faded away millions or billions of years ago.

  2. Frankie Gouge

    I think that space is huge and time is long. Contact would require being close enough and existing at the same time.

    3 Body Problem (the show) does not seem to fall under your Lovecraftian definition.

    I would define it as we are ants and the boot is coming. With space so large, it gives us centuries to come up with something under the worst of circumstances. There is no running away. Humans gonna have to smart their way out of this one.

    Show is not for everyone, but I liked it.

    • Weirdly, I think that sounds exactly like Lovecraft. 🙂

      I may give it a go. Or read the book. It’s been a long time since I read Lovecraft.

  3. I figure it’s because post-human (except not human, of course) civilizations are not particularly visible to us and not interested in taking our stuff.

    We are very interested in things like food, air, and real estate. We like radio broadcasts for communication and think we’ll find other civilizations by looking for them.

    People whose primary existence is as software are interested in computing platforms, the materials to construct them, and the energy to operate them. They like fiber optics for communication, or lasers over long ranges.

    We’re never going to find a post-human-class civilization by looking for radio, because at most they use weak radio signals with a detectable range of a few meters. WiFi, not the BBC.

    We’re also not going to find them by being invaded, because our “garden world” is worthless to them compared to a silicon or nickel asteroid with a weak gravity field, or a compute box in close solar orbit.

    With no conflict for resources and no overlap of communication media, there’s no impetus for interaction. We could be sharing our solar system with branches of multiple civilizations and never notice.

  4. I’ve always thought that people who couldn’t come up with several plausible and not doomsday-ish possible answers to the Fermi Paradox lacked imagination. To me it comes down to one of a few things on their own, or a combination of them.

    The first is timing, meaning that there’s just no one out there right now making noise we can hear, or no one who was making noise at a time such that their signals are reaching us now that we’re listening.

    Second, we’re looking in the wrong direction. We’ve only been actively listening for a bit more than 40 years, not constantly, and only in a very narrow field of view. So, really, it’s far easier to miss the proof than people generally realize, considering how little time and how few resources are spent on things like SETI, and just how much sky there is to watch. Everything has to line up perfectly, meaning something has to have been transmitted in exactly the right direction with enough power to reach us, and we have to be pointing a radio telescope in just the right direction at the right time.

    Third, maybe there’s nothing to hear. Who’s to say that in a hundred years we’ll still be using radio for mass communication and won’t have discovered something else we don’t even have theories for yet? Even now, loud omnidirectional broadcasts barely exist anymore, at least not the kind powerful enough to reach far. We’ve got a lot quieter since satellite communication became so prevalent: transmission power has dropped, and signals are pointed either at the ground or at a receiver in orbit rather than heading off into deep space. It’s a very real possibility that any advanced civilization out there is past its “noisy” stage, and our window for hearing radio transmissions has closed even if they aren’t using something we don’t know how to detect.

    That’s a pretty simplified version of the three basic reasons as I see them. I could get a lot more detailed and address things like omnidirectional radio broadcasts, or looking for visual evidence, but I don’t feel like typing that much. I think I got my point across. The answer to the Fermi Paradox is not as unfathomable, sinister, or as black and white as is generally portrayed.

    Also, Niall, your comment about civilizations nuking themselves into oblivion and no longer making noise is part of the Great Filter Theory. You’re probably familiar with it, but if not, I think you would find it interesting. It’s an actual well-thought-out and highly plausible answer to the Fermi Paradox, if a depressing one that falls into the doomsday category.

    • Yeah, I’ve heard of the Great Filter. I suppose, really, most of my sci-fi worlds are based on that idea.

      • We might, but I don’t think it’s at all guaranteed. We don’t have eyes on anything like all that is in this solar system, and it often takes years or decades to notice a change in something we were nominally watching.

        Heck, it’s not at all uncommon to find significant things we didn’t know were near Earth. Large asteroids we didn’t spot until they were heading away from us again, for example. Of course, we have no idea how often it happens that we don’t spot something at all.

        There’s a more meaningful response to your point, though, and that is that I do not think such a civilization would be driven by the need to reproduce, or not to the extent we are. Software beings have options: create a backup, create a fork, or create a new person. I ordered those in decreasing order of how often I think a person would choose them.

        Backups ensure one’s own continued existence. Forks have utility but also downsides or at least increased complexity. Creating a new person has little to recommend it, beyond getting to meet someone new.

        However, only forking or creating new people requires an increase in available computing. Backups do not.

        All of which is a long-winded way of saying I don’t think we’d see the asteroid belt vanishing even if we were checking all of the asteroids regularly. The loss of a few pounds of materials now and then would fly under the radar.

        As an aside, there’s also the possibility that they’d consider KBOs a more accessible source of resources than asteroids or planetary rings, which is something we’d have no hope of spotting with current technology.

      • Argh, replied to the wrong thread. I’m sorry.

      • Not a problem.

        And for those who might have to look it up: KBOs are objects in the Kuiper Belt, which we would have little to no chance of spotting going missing. Unless someone stole Pluto or Eris.

  5. rkaliskidfe6c4f241

    Dark Forest sounds like playing hide and seek with guns. Also, you don’t know how many players there are, so why risk anything like mounting an invasion?

    Annnnnd there is the problem. Without very fast FTL, you are launching what you think is a technologically advanced force against a lesser opponent. By the time you get there, he may have progressed past you, since your tech is frozen at launch.

    It all seems like a big risk for little gain.

    Suppose we don’t destroy ourselves and science moves from warfare to exploration. Would humans destroy an entire world “just in case”? Humans are vicious, but we do draw the line at wiping out a whole race of beings. Before you disagree, think back to post-WW2: the U.S. had the technology to glass Germany and Japan and then destroy Mother Russia. It was in our self-interest to make them partners instead.

    • Hide and seek with guns is exactly how the Dark Forest is characterised. Yup. And your objections are exactly what my problem is with it.

    • Cecil_Montague

      There is also the Adams theory, where we have already been invaded but, due to a massive miscalculation of scale, the entire invasion fleet was accidentally swallowed by a dog.

  6. rkaliskidfe6c4f241

    I miss Douglas Adams. His ability to take the butterfly effect to the absurd was unmatched.

    One thing that always bothered me about space warfare is that space is big in a way we can’t grasp. Without wormholes, jump gates, or warp drives, there is so much that can go wrong in the time the trip would take, even at 0.99c.

    How would you get anyone to go risk their neck… assuming they had necks… when home will look totally different even by the time you got to the beachhead? Assuming the travel time from 3 Body Problem is 400 years: England sends an invasion fleet to the shores of China in 1624. They arrive, and the date is now 2024. Some soldiers don’t want to stay and go home. It is now 2424 in England.

    Good luck with recruiting. Not every alien civilization will be on the brink of destruction and that desperate.

    • The idea behind Dark Forest is that you kill your enemy at long range, probably with some sort of relativistic impact weapon. So, you don’t need to travel. That’s part of my problem with the theory; such a weapon is likely to be noisy, revealing the location of the attacker and identifying them as dangerous.
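
      Just to put a rough number on ‘noisy’, here is a back-of-the-envelope sketch, with a projectile mass and speed I have simply assumed for illustration:

      ```python
      import math

      # Kinetic energy of a hypothetical relativistic impactor.
      # The mass and speed here are assumptions, purely for illustration.
      C = 299_792_458.0   # speed of light, m/s
      mass_kg = 1_000.0   # a one-tonne projectile
      v = 0.9 * C         # 90% of light speed

      gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)
      ke_joules = (gamma - 1.0) * mass_kg * C**2

      # For scale: the Hiroshima bomb released roughly 6e13 J.
      print(f"{ke_joules:.2e} J (~{ke_joules / 6e13:,.0f} Hiroshimas)")
      ```

      That is on the order of 10^20 joules delivered on impact, and the attacker has to pump comparable energy into the projectile at launch. Hard to do quietly.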

      And Douglas Adams was just great.
