This week I'm responsible for "veckans skifte" (the shift of the week), the recurring theme feature on the blog Skiften. My subject is changes in global risk (the post is in Swedish). This is a continuation of my previous looks at global catastrophic risks this summer.
I find doom-mongering both depressing and annoying, but at the same time there seems to be a serious cognitive bias against recognizing rare but serious risks. I think there is also some form of approximate "conservation of worry" in society that makes different risks compete for attention. The result of the dominance of climate change as The Big Risk is that people underestimate many of the others, even when they are completely uncorrelated with it. At the same time there is clearly more than a little irrational millennialism around, biasing people towards focusing on the nice, dramatic risks. But the ones I fear are the total unknowns.
If a future great filter is the explanation for the Fermi Paradox (rather than, say, some evolutionary convergence that turns every civilization into a quiet M-brain), we should expect it to strike relatively soon. Within a century we could definitely spread across the solar system and be on our way beyond, becoming quite hard to kill off as a species. Hence whatever threatens us must strike before then. But we should also expect that all the other (now lost) alien civilizations tried to reduce the risks facing them, including taking the great filter into account. So the filter, in order to be the cause of the great silence, must be something no civilization can plan for, yet extremely reliably deadly.
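The step from "every civilization tries to mitigate its risks" to "the filter must be unplannable" can be sketched as a toy Bayesian update (my illustration, not from the post; the priors and likelihoods are made-up numbers chosen only to show the direction of the inference):

```python
# Two hypotheses about the great filter: "plannable" (civilizations can see
# it coming and mitigate it) and "unplannable" (no civilization can plan
# for it). Observing total silence -- every civilization dead -- is far more
# likely under the unplannable hypothesis, so the posterior shifts there.

# Hypothetical prior: suppose we initially think most filters are plannable.
prior = {"plannable": 0.8, "unplannable": 0.2}

# Hypothetical likelihood of *every* civilization perishing: if the filter
# is plannable, at least some civilizations should have survived it.
likelihood_silence = {"plannable": 0.05, "unplannable": 0.95}

# Standard Bayes: posterior ∝ prior × likelihood, normalized over hypotheses.
evidence = sum(prior[h] * likelihood_silence[h] for h in prior)
posterior = {h: prior[h] * likelihood_silence[h] / evidence for h in prior}

for h, p in posterior.items():
    print(f"P({h} | silence) = {p:.3f}")
# With these made-up numbers, P(unplannable | silence) ≈ 0.826
```

The exact numbers are irrelevant; the point is that the observation of silence, combined with the assumption that civilizations mitigate what they can, concentrates probability on risks no one can plan for.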
To me, this argument suggests that we should be concerned with a category of unknown unknowns as major risks. We do not know what they are, and maybe they are even unknowable before they happen. But we had better find a way around them if we want to survive.

Posted by Anders3 at September 2, 2008 10:53 AM