October 11, 2008

Putting out fires

Bill McGuire reviews Global Catastrophic Risks (eds. Nick Bostrom and Milan Cirkovic) in Nature (full disclosure: Nick is my boss and I am good friends with Milan, but I did not contribute to the book). McGuire takes the volume to task for not taking climate change seriously enough:

“Any ranking exercise of global threats must put contemporary climate change and peak oil firmly in the first two places. Yet the latter is not even mentioned in Global Catastrophic Risks”

Not that the book actually tries to rank the threats; it represents a first stab at putting all the global nastiness into a common framework. Much of the discussion at the GCR conference this summer was about how a ranking could actually be done sensibly. It is not clear to me that peak oil would achieve sufficient nastiness to be worth putting on the same list as nuclear war, grey goo or global oppression. It might well be trans-generational, but it is definitely endurable. Maybe if one subscribes to the Olduvai theory it would be a GCR (but that theory is about as plausible as the Club of Rome predictions – a typical "let's ignore technology and economics" doom theory).

“If we are to evaluate future global threats sensibly, we must distinguish between real and projected risks. We should consider separately events that are happening now, such as anthropogenic climate change and the imminent peak in the oil supply, other events that we know with certainty will occur in the longer term, notably asteroid and comet impacts and volcanic super-eruptions, and extrapolated risks, such as those associated with developments in artificial intelligence and nanotechnology, which are largely imagined.”

This is where things get interesting. McGuire is clearly right that the distinction matters. But it is less clean than he suggests: we could suffer a 1918-style pandemic today, and an IL-4-armed smallpox virus could be made and distributed by a number of actors, likely killing tens of millions if not billions.

“A mushroom cloud may hang over the distant horizon and nano-goo may be oozing in our direction, but we still need to douse the flames wrought by climate change and peak oil if we are to retain for ourselves a base from which to tackle such menaces, when and if required.”

His view seems to be that we should focus entirely on the near-term risks and deal with future risks later on. Again, this sounds rational. But the implications are likely not ones he would approve of. Given the current credit crisis, wouldn't this be a valid argument for putting attempts to fix climate change on hold while the financial markets get fixed? Obviously, if a global recession leaves no money for the environment, less will actually get done. And given that climate change is going to be a slow process stretching over decade-to-century timescales, it makes sense to prioritize threats that are closer and can be dealt with more easily. One can quibble over what discount rate to use (choose one that is too high and you will ignore everything beyond the near present), but as the Copenhagen Consensus has shown, there are some pretty sizeable problems that can be dealt with in the near future.
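As a quick illustration of how much that one parameter matters (my numbers, not McGuire's): with exponential discounting, a harm t years away gets weight e^(-rt). At a 5% annual discount rate a catastrophe a century out counts for e^(-5) ≈ 0.7% of the same catastrophe happening today; at 1% it still counts for about 37%. Nearly the whole case for acting now versus later can hide inside the choice of r.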

What is the rational strategy for prioritizing GCRs? What we want to maximize is (likely) our long-term survival chances (individually and as a species). This involves reducing the total probability per year that we get killed by one or another GCR. Some probabilities are cheap and easy to reduce (get rid of bioweapon stockpiles, fund an asteroid survey), some are harder (wipe out infectious disease, build secure shelters) and some are likely extremely costly (space colonization to reduce risks). If we have a set of risk probabilities, a marginal cost of risk reduction ($ per probability unit reduced) for each risk and a fixed budget, and we are trying to reduce the sum of the risks, then the rational strategy at each moment is to spend money on the risk that has the lowest cost per unit of reduced risk. This continues until the marginal cost rises to the same level as the next "cheapest" risk, at which point we divide resources evenly between them. Eventually we get to the third, fourth and so on.
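To make that strategy concrete, here is a minimal sketch in Python. Every number, risk name and the cost model (marginal cost rising linearly with money already spent) is invented for illustration; none of it is an estimate from the book.

```python
def marginal_cost(risk, spent):
    """Dollars per unit of annual probability reduced; assumed to rise as we spend."""
    return risk["base_cost"] * (1.0 + spent / risk["scale"])

# Hypothetical risks: annual probability p and made-up cost parameters.
risks = [
    {"name": "asteroid survey",    "p": 1e-6, "base_cost": 1e7,  "scale": 1e8},
    {"name": "bioweapon cleanup",  "p": 1e-4, "base_cost": 1e8,  "scale": 1e9},
    {"name": "space colonization", "p": 1e-5, "base_cost": 1e11, "scale": 1e12},
]

budget, step = 5e9, 1e7  # total budget and spending increment, in dollars
spent = {r["name"]: 0.0 for r in risks}

while budget >= step:
    # Only risks that still have probability left to remove are candidates.
    candidates = [r for r in risks if r["p"] > 0.0]
    if not candidates:
        break
    # Spend the next increment where a dollar buys the most risk reduction.
    best = min(candidates, key=lambda r: marginal_cost(r, spent[r["name"]]))
    reduction = step / marginal_cost(best, spent[best["name"]])
    best["p"] = max(0.0, best["p"] - reduction)
    spent[best["name"]] += step
    budget -= step

for r in risks:
    print(f"{r['name']}: ${spent[r['name']]:.3g} spent, residual p = {r['p']:.2e}")
```

Because marginal costs rise with spending, the greedy increments naturally start alternating between risks once their marginal costs equalize – which is exactly the "divide resources evenly" regime described above.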

In order to do this we need to know a lot. We need to know what risks there are, how deadly they could be, and their “fixing elasticity” – how much we can reduce them per billion thrown at the problem. We don’t have that information for most risks in the book or beyond. We are just getting started here. This is what annoys me about McGuire’s criticism: he assumes he already knows the relevant information. I am pretty convinced he is wrong about that.

A great deal of effort, money, research and publishing already goes into climate change and oil. The total effort spent on acquiring the kind of information sketched above has so far been minuscule, and it seems frankly irrational to think we already know enough. If we look back at the Big Concerns of the past, they were often deeply mistaken (dysgenics is no longer the big challenge for western civilization; the population bomb got defused by the demographic transition). Hence we should be skeptical about overly certain claims about what we need to prioritize.

I'm also not entirely convinced my risk-reduction strategy above is the best. If we actually try to minimize the expected risk (with some discounting) over the whole future, across all risks, the optimal strategy might look different (I'd love to do some math on this). In particular, we have to take the uncertainty in our estimates into account and model how risks change over time (by 2100 supervolcanism will be at its current level and peak oil will be resolved one way or another, but unfriendly AI will likely be a much larger threat than it is today). We should also consider the effect of growing technological abilities and wealth (SARS was dealt with in a way that would not have been possible 50 years ago). Some problems should be left to our children, because they will be better than us at solving them. But that requires a sense of how technology is likely to develop, how tractable various problems will be in the future, and whether problems compound over time or not. Nuclear waste may be a problem our children find easy, yet they may berate us for spending so much effort on it while we crashed the oceans by overfishing.
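A toy version of that calculation, again with entirely invented numbers: three hazard trajectories (one flat, one growing, one fading), and a fixed-size intervention applied early versus late, on the assumption that richer, smarter successors get more risk reduction per unit of effort. Nothing here is an estimate; it just shows that when a hazard is back-loaded, deferring can buy more survival probability – and that flipping the trajectory flips the answer, which is exactly the judgment we currently lack the information to make.

```python
import math

YEARS = 100

def survival(hazards):
    """Probability of surviving all years given per-year hazard rates."""
    p = 1.0
    for h in hazards:
        p *= 1.0 - h
    return p

# Invented hazard trajectories: supervolcanism flat, unfriendly AI growing,
# peak oil fading as it gets resolved one way or another.
volcano = [1e-6] * YEARS
ai      = [1e-8 * math.exp(0.10 * t) for t in range(YEARS)]
oil     = [1e-5 * math.exp(-0.05 * t) for t in range(YEARS)]

def combined(*tracks):
    """Total per-year hazard, summing the individual risk tracks."""
    return [sum(hs) for hs in zip(*tracks)]

print("do nothing:       ", survival(combined(volcano, ai, oil)))

# A modest 10% hazard cut applied in the first 50 years, versus a 70% cut
# applied in the last 50 (our successors are assumed better at it).
ai_early = [h * (0.9 if t < 50 else 1.0) for t, h in enumerate(ai)]
ai_late  = [h * (1.0 if t < 50 else 0.3) for t, h in enumerate(ai)]
print("cut AI risk early:", survival(combined(volcano, ai_early, oil)))
print("cut AI risk late: ", survival(combined(volcano, ai_late, oil)))
```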

There are a lot of things to do if we are serious about reducing global risks. One of the most obvious is to actually figure out how to get better at it. When the house is on fire it is stupid not to investigate (quickly!) where it is most important to put out the fire. Otherwise you might be saving the living room as the gas cylinders in the garage blow up.

Posted by Anders3 at October 11, 2008 10:48 PM