Econstudentlog

Martin Hellman’s take on the risk of nuclear war

I’m sure some of you have already seen it, and I know I am late to the party, as both marginalrevolution and freakonomics have already mentioned and linked to it. Still, I never got around to linking to it from this blog, even though I found it somewhat interesting when I watched it more than a week ago. That is a mistake I shall now correct:

Even though the title of the presentation/video is “Soaring, Cryptography and Nuclear Weapons”, he spends probably 45 minutes talking about nuclear weapons, so that is really what the whole presentation is about. The first 8 minutes or so on soaring are related to his points about nuclear weapons, and I would advise you to watch it all. You can read a paper closely related to the talk here, if you prefer a written version to a video and don’t have an hour to spare (you should be able to read the paper in significantly less time than it takes to watch the presentation).

A few selected main points:

i) Problems related to high-impact, low-probability events are easily overlooked and/or ignored due to, e.g., framing and status quo bias (that’s just another way of stating the point he makes at the beginning with his 99.9%-safe maneuver).

ii) You need to put units of time on your risk assessments, and not only because risk factors change over time: the concept of compounded risk is important, and often overlooked (see the sketch after this list).

iii) Status quo bias is very important when explaining the nuclear policy of countries with nuclear weapons. As Hellman states about the US experience: “Even minor changes in our nuclear weapons posture have been rejected as too risky even though the baseline risk of our current strategy had never been estimated.” (Incidentally, if this argument is valid, what does it tell us about the long-run sustainability of the current state of affairs?)

…which leads us to…

iv) This risk is not well understood and is very difficult to assess, and nobody really seems to care much about it.

v) There are many different ways nuclear weapons could be used today, both in warfare and in terrorism. Some scenarios lead to a state we can return from; others do not. Close monitoring of early warning signs is critical when it comes to risk assessment and risk prevention.
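To make point ii concrete, here is a minimal sketch of how compounded risk works. The 1% annual-risk figure is an assumption picked purely for readability, not anyone’s estimate of the actual risk:

    # Compounded risk: a small, constant annual probability of disaster
    # accumulates into a large cumulative probability over enough years.
    # The 1% annual figure below is an illustrative assumption.

    def cumulative_risk(annual_risk: float, years: int) -> float:
        """Probability of at least one occurrence across `years` independent years."""
        return 1 - (1 - annual_risk) ** years

    annual_risk = 0.01  # assumed 1% chance per year
    for years in (10, 25, 50, 100):
        print(f"{years:3d} years: {cumulative_risk(annual_risk, years):.1%}")

    # Output:
    #  10 years: 9.6%
    #  25 years: 22.2%
    #  50 years: 39.5%
    # 100 years: 63.4%

Even a seemingly small annual risk becomes more likely than not to materialize on a long enough horizon, which is why a risk assessment without a time unit attached is close to meaningless.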

The recently conducted North Korean nuclear weapon test was one of the warning signs mentioned above, and it sure as *** did not decrease the risk of nuclear weapons being used somewhere in the future.

More general comments: I would say Hellman overestimates the risk, but I’m also pretty sure most people underestimate it, and/or don’t think about it at all. Also, I’m not so sure this risk is either as assessable or as preventable as Hellman believes. But, I must add, the fact that the risk is not easy to estimate properly is not, to my mind, a weighty argument against trying much harder than we do today.

Last, compounded risk is important, but it is also a problematic concept to use when forecasting and constructing long-run estimates, precisely because risk factors change a lot over time: the risk of nuclear war was zero 80 years ago, but that fact is irrelevant today. Yes, you can weight the data in the model so that risk in recent periods counts for more than risk many decades ago, but it is not clear that this is the best approach; in a crisis, a near miss 40 years ago would provide better information on how to act, or how not to act, than a risk assessment from ten years ago covering a period in which nothing out of the ordinary happened. Maybe it would be better to weight the annual data according to the risk of the specific year, so that the actions undertaken in high-likelihood neighborhoods (near misses) are better taken into account? Maybe a combination of the two, and maybe seven other variables should be included? (A small sketch of these two weighting schemes follows below.) No matter how you weight the data, you are going to have problems knowing what to do in a bad situation, even if you knew the proper risk model and the distribution of the data, which you most certainly don’t.

If people don’t think long and hard about this before something ugly pops up somewhere down the line (and the default position here would probably be that the very fact that some people did think long and hard about it would decrease the chance of “something ugly” eventually popping up), there will be a lot of stuff that needs to be done in a very short amount of time, and that is a recipe for disaster. As Hellman makes clear in his presentation, “do/think very little” seems to be the current state of affairs, seeing as no one has so far even attempted to quantify the risk we’re facing.
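Here is a minimal sketch of the two weighting schemes discussed above. The per-year risk scores (which you could imagine being derived from counts and severity of near misses) and the decay factor are invented for illustration:

    # Two ways of weighting historical evidence when estimating the
    # current annual risk. All numbers are invented for illustration.

    # Hypothetical per-year risk scores, oldest year first; the two
    # spikes (0.020 and 0.015) stand in for near misses.
    risk_scores = [0.002, 0.020, 0.001, 0.001, 0.015, 0.003]

    # Scheme 1: recency weighting. Recent years count for more, so the
    # old near miss (0.020) is heavily discounted.
    decay = 0.7
    n = len(risk_scores)
    recency_weights = [decay ** (n - 1 - i) for i in range(n)]
    recency_estimate = (sum(w * r for w, r in zip(recency_weights, risk_scores))
                        / sum(recency_weights))

    # Scheme 2: weighting each year by its own risk score, so that the
    # high-risk years (the near misses) dominate regardless of their age.
    risk_weighted_estimate = sum(r * r for r in risk_scores) / sum(risk_scores)

    print(f"recency-weighted estimate: {recency_estimate:.4f}")
    print(f"risk-weighted estimate:    {risk_weighted_estimate:.4f}")

Note how the recency-weighted estimate all but forgets the old near miss, while the risk-weighted one is dominated by the near misses; which behaviour you actually want is exactly the open question raised above.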

Oh yes, one thing I forgot: in my mind, the risk of nuclear weapons being used is not on a path of uniform motion, where the absolute (cumulative) risk of one or more nuclear bombs being used somewhere increases over time at a steady pace because the annual risk is fixed. I think the long-run risk is accelerating; that is, the annual risk of a bomb going off gets bigger every year, so the cumulative risk grows faster and faster. This is related to the fact that the most relevant metric, when it comes to the risk of a nuclear bomb being used, is in my view not the number of weapons available in the world, but rather the number of relatively autonomous agents each possessing at least one, and that number has only gone up since the first nuclear test was conducted. (The sketch below contrasts the fixed-risk and the accelerating case.)
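A small sketch contrasting the two cases: a fixed annual risk versus an annual risk that grows a little each year. The starting value and the growth rate are assumptions picked purely for illustration:

    # Cumulative risk under a fixed annual risk vs. an annual risk that
    # grows each year. Starting value and growth rate are illustrative.

    def cumulative(annual_risks):
        """P(at least one event) given a sequence of per-year risks."""
        survival = 1.0
        for p in annual_risks:
            survival *= 1 - p
        return 1 - survival

    years = 50
    fixed = [0.01] * years                              # 1% every year
    growing = [0.01 * 1.03 ** t for t in range(years)]  # 1%, growing 3%/year

    print(f"fixed annual risk:   {cumulative(fixed):.1%}")    # ~39.5%
    print(f"growing annual risk: {cumulative(growing):.1%}")  # ~68.1%

Same starting point and the same 50-year horizon, but a markedly higher cumulative risk once the annual risk is allowed to grow.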

May 28, 2009 | nuclear weapons
