The Gulf oil spill: a salient example of a small-probability catastrophic event

Many big issues in environmental policy and economics surround potentially catastrophic events that have a small probability of occurring.  The Gulf oil spill provides a salient example.  What did we perceive the odds of such an event to be before it occurred?  How large a risk was acceptable?

Now that it's happened, presumably more safeguards will be put in place (like remotely controlled shutoff valves) and the odds of recurrence will go down.  But by how much?  It is, of course, possible to push the risk too low, if the cost of doing so is high enough.

The difficulty with rationalizing optimal safety in situations like this one is that quantifying the sizes of small probabilities and huge potential damages is simply impossible to do in an objective way.  This is the dirty secret underlying any rational economic analysis: uncertainty about the risk-damage tradeoff can far exceed the tradeoff itself, and often (usually?) no amount of analysis can resolve the problem objectively.

In technical terms, the Bayesian (subjective) prior rules the optimal decision.
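To make that point concrete, here is a minimal sketch, with entirely hypothetical numbers of my own choosing, of how the prior drives the expected-loss arithmetic behind an "optimal" level of spending on safeguards:

```python
# A stylized sketch (numbers are hypothetical, not estimates): the assumed prior
# probability of a catastrophic spill drives the expected-loss calculation, and
# hence how much spending on safeguards looks "optimal."

damage = 50e9  # assumed damage from a catastrophic spill, in dollars

# Two analysts with different subjective priors on the annual probability of a spill
priors = {"optimist": 1e-6, "pessimist": 1e-3}

for analyst, p in priors.items():
    expected_loss = p * damage
    print(f"{analyst}: prior p = {p:.0e}, expected annual loss = ${expected_loss:,.0f}")

# optimist:  prior p = 1e-06, expected annual loss = $50,000
# pessimist: prior p = 1e-03, expected annual loss = $50,000,000
# The same damage estimate justifies safeguard spending that differs by a factor
# of 1,000, purely because of a prior on a probability no one can estimate
# objectively from data.
```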

Optimal climate policy comes to mind.  Weitzman shows how we can easily rationalize a scenario in which it is optimal to spend nearly 100% of current GDP on forestalling climate change.  Of course, other, quite mild scenarios, in which it is optimal to spend almost nothing to prevent climate change, are equally irrefutable.
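A stylized illustration of the fat-tails idea behind that result (this is not Weitzman's actual model, and every parameter here is made up): two priors over climate damages that agree on the "likely" case can imply wildly different expected losses, because one puts non-trivial weight on catastrophe.

```python
# Stylized sketch: two priors over climate damages (as a fraction of GDP),
# calibrated to the same median, differ enormously in their tails and hence in
# expected losses. All parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Thin-tailed prior: lognormal damages with median ~2% of GDP, modest spread
thin = np.minimum(rng.lognormal(mean=np.log(0.02), sigma=0.5, size=n), 1.0)

# Fat-tailed prior: same median, much larger dispersion; rare draws approach
# total loss (damages are capped at 100% of GDP)
fat = np.minimum(rng.lognormal(mean=np.log(0.02), sigma=2.5, size=n), 1.0)

for name, d in [("thin-tailed", thin), ("fat-tailed", fat)]:
    print(f"{name}: median = {np.median(d):.1%}, mean = {d.mean():.1%}, "
          f"P(damage > 50% of GDP) = {(d > 0.5).mean():.2%}")

# Both priors are consistent with the same "likely" damages, yet the expected
# loss, and the implied willingness to pay to avoid it, is dominated by how much
# weight the prior puts on the catastrophic tail.
```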

So how should we make public decisions when uncertainty rules the calculus of optimal tradeoffs?

I don't have a good answer to this question.  I'm pretty sure no one does.  But I think the importance of uncertainty to these critical issues too often gets too little billing.  Economists, in particular, can obfuscate this importance by assuming estimated probability distributions correctly quantify true probabilities and expectations.  The fallacy of this approach is pretty obvious: if I flip a coin three times and it lands heads each time, the estimated (maximum-likelihood) probability of tails is zero.  Obviously that's not right.  But if I reconcile the problem by going full Bayesian, the posterior probability of heads depends crucially on the prior.  Do you believe the coin is biased or fair?  If biased, by how much?  Consider that we have only one earth to experiment with...
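Here is a minimal sketch of that coin-flip arithmetic, using a Beta prior on the probability of heads (my choice of prior family for illustration, not anything from the analysis above):

```python
# After three heads in three flips, the maximum-likelihood estimate of P(tails)
# is zero, while the Bayesian posterior depends entirely on the prior.
n_flips, n_heads = 3, 3

# Maximum-likelihood (frequency) estimate
print(f"MLE: P(heads) = {n_heads / n_flips:.2f}, P(tails) = {(n_flips - n_heads) / n_flips:.2f}")

# Bayesian update with a Beta(a, b) prior on P(heads); the posterior is
# Beta(a + heads, b + tails), so the posterior mean is (a + heads) / (a + b + flips).
priors = {
    "uniform prior Beta(1, 1)": (1, 1),
    "strong 'fair coin' prior Beta(50, 50)": (50, 50),
    "prior leaning toward heads Beta(5, 1)": (5, 1),
}
for label, (a, b) in priors.items():
    post_mean = (a + n_heads) / (a + b + n_flips)
    print(f"{label}: posterior mean P(heads) = {post_mean:.2f}")

# uniform prior: 0.80; strong fair-coin prior: 0.51; heads-leaning prior: 0.89.
# The data are identical in every case; the answer is driven by the prior.
```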

Since these decisions are ultimately determined politically, the one thing that I can argue for is clear presentation of facts that are known.  We can also ask different sides to articulate their prior beliefs as clearly as possible.  That is, each side should put their cost-benefit analysis on the table so others can judge the plausibility of their assumptions and prior beliefs in relation to the known facts.

Easier said than done...
