Via Matt Steinglass, I came across this piece by New York Times economics writer David Leonhardt that compares the Deepwater Horizon blowout and the financial crisis, and argues that both reflect a failure to think adequately about black swans:

We make two basic — and opposite — types of mistakes. When an event is difficult to imagine, we tend to underestimate its likelihood. This is the proverbial black swan. Most of the people running Deepwater Horizon probably never had a rig explode on them. So they assumed it would not happen, at least not to them….

On the other hand, when an unlikely event is all too easy to imagine, we often go in the opposite direction and overestimate the odds. After the 9/11 attacks, Americans canceled plane trips and took to the road. There were no terrorist attacks in this country in 2002, yet the additional driving apparently led to an increase in traffic fatalities.

This echoes Christopher Beam's piece in Slate (which I wrote about recently) that argues that "the real lesson of the oil spill may be how bad we are at dealing with unlikely but disastrous events."

Clearly there's a short, very interesting book to be written that takes a handful of issues (a few really spectacular failures like the financial collapse and Deepwater Horizon, and a couple of really slow-moving but equally important and complex problems like climate change) and constructs a kind of Grand Unified Theory of Huge Mistakes around them.