Jonah Lehrer has a short piece arguing that anchoring bias slowed our ability to respond to the Icelandic ash cloud and the Deepwater Horizon oil spill:

In the last few months, the globalized world has endured two very different crises. First, there was the ash cloud over Europe, which paralyzed air travel for millions of passengers. Then, there is the leaking oil well in the Gulf of Mexico, which continues to spew somewhere between 5,000 and 60,000 barrels of crude into the ocean every day.

While these disasters have nothing in common, our response has been plagued by the same fundamental problem. In both instances, officials settled on an early version of events – the ash cloud posed a severe danger to plane engines, and the oil well wasn't a very bad leak – and then failed to update that version in light of new evidence. As a result, valuable time was squandered.

This is a form of anchoring, a mental bias first outlined (of course) by Kahneman and Tversky.

As a commenter points out, this isn't exactly the kind of anchoring that Kahneman and Tversky studied, because the association is not completely random; but I still think the basic point – that initial reports and expectations made it harder to adjust plans in the face of new evidence – holds true.