A decade ago, I discovered Phil Tetlock’s terrific book Expert Political Judgment, a study of efforts to predict the future of political events. For someone who works as a futurist, the book was (and remains) pretty earth-shaking: Tetlock did a fantastic job of explaining the biases that keep most of us from forecasting correctly, improving the quality of our forecasts, or even recognizing the sources of our errors. Most of us are actually a lot better at explaining how we were almost right, and at rationalizing our apparent errors (our prediction came true later than we said, or it would have happened but for the 2008 meltdown, or it was just around the corner).

This is not to say that it’s impossible to do good forecasting, or that you can’t improve; indeed, Tetlock has spent the last few years exploring exactly how people can do that.

So I was interested to see that Futurity has an article about new research that examines how “belief superiority” – that is, our confidence that our own knowledge and beliefs are superior to those of others, because we’re better-educated, better-read, and so on – leads us astray:

Across six studies and several political topics, people who were high in belief superiority thought that they knew a great deal about these topics. However, when the researchers compared this perceived knowledge to how much people actually knew, they found that belief-superior people were consistently overestimating their own knowledge.

“Whereas more humble participants sometimes even underestimated their knowledge, the belief superior tended to think they knew a lot more than they actually did,” says Michael Hall, a psychology graduate student at the University of Michigan and the study’s lead author.

Not only that, but belief superiority didn’t lead to more rigorous self-examination or more thorough research and revision of one’s beliefs. Researchers “presented participants with news articles about a political topic and asked them to select which ones they would like to read. Half of the articles supported the participants’ own point of view, whereas the other half challenged their position.”

What happened?

Belief-superior people were significantly more likely than their modest peers to choose information that supported their beliefs. Furthermore, they were aware that they were seeking out biased information: when the researchers asked them what type of articles they had chosen, they readily admitted their bias for articles that supported their own beliefs.

So what’s going on?

All of us feel good when the beliefs we think are important are confirmed.

In other words, when a belief is strongly held, is tied to one’s identity or values, or is held with a sense of moral conviction, people are more likely to distance themselves from information and people that challenge their belief.

This suggests that one of the things you should look for in a futurist is a high degree of self-awareness, and an ability to handle uncomfortable situations and truths – particularly about their own abilities.