Alex Soojung-Kim Pang, Ph.D.

I study people, technology, and the worlds they make

The Evil Futurists’ Guide to World Domination


For a while now, I’ve been working on a think-piece on what futures would look like if it started now: if, instead of starting during the Cold War amid enthusiasm for social engineering, computer programming, and rationalistic visions of future societies, futures could draw on neuroscience and neuroeconomics, behavioral psychology, simulation, and other fields and tools.

One of the things I’ve kept coming back to is that, if you take seriously the criticisms or warnings of people like Nassim Taleb on the impossibility of prediction, Philip Tetlock and J. Scott Armstrong on the untrustworthiness of expert opinion, Robert Burton on the emotional bases of certainty, and Gary Marcus and Daniel Gilbert on the mind, you could end up with a radically skeptical view of the whole enterprise of futures and forecasting. Or, read another way, you end up with a primer for how to be an incredibly successful futurist, even while you’re a shameless fraud, and always wrong.

I’ve finished a draft of the serious article [PDF], so now it’s time for the next project: The Evil Futurists’ Guide to World Domination: How to be Successful, Famous, and Wrong. It would be too depressing to write a book-length study, so I’ll just post it here.

(This exercise is, by the way, an illustration of Pang’s Law, that the power of an idea can be measured by how outrageously– yet convincingly– it can be misused. Think of Darwin’s ideas morphing into Social Darwinism or being appropriated by the Nazis, or quantum physics being invoked by New Age mystics. And yes, I know Pang’s Law will never be as cool as the Nunberg Error, but I do what I can.)

Full essay in the extended post.

The citations are all real. But no, I don’t really mean a single word of it. Yet, I wonder….

The Evil Futurists' Guide to World Domination: How to be Successful, Famous, and Wrong

You want to be a futurist, but you're afraid of being wrong. Don't worry. Everyone has that concern at first. But here, I've brought together ideas drawn from a number of books and articles that will help you succeed without having to be right. All you have to do is follow the simple principles laid out below.

Be certain, not right. People love certainty. They crave it. In experiments, psychologists have shown that "[w]e tend to seek advice from experts who exhibit the most confidence – even when we know they haven’t been particularly accurate in the past." We just can't resist certainty.

Further, confidence and certainty aren't things you arrive at after logical deliberation and reasoning: as UCSF neurologist Robert Burton argues in his book On Being Certain, certainty is a feeling, an emotion, and it has a lot less to do with logic than we realize. So go ahead and feel certain; if other people mistake that for being right, that's their problem. But before too long, people who listen to you will become invested in believing that you're really an authority and know what you're talking about, and will defend your reputation to salvage their own beliefs.

So no matter what you do, no matter what you believe, be certain. As Tetlock put it, in this world "only the overconfident survive, and only the truly arrogant thrive."

Finally, for the moralist or logician in you, here's this: even if you don't believe what you're saying, you could wrongly believe you're wrong, and actually be right. Stranger things have happened.

Claim to be an expert: it makes people's brains hurt. In a remarkable new study, Jan Engelmann and colleagues used fMRI to observe the brains of people who received expert advice during a financial simulation. They found that subjects thought differently about their decisions when they received the advice– even if it was bad advice– than when they worked on their own. As the researchers put it, "one effect of expert advice is to 'offload' the calculation of value of decision options from the individual’s brain." Put another way, "the advice made the brain switch off (at least to a great extent) processes required for financial decision-making."

No expertise, no problem. It'll actually make your work more accurate if you claim to be an expert– if you're certain that you're an expert– but you actually aren't.

Sounds counterintuitive, right? (Ed.: This is how you know I'm a successful futurist. I said what you didn't expect. Now I'll quote some Science to make my point.) In fact, as J. Scott Armstrong has shown over the last twenty or so years, advanced degrees and deep knowledge don't make you a better forecaster or expert. Statistically, experts are hardly better at predicting the future than chimps throwing darts at a board. As Louis Menand put it, "The accuracy of an expert’s predictions actually has an inverse relationship to his or her self-confidence, renown, and, beyond a certain point, depth of knowledge."
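
If you want to see how thin that expert edge really is, here's a minimal sketch in Python. All the numbers are invented for illustration: the 55% "signal" is my assumption, not a figure from Armstrong's or Tetlock's data.

```python
import random

# Toy simulation of the "barely better than a chimp" finding. All numbers
# are invented: the "expert" gets a weak 55% signal about binary events,
# while the chimp just throws darts. The gap is underwhelming.
random.seed(2009)

N = 10_000
events = [random.random() < 0.5 for _ in range(N)]  # what actually happens

# The expert's call matches reality 55% of the time; the chimp guesses.
expert = [e if random.random() < 0.55 else not e for e in events]
chimp = [random.random() < 0.5 for _ in range(N)]

def hit_rate(calls):
    """Fraction of binary predictions that matched what happened."""
    return sum(c == e for c, e in zip(calls, events)) / N

print(f"expert: {hit_rate(expert):.1%}")  # ~55%
print(f"chimp:  {hit_rate(chimp):.1%}")   # ~50%
```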

And it's perfectly natural to suffer from what Nassim Taleb calls "epistemic arrogance." In all sorts of areas, we routinely overestimate our own certainty and breadth of knowledge, and underestimate what we don't know. If you do that, you're just like everyone else.

So knowing you're not an expert should make you more confident in your work. And confidence is everything.

Have one simple idea that you apply to EVERYTHING. The future is complex, but you shouldn't be. Philip Tetlock explained in Expert Political Judgment that there are two kinds of forecasting personalities: foxes, who tend to appreciate contingency and don't make big claims, and hedgehogs, who have a hammer and see the whole world as a giant nail. Guess who wins. Having a single big theory, even if it's totally outrageous, makes you sound more credible. Having a Great Idea also makes it easier for you to seem like a Great Visionary, capable of seeing things that others cannot.

Get prizes for being outrageous. It's important to get quoted in the media. Being a futurist isn't like being a doctor or lawyer: there are no pesky state boards, no certification tests, none of that. So how do potential clients figure out who to hire? Media attention is one way. As a resident scholar at a think-tank told Tetlock, "I woo dumb-ass reporters who want glib sound bites."

So you need to set yourself apart from the pack, differentiate yourself from the competition. If you're not beautiful, or already famous, the easiest way is to be counterintuitive, or go against the grain. Dissent is always safe, because journalists understand what to do with someone who's critical of the conventional wisdom, and always want someone who can provide an Alternative View For Balance. There are few more secure places in a reporter's Rolodex than that of the Reliably Unpredictable Contrarian.

There's a success hiding in every failure. Let's say you predicted that something would happen, and it hasn't. Is your career over? Of course not. Tetlock found that after a certain point, expertise becomes a hindrance to effective forecasting, because experts are better able to construct erudite-sounding (or erudite-feeling) rationalizations for their failure. Here's how to benefit from this valuable talent.

  • Make predictions that are hard to verify. Be fuzzy about timing: it's always safest to say that something will happen in your lifetime, because by definition, you're never around to take flak if you're wrong.
  • Find similar events. Maybe you predicted that we'd all watch TV on our watches. Instead, we watch YouTube on our computers. That's pretty close, right? Point proved.
  • Say reality came very close to your prediction. Canada almost went to war with Denmark. It was just the arrival of winter that prevented them from attacking each other over competing claims to the North Pole.
  • Those damned externalities. Your prediction would have come true if it hadn't been for the economic downturn, which really messed up everything. (The beauty of this is that economic downturns now come with enough regularity to provide cover for just about everything– yet they're still unpredictable.)
  • The future is just a little slow. Instead of derailing it, maybe that (unpredictable!) economic downturn has just put off the future you predict. The underlying dynamics are solid; it's just that the timing is off (because of something you couldn't have foreseen). The future will get back on track once the Dow climbs above 20,000 again.
  • False positives show you care. If you're working in an area where the stakes are high, it would be irresponsible NOT to be extreme. Take WMD in Iraq, for example. If experts hadn't predicted that there were chemical weapons in Iraq, and there had been, the consequences would have been unthinkable. Better to be safe than sorry.

Regardless of which of these reasons you use, just remember this: you weren't wrong. The world failed to live up to your prediction.

Don't remember your failures. No one else will. We don't remember our own failures because, well, in retrospect they weren't failures.

Experts retroactively assign greater certainty to forecasts they made that came true, and retroactively downgrade their assessments of competing forecasts. (Put another way, experts tend to suffer more from hindsight bias than average people, not less.) When we're right, we get smarter, and other people get dumber.

Last but not least, remember that everybody has a track record, but no one knows what it is. As Tetlock put it, "We seek out experts who promise impossible levels of accuracy, then we do a poor job keeping score." Make this work for you. And good luck.
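
Keeping score isn't even hard, which is part of the joke. Brier scoring, the mean squared error between stated probabilities and actual outcomes, is the standard tool in the forecasting research cited above. Here's a minimal sketch; the forecasts and outcomes are invented for illustration.

```python
# Brier score: mean squared error between stated probabilities and outcomes
# (lower is better). The forecasts and outcomes below are invented.

def brier_score(forecasts, outcomes):
    """Average squared gap between predicted probability and what happened."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

hedgehog = [1.0, 1.0, 0.0, 1.0, 1.0]  # maximally certain about everything
fox = [0.8, 0.3, 0.2, 0.7, 0.4]       # hedged, contingency-minded probabilities
reality = [1, 0, 0, 1, 0]             # 1 = the event happened

print(f"hedgehog: {brier_score(hedgehog, reality):.2f}")  # 0.40
print(f"fox:      {brier_score(fox, reality):.2f}")       # 0.08
```

The blusterer sounds better on television; the hedger gets the better score. Fortunately for you, as Tetlock notes, nobody actually runs this calculation.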

6 Comments

  1. Maybe Harvard itself is successful as a result of this self-confidence (wait, I better cut “maybe” out of my vocabulary…)

  2. Great piece, Alex. I took copious notes.

  3. Always trying to advance the field!

  4. Alex,
    I agree that the increasing complexity of problem solving and the pace of change alter the feasibility of planning for the future. The generally sluggish learning about it reveals how poorly adapted humans are for the challenge. Here’s something I think directly addresses it.

    We used to be able to rely on nature not changing. For years I’ve been trying to show people how useful it is to understand how physical systems change by themselves, and to use that to improve mathematical models that don’t. Reading the difference can expose exactly where modeling assumptions will need to change, among other things. Getting that across has been rough going, though. Most scientists don’t seem to have a way to understand physical systems except as represented by their equations. I have a very effective empirical method.

    The physics is that because physical systems operate through multiple scales of organization, things like positive and negative feedback processes are not open ended as they are for equations, but must begin and end through changes on other scales of organization. Models are not made any more predictable by that, but predicting model failure definitely is. The validity of models is subject to the continuity of the environment they are assumed to be operating in. Observing changes on other scales that mark an environment switching from positive to negative feedback, for one example, invalidates the old models. I think that kind of certainty about the need to ask that question, and the advantage of having some real foresight on when to change models, should be a great aid.

    Have a look at my web site and blog to see if you find things to take an interest in. I think there’s a real possibility of understanding enough about the physical cybernetic switches to consider attempting to imitate the ones nature uses to constructively solve our increasingly unmanageable set of problems.

    Thanks for insight. All the best, Phil

  5. Heard your interview on Australian ABC this morning, so I thought I’d do a search for “Evil Futurists” and here I am. In your interview you suggested taking a big position and then making predictions about everything, since there are bound to be some successes. And I immediately thought about astrology. Successful enough to maintain a marketplace, but not successful enough to achieve world domination (anymore). Any thoughts?

  6. Thanks for a great, entertaining summary on expertise and expert prediction. Here’s my moral dilemma – perhaps one you shared when writing this: I teach thinking skills to folks; but by showing people how others pull the wool over their eyes, one also teaches them how to do it themselves. Believe it or not, this is not my intention. So in your expert opinion, what’s the chance that someone will actually explicitly follow your Guide? (Any additional expert tips on how to prevent it would also be welcome.)

    Cheers…
    Yanna

