At Howard Rheingold’s urging, I read Robert Axelrod’s The Evolution of Cooperation over the weekend. It’s a short book, the kind you can get through in a day or two of fractured reading– while watching the kids at the playground, or standing in line to register your daughter for her gymnastics class. (I did this Thursday, and was the only father in an otherwise ferociously competitive group of mothers, all applying the organizational and project management skills that they’d learned in business school to the raising of their children. An entire generation of economics and communications majors from Duke and Northwestern is reproducing here in Menlo Park.)

But back to Axelrod. The Evolution of Cooperation is a short, brilliant book, like Carlo Ginzburg’s The Cheese and the Worms, or Thomas Kuhn’s The Structure of Scientific Revolutions. Its subject is the Prisoner’s Dilemma– a simple game in which two players have the option of cooperating or betraying each other– and how to win it. Axelrod’s story begins with a competition in which people submitted computer programs to play Prisoner’s Dilemma; the winner was a four-line program called TIT FOR TAT. The same program won a second competition against some sixty programs, and later was successful in an evolutionary simulation– one consisting of many rounds of programs being pitted against each other, and “reproduced” in proportion to their scores (low-scoring programs eventually go extinct, high-scoring ones come to dominate).
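The tournament’s scoring is easy to sketch in a few lines of Python. This isn’t Axelrod’s code, just a minimal illustration under two assumptions of my own: strategies are functions from the opponent’s move history to “C” (cooperate) or “D” (defect), and rounds are scored with the standard payoffs the book uses– 3 each for mutual cooperation, 1 each for mutual defection, 5 for a lone defector, 0 for a lone cooperator:

```python
# Standard Axelrod payoff matrix: (my score, opponent's score)
PAYOFFS = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
           ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def play_match(strategy_a, strategy_b, rounds=200):
    """Pit two strategies against each other; return their total scores."""
    history_a, history_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_b)  # each sees only the opponent's past
        move_b = strategy_b(history_a)
        pay_a, pay_b = PAYOFFS[(move_a, move_b)]
        score_a += pay_a
        score_b += pay_b
        history_a.append(move_a)
        history_b.append(move_b)
    return score_a, score_b
```

A match between a nice strategy and an always-defecting one shows the basic dynamic: the nice player loses the first round, and the pair then settles into low-scoring mutual defection.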

TIT FOR TAT is extremely simple. It begins the game by cooperating; after that, it simply copies its opponent’s last move. If the opposing program cooperates, TIT FOR TAT cooperates; if it “defects,” so does TIT FOR TAT; but if the program returns to cooperating, TIT FOR TAT does too.
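In code, the whole strategy really is just a few lines. A minimal sketch, assuming moves are represented as the strings “C” and “D” and the strategy is handed the opponent’s move history:

```python
def tit_for_tat(opponent_history):
    """TIT FOR TAT: cooperate first, then mirror the opponent's last move."""
    if not opponent_history:
        return "C"  # be nice: open by cooperating
    return opponent_history[-1]  # echo whatever the opponent did last
```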

The most interesting part of the book, for me, was the degree to which simplicity turns out to be a virtue in crafting a strategy to promote cooperative behavior. Indeed, a winning strategy comes down to four rules:

  • Be nice. TIT FOR TAT begins each game by offering to cooperate. Indeed, all the top-scoring Prisoner’s Dilemma programs were “nice” in this way. It doesn’t pay to start off taking advantage of other players.
  • Be forgiving. If an opponent tries to take advantage of you, but changes their ways, you shouldn’t hold a grudge. In a situation in which cooperation wins more points than competition, grudges are self-destructive.
  • Retaliate. At the same time, it should be clear that you won’t stand for betrayal.
  • Be clear. As Axelrod puts it, in zero-sum games like chess it pays to be devious, because you want to keep your opponent off-balance. In non-zero-sum games like the Prisoner’s Dilemma, by contrast, you have more to gain from predictable behavior.
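The evolutionary simulation mentioned earlier can be sketched as a per-generation update: each program’s share of the population grows in proportion to the score it earns against the current mix. A rough Python illustration (the function name and the idea of feeding in pre-computed average scores are my simplification, not the book’s procedure):

```python
def next_generation(shares, avg_scores):
    """One generation of the ecological simulation: population shares
    are reweighted by fitness (current share times average score)."""
    fitness = {name: shares[name] * avg_scores[name] for name in shares}
    total = sum(fitness.values())
    return {name: f / total for name, f in fitness.items()}
```

Iterating this update is what drives low-scoring programs toward extinction and lets high-scoring ones come to dominate.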

When the book first came out in 1984, it seems to have been read largely as a manual for dealing with the arms race (which, with the deployment of medium-range missiles in Europe and discussion of building the MX “Peacekeeper,” was much on people’s minds at the time); today, though, it’s been rediscovered by people interested in emergence, social software, and the like. An interesting second life for the book.

I suspect there are some applications in actor-network theory, too, but the connections are still too elusive for me to articulate.