From: www.spectacle.org.
Prisoner's dilemma
The prisoner's dilemma is a game invented at Princeton's Institute for Advanced
Study in the 1950s. In the basic scenario after which it is named, two
prisoners whom the police know to have committed crime A, but whom they wish to
convict of the more serious crime B, are held in separate cells and offered a
deal:
The one who testifies, implicating the other in crime B, will go free, while the
other will receive three years in prison (the "sucker's payoff").
If they both testify against each other, each will receive two years.
If they both remain silent, they will both be convicted of crime A and serve one
year.
Thus there are two choices, usually known as cooperating (in this scenario,
remaining silent) and defecting (here, confessing). And there are four
possible outcomes, depending on your partner's move: you may serve 0, 1, 2 or 3
years in prison.
Cooperation means you serve either one or three years. The results of defection
straddle this range: you may serve zero or two years. Because you do not know
whether you can trust your partner (there is no opportunity to communicate when
deciding your move), most rational players will choose to defect, maximizing the
upside (zero years) and limiting the downside (two years instead of three). Yet
the outcome is consistently better for two cooperating players than for two
defecting players.
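To make the arithmetic concrete, here is a minimal Python sketch of the payoffs
just described; the dictionary name YEARS and the move labels "C" (cooperate)
and "D" (defect) are illustrative choices, not part of the original scenario.

    # Years in prison for (my move, partner's move); lower is better.
    YEARS = {
        ("C", "C"): 1,  # both remain silent: convicted of crime A only
        ("C", "D"): 3,  # I stay silent, partner testifies: the sucker's payoff
        ("D", "C"): 0,  # I testify, partner stays silent: I go free
        ("D", "D"): 2,  # both testify: two years each
    }

    # Whatever the partner does, defecting costs me less than cooperating...
    for partner in ("C", "D"):
        assert YEARS[("D", partner)] < YEARS[("C", partner)]

    # ...yet two cooperators do better than two defectors.
    assert YEARS[("C", "C")] < YEARS[("D", "D")]

The two assertions state the dilemma in one breath: defection is the better
move for each player individually, while mutual cooperation beats mutual
defection.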
However, in a sequence of games (an "iterated prisoner's dilemma") something
different may happen. One or both players may fall into a pattern called "Tit for Tat", in which cooperation is rewarded and defection punished. Effectively,
this means doing on this move whatever your partner did on the last. In a
computer tournament of programs playing the prisoner's dilemma against one
another, held by political scientist Robert Axelrod in 1980, a four-line program
playing "Tit for Tat" beat out much more complex and sophisticated programs. Yet
"Tit for Tat" can only draw; it can never score more points in the game (fewer
years, in this scenario) than its partner. On the other hand, a player who, out
of moral obligation or naivete, cooperates on every move no matter what the
partner does (the All C strategy) will be ignominiously defeated. His partner
has no incentive to cooperate, but can defect and earn the greater payoff on
every move. The moral: cooperation is best, but only if defection is immediately
punished. Axelrod coined the phrase "shadow of the future" to describe the force
that keeps a player cooperating. Someone who knows he will never meet you again
may have nothing to lose by betraying you; someone who will have to deal with
you many times more may be deterred, for fear of retaliation. Thus the future
has a longer shadow in the second case.
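The dynamic is easy to simulate. The following Python sketch uses the same
payoffs assumed above and plays a fixed number of rounds between two
strategies; the function names and the ten-round match length are illustrative
assumptions, not a reconstruction of Axelrod's tournament code.

    # Years in prison per round, as in the basic scenario above; lower is better.
    YEARS = {("C", "C"): 1, ("C", "D"): 3, ("D", "C"): 0, ("D", "D"): 2}

    def tit_for_tat(partner_history):
        """Cooperate on the first move, then copy the partner's last move."""
        return "C" if not partner_history else partner_history[-1]

    def all_c(partner_history):
        """Cooperate on every move, no matter what."""
        return "C"

    def all_d(partner_history):
        """Defect on every move, no matter what."""
        return "D"

    def play_match(strategy_a, strategy_b, rounds=10):
        """Return the total years served by each player over the match."""
        seen_by_a, seen_by_b = [], []  # the partner's past moves, per player
        years_a = years_b = 0
        for _ in range(rounds):
            move_a = strategy_a(seen_by_a)
            move_b = strategy_b(seen_by_b)
            years_a += YEARS[(move_a, move_b)]
            years_b += YEARS[(move_b, move_a)]
            seen_by_a.append(move_b)
            seen_by_b.append(move_a)
        return years_a, years_b

    print(play_match(tit_for_tat, all_d))  # (21, 18): punished, but never "wins"
    print(play_match(tit_for_tat, all_c))  # (10, 10): stable cooperation
    print(play_match(all_c, all_d))        # (30, 0): All C is exploited every round

Note that against the unconditional defector, Tit for Tat serves more years
than its partner (it never wins a head-to-head match), but its quick
retaliation denies the defector the free ride it extracts from All C: eighteen
years over ten rounds instead of zero.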
Human nature being what it is, in some prisoner's dilemmas, both parties will
always defect (the All D strategy), thus scoring worse than they would if they
always cooperated.
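The arithmetic behind that last claim, using the same per-round payoffs assumed
above:

    # Mutual defection costs two years each per round, mutual cooperation one.
    rounds = 10
    years_if_both_always_defect = rounds * 2     # 20 years each
    years_if_both_always_cooperate = rounds * 1  # 10 years each
    assert years_if_both_always_defect > years_if_both_always_cooperate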
The prisoner's dilemma is a simple but powerful idea; once you have hold of it,
you see its applicability to every walk of life and all human experience. The
prisoner's dilemma has been used to analyze problems in nuclear warfare,
anthropology, biology and evolution. In the following essays, I discuss its
applicability to love, business, law, politics and software development;
introduce some variations on the theme, such as the Gandhi game and the scorpion
player; and finally raise the question of whether it is possible to base an
ethical system on the prisoner's dilemma.