There Are Idiots
by Joshua Pollack

Students of nuclear deterrence could learn something from the financial crisis. (So I’ve argued before.) One reason to think so appears in a review of Justin Fox’s new book, The Myth of the Rational Market.

In Paul Krugman’s telling, Fox describes how the efficient markets hypothesis, which is the basis of modern financial risk assessment, depends on a glaring fallacy: the assumption that others consistently decide and act in their own optimal self-interest, or close enough that it doesn’t matter.

This journey to disaster began with a beautiful idea…

[Harry] Markowitz’s model told investors what they should do, rather than predicting what they actually do. But by the mid-1960s other theorists had taken the next step, analyzing financial markets on the assumption that investors actually behaved the way Markowitz’s model said they should….

But if the markets are already getting it right, who needs finance professors?

What the idea of efficient markets has to do with the non-use of nuclear weapons is actually pretty straightforward, since deterrence is traditionally modeled on bargaining and risk-taking. And indeed, back in 1974, Alexander George and Richard Smoke detected a similar problem in deterrence theory. The theory, they warned, was primarily “abstract-deductive” in origin, based upon ideas about what states ought to do rather than evidence about what states actually have done. This was a problem if anyone was supposed to rely on it:

The character of the theory is fundamentally normative-prescriptive, not historical-explanatory. …[T]he large deterrence literature has grown up with almost no systematic attention to historical cases of deterrence, to the explanation thereof, or to inductive theory-building therefrom. These time-consuming activities have been sidestepped, for reason of theorists’ understandable sense of urgency during the Cold War era…

To put it another way: if everyone is already acting in their own best interests, who needs strategists?

The common root of these problems is the assumption that optimization of subjective expected utility (SEU) self-evidently forecasts actual behavior. As Herbert Simon put it, with dry understatement,

The classical theory of omniscient rationality is strikingly simple and beautiful. Moreover, it allows us to predict (correctly or not) human behavior without stirring out of our armchairs to observe what such behavior is like.
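For concreteness, the "classical theory" Simon describes can be written as a one-line decision rule. In textbook notation (the symbols below are standard, not drawn from the post), an actor chooses the action that maximizes subjective expected utility:

$$a^* = \arg\max_{a \in A} \sum_{s \in S} p(s)\, u(a, s)$$

where $p(s)$ is the actor's subjective probability that the world is in state $s$, and $u(a, s)$ is the utility of taking action $a$ in that state. Both efficient-markets finance and classical deterrence theory assume that real decision-makers reliably compute this maximum.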

Or as Larry Summers once put it, according to Krugman, “THERE ARE IDIOTS. Look around.”

Comments

  1. Bruce A. Roth (History)

    Another good book in the genre of behavioral economics is Predictably Irrational by Dan Ariely.

  2. yousaf

    And, of course, an outlier, black swan, event will have much more serious consequences in matters of deterrence than in finance.

That is why it is a fallacy to conclude that, because there has been no nuclear war for 60 years, nuclear arms can keep us safe indefinitely.

    “Much of the research into humans’ risk-avoidance machinery shows that it is antiquated and unfit for the modern world; it is made to counter repeatable attacks and learn from specifics. If someone narrowly escapes being eaten by a tiger in a certain cave, then he learns to avoid that cave. Yet vicious black swans by definition do not repeat themselves. We cannot learn from them easily.”

  3. FSB

Larry Summers should be one to talk about idiots! (He set up the conditions to blow Harvard’s endowment, and then left town…)

    The point, then, is this: there are idiots, yes — and we are them.

  4. MarkoB

Not really; what is at issue here is an externality, in this case a “risk externality”. Financial corporations took on far too much risk from the point of view of society as a whole; i.e., risk was under-priced, and so you had market failure due to the under-pricing of systemic risk. Nuclear war is kind of similar. States, as rational utility maximizers let us say, engage in risky behaviour, and from the point of view of society this risk is too high, because as rational utility maximizers states under-price systemic risk, which in this case is nuclear war. Morgenthau stated during the Vietnam war that if US credibility was on the line, running the risk of nuclear war was a risk worth taking. In order to shore up US credibility, Obama is prepared to run the risk of escalation with North Korea; Obama’s get-tough policy on North Korea was decided upon in February 2009, according to David Sanger, i.e. well before the second test, indeed before Unha-2. Notice that none of this obviates deterrence theory, because a credible external deterrent can function as a means for an external state to internalize the externality; this post could come across as a kind of argument for tailored deterrence, Keith Payne or Colin Gray style.

  5. Josh (History)

    It’s not really an externality if the corporation itself is poised to be the first victim of its own excessive risk-taking.

    I’ve just read the Sanger article on North Korea today and see no reference to any February 2009 decision.

  6. Major Lemon (History)

Fear and greed are the two big motivators in life. This applies to international politics as it does to business. Who ever brought such a thing as ‘rationality’ into the equation?

  7. Danny (History)

    Nice analogy. And yousaf’s point about the consequences is key here. We’ve recovered from quite a large number of market meltdowns (eventually). Remember the Panic of 1873? The 1953 recession? I suspect we would have remembered the nuclear war of 1962 for a long time if the Cuban Missile Crisis had gone bad.

Incidentally, in economics (which has a lot more data and historical experience than nuclear strategy) the observational issues are called a “small sample” problem. We haven’t necessarily observed all states of nature in our data set, so inferences from the data set might be flawed. (It’s a little different from the “black swan” idea, which is that extreme events may occur more often than we think.) This would leave me very frustrated when thinking about nuclear strategy, where there aren’t a lot of observations to begin with. But then, as an economist I’m a statistical thinker. I salute you guys for taking this on.
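The small-sample point can be made concrete with a back-of-the-envelope calculation (the numbers below are illustrative, not drawn from the comment): under the classical “rule of three”, observing zero events in n independent trials still leaves an approximate 95% upper confidence bound of 3/n on the per-trial probability.

```python
# Back-of-the-envelope sketch of the small-sample problem:
# zero nuclear wars observed in ~64 years still leaves a wide range of
# annual probabilities consistent with the data. The "rule of three"
# gives an approximate 95% upper bound of 3/n on a rate when zero
# events have been observed in n trials.
n_years = 64             # years of observation without a nuclear war (illustrative)
upper_95 = 3 / n_years   # ~0.047: an annual risk near 5% is still consistent

# Even a small annual risk compounds: the probability of at least one
# such event over the next century, at that upper bound.
p_century = 1 - (1 - upper_95) ** 100

print(f"95% upper bound on annual risk: {upper_95:.3f}")
print(f"Implied chance of at least one event in 100 years: {p_century:.2f}")
```

In other words, six decades without a nuclear war cannot, by itself, rule out an annual risk on the order of a few percent, and even small annual risks compound alarmingly over a century.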

  8. J House (History)

Yousaf,

    I’m not sure anyone argues that ‘because nuclear weapons haven’t been used for 60 yrs, they will keep us safe indefinitely’.
    While that claim would certainly be false, one must consider what that time period would have looked like in a world with no nuclear weapons… if previous history is any measure, it isn’t pretty. Tens of millions of civilians perished due to the use of ‘conventional’ weapons.
    The sheer destructiveness of nuclear weapons and the fear that followed may or may not have kept history from repeating itself in the latter 20th century.
    In addition, there is no historical evidence (that I am aware of) that any outcome of the Cuban missile crisis would have resulted in a decision by either side to use nuclear weapons.

  9. Josh (History)

    J House,

    I’m a little unclear on the thrust of your last sentence, but if I understand it correctly, you mean that neither side would have used nuclear weapons in the CMC absent a conscious and specific decision to do so by the center.

    Alarmingly, that does not appear to be correct.

Our understanding of events seems to shift every few years, as new accounts come to light. The most recent is Michael Dobbs’ book One Minute to Midnight, which describes how a Soviet submarine commander decided to use his boat’s nuclear torpedo against the US Navy ships that were harassing it with depth charges. His officers talked him out of it, and he instead yielded to the pressure and surfaced the boat. It doesn’t appear that anyone in Washington knew that the Navy was using depth charges.

    Something we have known about since a historical conference in 1991 is the account of General of the Army Anatoly Gribkov, who described the six tactical nuclear weapons held — unbeknownst to the U.S. side — by the Soviet forces on Cuba “for contingent use against any U.S. invasion force that landed in Cuba.” General Pliyev, the commander on the island, had “discretionary authority” to use the weapons to stop an invasion force. (All of the above is drawn from Raymond Garthoff’s account of the conference, which you can read here.)

    So in hindsight, it appears that there were at least a couple of pathways to nuclear war, based on encounters between forces in or around Cuba.

    There were other pathways as well. For a variety of blunders, accidents, and problems with the high-alert operation of U.S. nuclear forces during the Crisis, see Scott Sagan’s book The Limits of Safety.

    For me, at least, these incidents and revelations give real force to Thomas Schelling’s idea of the threat that leaves something to chance. Neither side knew what might happen at any moment if the crisis continued, and not merely because of what the other side’s leader might intend. Kennedy’s choices were much more cautious than Schelling’s formulation might imply, but only because he rejected the advice of most of his inner circle.

    If we are still looking for an analogy from economics, perhaps these are examples of principal-agent problems.

  10. FSB

    J House,
    this has been discussed before.

If there were an all-out nuclear war with thousands of nukes (which is possible, given the inherent risks involved), would it have been worth the lives they “saved” during the time they “kept the peace” (noting that this peace never really applied to the citizens of several 3rd world countries that were destroyed as collateral damage of the Cold War)?

You are assuming that the risks of accidental, inadvertent, mistaken, or even intentional use are not significant — there is no evidence to support that. We came close during the Cuban missile crisis and during training ‘tape runs’.
