As far as I can tell, there’s nothing new in the flurry of recent press coverage about the near-detonation of an H-bomb near Goldsboro, North Carolina, after a B-52 broke apart in flight on January 24, 1961. But declassified documents draw press coverage, so I am grateful to Eric Schlosser for unearthing a memo about this event while researching his new book, Command and Control.
To bolster deterrence and to counter a bolt-out-of-the-blue Soviet attack, the Pentagon flew B-52s carrying nuclear weapons on airborne alert, around the clock.
Goldsboro wasn’t an isolated incident. In January 1966, another B-52 carrying nuclear weapons went down along the coast of Spain after colliding with a tanker. After yet another B-52 crash in Greenland in January 1968, the Pentagon finally stopped this practice – seven years after the Goldsboro near-miss.
Back then, US nuclear weapons had problematic safety and arming devices. The author of the declassified memo about the Goldsboro accident, Parker F. Jones of the Sandia National Laboratories, noted that three of the four “fail safe” devices failed on the MK39 Mod 2 bomb, and that, “One simple, dynamo-technology, low voltage switch stood between the United States and a major catastrophe.” This particular switch was itself error-prone. The bomb’s yield was probably in the two-to-four megaton range. Check out Alex Wellerstein’s post about Goldsboro and his NUKEMAP to see the weapon effects from Goldsboro to Richmond that were narrowly avoided.
For more blog posts about nuclear accidents and incidents, see Vasili Alexandrovich Arkhipov (2/4/10), Organizations, Accidents, and Nuclear Weapons (8/10/10), and Broken Arrows (12/26/11). After studying crisis behavior and near misses, my powers of analysis come up short in explaining how we humans have managed (so far) to avoid mushroom clouds since 1945. I have concluded, as the Goldsboro case study suggests, that divine intervention and plain dumb luck help explain humankind’s good fortune.
Ghosts are in the machine. It is most unwise to rely on the machine to work properly at all times and never be prone to accidents and incidents. It is also unwise to rely on divine intervention and plain dumb luck to avoid mushroom clouds.
These passages come at the end of my impressionistic account of the Bomb, Better Safe than Sorry (2009):
The quite human instinct of being safe rather than sorry repeatedly worked at cross purposes. Safety led to an excess of caution, and an excess of caution led to excessive nuclear arsenals… The human factor invites Murphy’s Law, but it also has helped to prevent numerous close calls from resulting in mushroom clouds…
Nuclear history is laden with irony, and more ironies – perhaps deadly, perhaps not – lie ahead. Safe passage during the first nuclear age required steadfastness, good fortune, learning from mistakes, and, above all, wisdom. Safe passage during the second nuclear age will require more of the same.
A more recent set of musings, in the Stimson Center’s Anti-satellite Weapons, Deterrence and Sino-American Space Relations (2013), landed here:
Immediately below the meta-level that defines success[ful nuclear deterrence] lie conditions for its potential failure. During crises, when nuclear-capable forces are readied for use, the possibilities for inadvertent use, breakdowns in command and control, and accidental use grow. Because diversified use options distribute weak points within the edifice of deterrence as it grows, crisis and deterrence stability grow shakier as a result. These dynamics were present during the Cold War superpower competition, and they are present on a far smaller scale in the crisis-prone relations between India and Pakistan.
Murphy’s Law isn’t quite done with us yet. Many nuclear weapons, high states of alert, and movement during crises add up to the potential for something to go badly wrong in the future.