Missile Defense Software
Jeffrey Lewis


This doesn’t usually happen. Honest.

MDA Director Henry “Trey” Obering (right) attributes the failure of IFT-13C to a “very minor computer software glitch.”

Noah Shachtman, however, has posted a fantastic analysis from former DOD Director of Operational Test and Evaluation Phil Coyle, who argues that the software glitch is anything but minor.

Rather, the glitch reflects inherent challenges of processing the massive amount of information needed for a hit-to-kill missile defense intercept:

I also read Obering’s comments about message drop out rates as suggesting that they also had a problem with the 1553 data bus on board the interceptor.

[snip]

Basically, the 1553 data bus is too slow and doesn’t have the capacity to handle the tons of information the interceptor has to process.

[snip]

A 1553 data bus in a high-speed missile system can be like trying to go 3,000 mph on shoe leather.

Or drinking from a firehose like that kid in UHF.

Anyway, worth a read.
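To see why Coyle's bus-capacity point bites, it helps to put rough numbers on it. MIL-STD-1553B runs at 1 Mbit/s, every word on the bus takes 20 bit-times, and each transfer carries a command word, a status word, a response delay, and an intermessage gap on top of the payload. The sketch below adds that up for a made-up message mix; the standard's timing parameters are real, but the traffic profile is invented for illustration and is not the actual EKV bus loading.

```python
# Back-of-the-envelope MIL-STD-1553B bus budget (illustrative only).
# Real standard parameters: 1 Mbit/s bit rate, 20 bit-times per word
# (3-bit sync + 16 data bits + parity), a 4-12 us RT response time,
# and a minimum 4 us gap between messages.
# The message mix below is hypothetical, not actual interceptor traffic.

BIT_TIME_US = 1.0           # 1 Mbit/s -> 1 microsecond per bit
WORD_US = 20 * BIT_TIME_US  # every 1553 word occupies 20 bit-times
RESPONSE_US = 8.0           # typical remote-terminal response time
GAP_US = 4.0                # minimum intermessage gap

def bc_to_rt_message_us(data_words: int) -> float:
    """Bus time for one BC->RT transfer: command word plus data words,
    then the remote terminal's status word after its response delay."""
    return (1 + data_words) * WORD_US + RESPONSE_US + WORD_US + GAP_US

# Hypothetical periodic traffic: (data words per message, messages per second)
traffic = [
    (32, 400),   # e.g. sensor frames at 400 Hz, max-length messages
    (16, 800),   # e.g. navigation/state vectors
    (4, 1000),   # e.g. housekeeping and commands
]

busy_us_per_second = sum(bc_to_rt_message_us(words) * rate for words, rate in traffic)
print(f"Bus utilization: {busy_us_per_second / 1e6:.0%}")
```

Even this made-up mix eats roughly 70 percent of the bus before anyone adds another sensor feed or a block upgrade, and the ceiling stays at 1 Mbit/s no matter how fast the processors behind it get.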

The demands on software were one of the first objections to the SDI program proposed by President Reagan. Twenty years into the computing revolution, a case study looking at software requirements then and now concludes that missile defense still “presents the most challenging real-world software engineering problem imaginable – to interpret real-time sensor data taken under natural conditions and appropriately handle an attack by an intelligent adversary likely to employ strategies that have not been fully anticipated.”

That case study is based on a series of debates hosted at MIT in the 1980s and is available on-line: Kevin W. Bowyer, “Star Wars Revisited – A Continuing Case Study in Ethics and Safety-Critical Software,” IEEE Technology and Society 21:1 (Spring 2002), 13-26.

Comments

  1. Mark Gubrud (History)

    Both Obering’s comments and Shachtman’s report indicate that the problem was not exactly software, but rather an overloaded hardware data bus, and the “software glitch” was just a setting that, if changed, would have allowed the interceptor to ignore the hardware overload (a sketch of that kind of setting appears after these comments). This is not quite, or really not at all, the same as the general issue of battle management software complexity that dominated the 1980s debate. On the other hand, it is yet another illustration of the point that when you are pushing system complexity and performance to the limits of the achievable, something is likely to fail. If it ain’t the software, it’s the booster, or it’s a plumbing line, or it’s a sensor… and the only question left is what it is going to be on the day the balloon goes up.

  2. Mark Gubrud (History)

    Also, the overloading of the data bus isn’t really a function of the speed of the missile, which in this case was zero mph. It’s a function of system complexity, how many messages are being pushed onto the bus, how lax the software designers are being about padding the messages with headers, handshakes, room for future changes and so on. This could have led to underestimating the needed bandwidth. The milliseconds of jitter that Shachtman talks about are no good for the end game, but shouldn’t pose a problem for the booster.
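To put a rough number on Gubrud’s jitter point: at exoatmospheric closing speeds on the order of 10 km/s (an order-of-magnitude assumption, not a figure from the test), a few milliseconds of unmodeled latency becomes tens of meters of position error, which is enormous next to a kill vehicle well under a meter across but irrelevant to a booster flying a programmed trajectory.

```python
# Order-of-magnitude check on the jitter point (illustrative numbers only).
closing_speed_m_s = 10_000  # ~10 km/s closing speed, a rough assumption
for jitter_ms in (1, 3, 5):
    error_m = closing_speed_m_s * (jitter_ms / 1000)
    print(f"{jitter_ms} ms of jitter -> ~{error_m:.0f} m of position uncertainty")
```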
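And on Gubrud’s first comment: the “setting” in question sounds like a limit on how many missed bus messages the flight software will tolerate before it calls off the launch. Here is a minimal sketch of that kind of logic; the window length, limit, and names are all invented for illustration rather than taken from the actual GMD code.

```python
# Minimal sketch of an "ignore the drop-outs" setting: count missed bus
# messages in a sliding time window and abort the launch if the count
# exceeds a configurable limit. All parameters and names are hypothetical.

from collections import deque

class DropoutMonitor:
    def __init__(self, window_s: float = 1.0, max_missed: int = 5):
        self.window_s = window_s
        self.max_missed = max_missed  # the tunable limit in question
        self.missed = deque()         # timestamps of missed messages

    def record_missed(self, t: float) -> None:
        self.missed.append(t)

    def abort_required(self, now: float) -> bool:
        # Discard misses that have aged out of the window, then compare
        # what is left against the configured limit.
        while self.missed and now - self.missed[0] > self.window_s:
            self.missed.popleft()
        return len(self.missed) > self.max_missed

# Example: ten misses in a quick burst exceed a limit of 5 and force an abort.
mon = DropoutMonitor(window_s=1.0, max_missed=5)
for i in range(10):
    mon.record_missed(0.01 * i)
print("Abort?", mon.abort_required(now=0.1))  # True with the tight limit
```

With the limit set tightly, a burst of drop-outs from a saturated bus forces an abort; raise it and the software rides through the same burst, which is presumably what changing the “very minor” setting amounts to.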