Software Bugs: What Wall Street can Learn from Mars

Posted on August 3, 2012


In the last few months we’ve seen two spectacular software failures on Wall Street. Most recently, on Wednesday, August 1, Knight Capital’s trading software fired off in one hour trades that should have taken place over days. The flurry of trades caused erratic pricing for 150 US stocks. Knight is said to have lost $440M in the debacle.

Earlier in the year, the Facebook IPO was hampered by the NASDAQ software system designed to transition trading from an initial pricing to active trading. The base software is the same as that used for the “opening cross,” which transitions from the first trade of the day (the opening price) to continuous trading. There were at least two differences NASDAQ’s code faced that day:

  1. A “small change” in the software to reflect slightly different rules for IPOs versus the normal cross, and
  2. Massive volume. The normal opening cross has nowhere near the volume, and therefore the network load, that this IPO attracted.

I’m not privy to the details; in fact, we may never know exactly what happened. But my guess is that the software was tested against each of these issues in isolation, and I’ll bet it wasn’t tested under both conditions at once. When the system went live, it choked. The ensuing delays and pricing problems rippled through the markets and cost many individual investors a fortune.
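A guess like this is cheap to guard against: enumerate the test factors and exercise every combination, not each factor alone. A minimal sketch in Python, with purely hypothetical factor names and a stub standing in for the real system (nothing here reflects NASDAQ's actual test setup):

```python
import itertools

# Hypothetical test factors for a cross-like event: which rule variant is
# active, and how heavy the order flow is. Names and numbers are illustrative.
rule_variants = ["normal_opening_cross", "ipo_cross"]
volumes = [1_000, 100_000, 10_000_000]  # orders, illustrative magnitudes

def run_simulated_session(rule_variant, volume):
    """Stand-in for driving the real system in a test harness."""
    return {"rules": rule_variant, "volume": volume, "completed": True}

# Exercise the full cross-product of conditions, not each factor in isolation.
results = [run_simulated_session(r, v)
           for r, v in itertools.product(rule_variants, volumes)]

assert all(res["completed"] for res in results)
print(f"ran {len(results)} combined scenarios")  # 2 variants x 3 volumes = 6
```

The point is the shape of the test matrix: a suite that runs the IPO rule change at low volume and the normal rules at high volume still leaves the one cell that mattered, high volume under the new rules, untested.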

The failures at NASDAQ and Knight seem to be linked to software that was tested, but not in ways that exposed the bugs.

The importance of Computer Science versus mere programming

I hope these clouds will have a silver lining: namely, that they will highlight the importance of computing in finance, and of rigorous testing and sound software development principles.

These failures remind me of NASA’s Mars Climate Orbiter, which crashed several years ago because two software modules had never been tested together. One module was sending commands in English units (feet and pounds) while the other was expecting commands in metric.
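The failure mode is easy to reproduce in miniature. In the sketch below, one module emits a value in pound-force seconds while its consumer assumes newton-seconds (the function names and numbers are hypothetical; the real interface involved thruster impulse data, not these toy functions):

```python
LBF_S_TO_N_S = 4.44822  # newton-seconds per pound-force second

def ground_module_impulse():
    """Returns a thruster impulse -- in pound-force seconds (imperial)."""
    return 100.0

def flight_module_apply(impulse_n_s):
    """Expects newton-seconds; silently wrong if fed imperial units."""
    return impulse_n_s / 1000.0  # toy delta-v for a 1000 kg spacecraft

# The bug: nothing at the interface checks or converts units.
wrong = flight_module_apply(ground_module_impulse())

# The fix: convert at the boundary (or carry units in the type system).
right = flight_module_apply(ground_module_impulse() * LBF_S_TO_N_S)

# The two answers differ by a factor of ~4.45 -- enough to miss Mars.
assert abs(right / wrong - LBF_S_TO_N_S) < 1e-9
```

An integration test that drove both modules together with a known input would have exposed the factor-of-4.45 discrepancy immediately; testing each module against its own assumptions never could.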

One reason this example occurs to me is that I worked one summer at NASA/JPL. I observed a culture there that I think may exist on Wall Street as well. At NASA, Science is king, Engineering is important, but software is just the glue you slap on at the last minute. In fact, the summer I was there, while the Sojourner rover was on its way to Mars, its software was still in development. Sojourner was a success, but I honestly think they just got lucky, and their luck ran out with the Climate Orbiter.

NASA has changed since then, but we need Wall Street to change as well. Firms need to move from viewing software merely as ‘something to build’ to seeing it as their lifeblood. Investment banks and brokerages are really software companies, but I’m not sure they realize that yet.

Perhaps along with a change in culture, we’ll see a change in practice. I hope we’ll see firms apply methodologies such as black box testing, which exercises a program strictly through its external interfaces, including under external stresses. So for instance, the NASDAQ and Knight Capital problems might have been prevented if the programs had been placed under simulated market conditions. These sorts of tests reveal potential problems that are not exposed otherwise.
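As a concrete sketch, here is what a black box stress test looks like against a toy order matcher (the matcher and its invariants are hypothetical stand-ins, not real exchange code; the point is the testing style, not the algorithm):

```python
import random

def match_orders(orders):
    """Pair the highest bids with the lowest asks; return executed trades.

    This toy matcher is the 'black box': the test below drives it only
    through this public interface, never inspecting its internals.
    """
    buys = sorted((p for side, p in orders if side == "buy"), reverse=True)
    sells = sorted(p for side, p in orders if side == "sell")
    trades, i = [], 0
    while i < len(buys) and i < len(sells) and buys[i] >= sells[i]:
        trades.append((buys[i], sells[i]))
        i += 1
    return trades

# Black box stress test: generate a large randomized order flow, feed it in
# through the inputs alone, and check externally observable invariants.
random.seed(42)  # reproducible "simulated market conditions"
orders = [(random.choice(["buy", "sell"]), random.randint(90, 110))
          for _ in range(100_000)]
trades = match_orders(orders)

assert trades, "heavy simulated load should still produce trades"
for buy_price, sell_price in trades:
    # Invariant: a trade never executes a buy below its matched sell.
    assert buy_price >= sell_price
```

The invariants here are deliberately simple; in practice you would also check timing, throughput, and behavior at the volume extremes that the NASDAQ and Knight incidents apparently hit.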