I am a technoptimist, a term coined by Harvard history professor Niall Ferguson for the glass-half-full group who optimistically envision a technologically enabled future. Advances in science and technology have produced dramatic changes across all aspects of our daily lives and will continue to do so. Yet Niall takes a surprisingly cynical outlook on technology's continuing impact. By taking a narrow, quarter-century look back, his conclusions are oddly pessimistic, especially for a historian. It was not all that long ago that a fossilized society scoffed at Louis Pasteur and his germ theory. Now the invisible is the basis for scientific breakthroughs, from the Higgs boson to the human genome, the genetic recipe of man poised to revolutionize healthcare. Information accessibility has advanced from the era of chained libraries to a price/performance point where the world's knowledge is instantly available to everyone on handheld devices more powerful than anything the brilliant Alan Turing could have dreamed of. Yet Niall's self-professed pessimism is a reactionary reflex endemic to the human condition, even as we park a one-ton vehicle on Mars.
The acerbic reaction to Knight Capital's technology glitch has elicited a similar cynical pessimism about technology's role in the markets. The code bug has ignited the techno-luddites with warnings of market mayhem and portents of another flash crash. Knight's failure exposes a latent, Skynet-driven fear of technology, with HFT critics celebrating the old days and longing to bring back the human element of market making. Sometimes that humanity has a way of becoming dubious: the manipulation of the Libor benchmark interest rate, now dubbed organized fraud and the biggest financial scandal ever, reeks of that human touch.
HFT is a capitalistic pursuit that provides liquidity, and as Larry Tabb points out, complex markets require more sophisticated data management platforms. U.S. markets are the Usain Bolt of trading compared to simpler markets such as China's; that market is like holding the Olympics in the middle of nowhere.
Knight's misstep is not an indictment of the whole algo-driven industry. In the days following, the market rallied, closing the week on a positive note (the S&P 500 climbed 1.1 percent, the Nasdaq advanced 1.8 percent). Pessimism is a cynical attitude some choose to ignore, thankfully. Trading firms do not test algorithms out of an altruistic, market-integrity motive; they test to ensure robustness, accuracy and profitability. That is core to sustaining the business, and the investment is heavy. It begins with unit testing, proceeds through integration and acceptance testing, and ends with analyzing test results for regressions and validation before a production rollout. However, bugs always occur; that is the nature of software. The more complex the environment, the greater the likelihood of corner cases that cause failures. Those can arise from spurious conditions due to volume, volatility or market structure, or from combinatorial situations that are extremely difficult to replicate in a test environment. That environment should consist of a data management framework that includes both deep history and live content.
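To make the testing pipeline concrete, here is a minimal sketch of the unit-test stage: a hypothetical price-collar check and the kind of regression assertions a firm might run before a production rollout. The function name, the 5 percent collar, and the corner cases are all invented for illustration, not drawn from any firm's actual checks.

```python
def within_price_collar(order_price, reference_price, collar_pct=0.05):
    """Hypothetical pre-trade check: accept an order only if its price is
    within collar_pct of the reference price. Rejects on a bad reference."""
    if reference_price <= 0:
        return False  # corner case: missing or broken reference data
    return abs(order_price - reference_price) / reference_price <= collar_pct

# Unit tests of the sort run (and re-run, watching for regressions)
# before any production rollout.
assert within_price_collar(100.0, 100.0)      # at the reference price
assert within_price_collar(104.9, 100.0)      # just inside the 5% collar
assert not within_price_collar(110.0, 100.0)  # outside the collar
assert not within_price_collar(100.0, 0.0)    # corner case: bad reference
```

The point is not the collar itself but the discipline: each corner case found in production becomes another assertion that guards every future release.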
Unfortunately, Knight's debacle has regulators spinning on the issue of "properly tested" code, something they would have little recourse to rectify or regulate. Algos and their test environments are highly unique to individual trading styles. Saying the SEC is outgunned, invoking the analogy of bringing a knife to a gun fight, simply highlights a pessimistic, techno-naive attitude. Regulators cannot prevent the gun fight, nor should they; the fight is between participants. But they can ensure each side has to shoot through the same size gun barrel.
The technoptimist would see opportunity in Knight's failure. Just as the sinking of the Titanic exposed a tragic failure in passenger safety and set the stage for grand improvements in maritime safety and communication standards, Knight's failure sets a clear path to improvements in pre-trade risk, a functional area of trading that regulators can more definitively grasp.
Pre-trade risk is a toll that some firms choose to pay of their own volition and others pay by regulatory mandate (SEC Rule 15c3-5, the Market Access Rule). It is a function independent of the trading systems, designed to monitor order flow and act as a gatekeeper. Specific checks can prevent algos from inadvertently misbehaving (i.e., going rogue) due to bad code. There are the typical thresholds on order price and quantity that can prevent the "fat finger" error, and also checks to detect more insidious issues. One that likely caught Knight flat-footed is an order throttle to detect runaway order activity. The FIX Protocol organization has released a set of practical guidelines for pre-trade risk that defines such runaway checks as follows:
Runaway Checks: The purpose of this type of check is to identify the scenario where a client’s trading algorithm has stopped working correctly and/or is no longer under control of the client. Specific examples of metrics to compare are:
- The ratio of order cancels or cancel/replaces to new orders is unusually high relative to the client’s historical trading patterns.
- The ratio of orders to executions is unusually high.
- Trading patterns indicating the algorithm may have gone into a loop (e.g. repeatedly sending an order and then canceling it).
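The metrics above reduce to simple ratio comparisons, which makes them easy to sketch in code. The following is an illustrative monitor in the spirit of those guidelines, not an implementation of them; the class name, the event hooks, and the threshold values are all invented for this example.

```python
class RunawayMonitor:
    """Illustrative runaway check: flag a session whose cancel-to-new-order
    ratio or order-to-execution ratio exceeds a limit. Real systems would
    compare against the client's historical patterns; thresholds here are
    invented placeholders."""

    def __init__(self, max_cancel_ratio=0.9, max_order_exec_ratio=50.0):
        self.new_orders = 0
        self.cancels = 0
        self.executions = 0
        self.max_cancel_ratio = max_cancel_ratio
        self.max_order_exec_ratio = max_order_exec_ratio

    def on_new_order(self):
        self.new_orders += 1

    def on_cancel(self):
        self.cancels += 1

    def on_execution(self):
        self.executions += 1

    def is_runaway(self):
        if self.new_orders == 0:
            return False
        cancel_ratio = self.cancels / self.new_orders
        # Guard against division by zero before anything has executed.
        order_exec_ratio = self.new_orders / max(self.executions, 1)
        return (cancel_ratio > self.max_cancel_ratio
                or order_exec_ratio > self.max_order_exec_ratio)
```

A looping algorithm that sends and immediately cancels a hundred orders drives the cancel ratio toward 1.0 and trips the check; a healthy session, where orders routinely execute, stays well under both limits.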
Given that definition, it seems rather plausible that Knight chose not to enable such a pre-trade check.
Algorithmic trading is advancing rapidly, driven by sophisticated algo-development frameworks and a complex global market structure. While test infrastructure can validate new algorithms, it is pre-trade risk that is the sentinel safeguarding the castle and the kingdom. IOSCO's recent Consultation Report addresses the pre-trade risk side of market surveillance by describing what is expected of market authorities to "prevent disruptions to orderly trading and maintain ongoing confidence…".
Pre-trade risk is a well-defined set of functions on order flow; it includes a common set of limit checks that cuts across all trading firms. Like the Market Access Rule, it can be pervasively implemented across the market and has the opportunity to prevent rogue algos from wreaking havoc. Regulators would be well served to investigate this further instead of wasting time wielding a knife in the test kitchen.
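One of those common limit checks, the order throttle mentioned earlier, can be sketched as a sliding-window rate limiter sitting between the algo and the market. This is a simplified illustration under assumed limits (100 messages per second), not any exchange's or broker's actual gatekeeper.

```python
from collections import deque

class OrderThrottle:
    """Illustrative order throttle: reject a new message once more than
    max_msgs have been sent in the trailing window_sec seconds. The
    limits are invented for this sketch."""

    def __init__(self, max_msgs=100, window_sec=1.0):
        self.max_msgs = max_msgs
        self.window_sec = window_sec
        self.timestamps = deque()

    def allow(self, now):
        # Drop timestamps that have aged out of the trailing window.
        while self.timestamps and now - self.timestamps[0] > self.window_sec:
            self.timestamps.popleft()
        if len(self.timestamps) >= self.max_msgs:
            return False  # gatekeeper rejects: possible runaway activity
        self.timestamps.append(now)
        return True
```

Because the check is independent of the trading logic, a bug that floods the market with orders hits the throttle regardless of what the broken algorithm believes it is doing.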
Once again thanks for reading.
For occasional opinions and commentary on technology in Capital Markets, you can follow me on Twitter, here.