Time Precision, Measuring Which Came First in the Race To Zero

There are many things in life measured and inspected with the finest of detail. The quality of a diamond can really only be judged under the close inspection of a jeweler's loupe. A politician's every word is examined and analyzed by naysayers as they circle like vultures, on the lookout for any opportunity to extend their talons to scratch and claw. Time, that venerable gem, is always under intense scrutiny in the financial markets and was the subject of a recent article in the Financial Times.

Time is a precious commodity in the trading world; it is inspected, analyzed, and squeezed. As this article on the speed of trading describes, systems have made a steady progression in their reduction of time, specifically latency, over the past few years. Not so long ago the benchmark standard was the millisecond. The advent of faster processors, co-location, faster networks, FPGA assist, and faster disk I/O subsystems have all contributed to the millisecond's demise and the rise of even finer levels of resolution. While reduction in end-to-end latency is the goal of trading systems, a by-product of these faster systems is increased throughput: as systems become capable of trading faster, they increase the volume, or density, of data as well. The FX market alone has seen spot transactions grow by 48% in the past 3 years; OPRA has gone from a peak of 500,000 messages per second in 2008 to over 2 million in January of this year. The volume and density of that data is almost unimaginable; for those old enough to remember, and who weren't grade school hooligans 20 years ago, such rates were a realm known only to science fiction.
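Some back-of-the-envelope arithmetic shows why those message rates push timestamping to finer resolution. A rough sketch, using only the OPRA peak rate cited above (the uniform-rate assumption is a simplification; real feeds are bursty, which makes the problem worse at the peaks):

```python
# At OPRA's cited ~2 million messages-per-second peak, how many messages
# would share a single timestamp value at each level of clock resolution?
# (Assumes a uniform arrival rate for illustration only.)

PEAK_MPS = 2_000_000  # messages per second, January peak cited above

for name, ticks_per_second in [("millisecond", 1_000),
                               ("microsecond", 1_000_000),
                               ("nanosecond", 1_000_000_000)]:
    msgs_per_tick = PEAK_MPS / ticks_per_second
    print(f"{name:>11}: ~{msgs_per_tick:g} messages per clock tick")
```

At millisecond resolution, roughly 2,000 messages collapse into a single indistinguishable timestamp; even at microsecond resolution, two messages can still share a tick. Only at the nanosecond level does each message get breathing room.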

Larry Tabb, in a recent interview entitled Measuring Nanosecond Latencies, noted that speed is the final frontier. It seems the buzz has hardly died down on the microsecond, and now we've zoomed past it to nanosecond precision. Larry comments: "…shocking how fine the measurements are getting… if liquidity is being provisioned for a nanosecond, I better be able to react to it…". NYSE/Wombat is already time stamping its market data to nanosecond precision; I'm sure others are as well, and many more are soon to follow. Corvil, announcing its newest nanosecond latency monitor, declared that "…microseconds are no longer sufficient to track performance". There are even some vendors, such as OneTick, that support picosecond precision in their time-series analytical functions, providing a means to analyze data and measure with incredible accuracy. Sure, no commercial data provider that I know of is time stamping data at that level of precision today, but not so long ago we didn't have microsecond granularity either, and there are still those that haven't progressed beyond the millisecond. Will we ever get to the pico level for trading systems? I find that highly unlikely; the barriers keep getting higher as we approach zero. But this does signal the demise of the old benchmark standard. The millisecond is old news, and the microsecond is fast approaching the same fate as the nanosecond becomes the next yardstick.
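Nanosecond-resolution timestamps are now within reach of even commodity software stacks. A minimal sketch of what that looks like at the application layer, using Python's standard library (production feed handlers would of course timestamp in hardware or at the NIC, not in an interpreted language):

```python
import time

# Wall-clock epoch timestamp in integer nanoseconds, suitable for
# stamping an inbound market data message.
t_receive = time.time_ns()

# A monotonic nanosecond counter, suitable for measuring elapsed
# handling latency without being skewed by wall-clock adjustments.
start = time.perf_counter_ns()
# ... handle the message ...
elapsed_ns = time.perf_counter_ns() - start

print(f"received at {t_receive} ns since epoch, handled in {elapsed_ns} ns")
```

Note the distinction: the wall clock answers "when did this happen?" while the monotonic counter answers "how long did it take?", and only the latter is safe for latency measurement because it can never step backward.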

For a number of years I've often used the phrase "the race in the microsecond" when referring to HFT, but twice now in the past few weeks I've heard a new one: "the race to zero", a theoretical goal of course, as if Planck time were achievable, but telling in that we've crossed the microsecond barrier and are heading toward the next one. The laws of physics will get in the way, of course; we can't break free of Newton's grip. A rip in the space-time continuum only happens when trekking around the galaxy.

Time precision is important for many reasons. Victor Yodaiken discusses time drift across servers in an article in Automated Trader, concluding that the current NTP standard has been outmoded by today's precision-driven markets. Trading takes place across a multiplicity of networks: equities are hedged on the futures market and arbitraged across a number of exchange connections, where minute differences in time can have an enormous impact on trading systems. The demand to measure precisely is driven by increased trade volumes; data density packs more trades and quotes into every moment. Measuring execution times, comparing trade venues for best execution, determining slippage, measuring current trades against the prevailing quotes, asking which came first: all of this guides trade analysis and surveillance systems as well, and drives time precision to finer and finer granularity. As Victor states: "it's the quality of time data that is attracting the attention of the most aggressive and technologically advanced trading firms…".
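The "which came first?" question becomes concrete when the two timestamps come from different servers whose clocks drift relative to one another. A minimal sketch of the idea: two events can only be ordered confidently when the gap between their timestamps exceeds the combined clock uncertainty. The drift bounds below are hypothetical illustrative figures, not vendor specifications:

```python
def happened_before(ts_a_ns, ts_b_ns, uncertainty_ns):
    """Return True if A certainly preceded B, False if B certainly
    preceded A, or None if the gap falls within clock uncertainty."""
    if ts_a_ns + uncertainty_ns < ts_b_ns:
        return True
    if ts_b_ns + uncertainty_ns < ts_a_ns:
        return False
    return None  # clocks too coarse or too drifted to tell

# Two quotes stamped 200 microseconds apart on different servers.
# With NTP-class sync (error bound on the order of a millisecond),
# the order cannot be established:
print(happened_before(1_000_000, 1_200_000, uncertainty_ns=1_000_000))  # None

# With microsecond-class sync, the same gap is clearly resolvable:
print(happened_before(1_000_000, 1_200_000, uncertainty_ns=1_000))      # True
```

This is exactly why coarse or drifting clocks undermine surveillance and best-execution analysis: below the uncertainty bound, ordering questions simply have no answer.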

So where will it end, and what is the next leap in latency reduction? Will firms demand that vendors time stamp data with even finer levels of precision? We will eventually hit an unbreakable barrier; logic dictates as much. But trying to answer these time-tested questions is like trying to answer that age-old one: which came first, the chicken or the egg?

Once again thanks for reading.
Louis Lovas

For an occasional opinion or commentary on technology in Capital Markets, you can follow me on Twitter here.

About Louis Lovas

Director of Solutions, OneMarketData, with over 20 years of experience in developing cutting edge solutions and a leading voice in technology and trends in the Capital Markets industry.
