Data management – Speed, Quality or Volume

Recently Getco announced two grim figures: net income fell 82 percent and revenue declined 41 percent. Even against the backdrop of the January Effect, which has created a sugar high for the S&P 500 (a winning streak just shy of its all-time high of 1565.15, set in 2007), Getco’s losses are sobering news. And they coincide with the precipitous drop in stock-market volume coupled with erratic volatility. This streak of withering volumes and volatility has persisted through much of 2012 and is likely to continue well into 2013. It’s looking like (un)lucky 13 is a fait accompli. Just saying.

A new normal of uncertainty pervades the markets. The combination of economic turmoil and regulatory overhang may seem like the end of the world and the demise of proprietary trading. However, there will always be winners and losers. This new normal defines a benchmark, a barometer that will create a separation: those that choose to embrace the new reality and leverage tools and technology to ensure their success, and those that will indict a scapegoat as they sink into oblivion.

Data takes center stage in conquering the new normal. An in-depth understanding of market behaviors and structure, and optimal price transparency, must be achieved to define and calibrate trade models and to know trade performance. A key to sustainable success is learning what you do not know; successful companies constantly challenge assumptions, looking for validation. While the squeeze is on because of market uncertainty, the desire to survive will push firms to get a better handle on managing data and ensuring its quality. Here’s a short Q&A to that end.

What is more integral, to go faster or to do more without getting slower?

What can be done in a microsecond?  It goes without saying that technology plays a critical role in the trade life cycle – the evolution of technology goes hand-in-hand with innovation in trading. The most important areas include:

  1. Fast access to market data.  Quant strategy decision making is critically dependent on the immediacy of pricing data, often from multiple sources where, for example, order books need to be consolidated and possibly currency converted.
  2. Strategy execution performance.  High-frequency strategies are typically first written in C++ for speed and then leverage a trade infrastructure, such as an analysis GUI working integrally with CEP, to provide the optimal combination for speed.
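Consolidating order books from multiple sources, as point 1 describes, can be sketched in a few lines. This is a minimal illustration, not a production feed handler: the venue names, price levels and the EUR/USD rate are all illustrative assumptions.

```python
# Sketch: consolidating bids from two hypothetical venues into one view,
# normalizing a EUR-quoted book into USD before merging.

EURUSD = 1.30  # assumed spot rate for the conversion

# (price, size) bid levels, best first, from two hypothetical venues
nyse_bids_usd = [(100.02, 300), (100.01, 500)]
xetra_bids_eur = [(76.95, 200), (76.94, 400)]

def to_usd(book_eur, fx):
    """Convert a EUR-quoted book to USD at the given spot rate."""
    return [(round(px * fx, 4), sz) for px, sz in book_eur]

def consolidate(*books):
    """Merge books and sort bids best (highest) first."""
    merged = [level for book in books for level in book]
    return sorted(merged, key=lambda level: level[0], reverse=True)

nbbo_bids = consolidate(nyse_bids_usd, to_usd(xetra_bids_eur, EURUSD))
print(nbbo_bids[0])  # best consolidated bid across both venues
```

In practice the conversion rate itself is a streaming input, and the merge must happen at microsecond timescales, which is exactly why fast access to all the contributing feeds matters.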

In general, this question is rooted in what is behind quantitative trading and what exactly quantitative analysts do. They apply an empirically tested, rules-based approach to understanding markets, exploiting inefficiencies manifested by human behavior, geopolitical events and market structure. The end result is model creation, validation and optimization to prove profitability. Ultimately, quants look to get the most out of mathematical models, which are increasing in complexity, by leveraging sophisticated tools to design and backtest strategies. We are in the midst of a negative divergence in the equity markets, where prices continue to climb even as volumes and volatility fall.  When will momentum shift and a reversal occur?  The Holy Grail is really to outsmart the crowd, and that does not imply being faster but smarter.  In devising those smarter strategies, quants will demand access to deeper global history and use observations from the past to characterize relationships, understand behaviors and explain future developments.  This is the next algo battleground.
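The design-and-backtest loop described above can be reduced to a toy sketch: apply a rule to history, measure the profit it would have produced, and only then consider deploying it. The price series, window lengths and one-unit position sizing here are illustrative assumptions; a real study would run against deep tick history, not ten daily closes.

```python
# Minimal backtest sketch: a moving-average crossover rule evaluated
# over a synthetic price history.

prices = [100, 101, 103, 102, 104, 106, 105, 107, 108, 107]

def sma(series, n):
    """Simple moving average of the trailing n points."""
    return sum(series[-n:]) / n

def backtest(prices, fast=2, slow=4):
    """Hold one unit long whenever the fast SMA is above the slow SMA.

    The position at step i is decided using only prices[:i], so the
    rule never peeks at the price it is about to be scored against.
    """
    pnl, position = 0.0, 0
    for i in range(slow, len(prices)):
        history = prices[:i]
        position = 1 if sma(history, fast) > sma(history, slow) else 0
        pnl += position * (prices[i] - prices[i - 1])
    return pnl

print(backtest(prices))
```

The point of the sketch is the validation discipline, not the rule itself: the same harness lets a quant challenge an assumption by swapping in different parameters or history and seeing whether profitability survives.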

Can a company succeed without fast, high quality data?

All trading and business decisions today are critically dependent on timely, high-quality data. The scope of data analysis is extracting effective value from large data sets in a timely manner, for trade-related decision management including econometric trade modeling, risk analysis and cost analysis.

Achieving alpha and discovering new investment opportunities demands sophistication in tooling, analytics and data management.

New ideas for measuring data quality.

The need for high-quality data cuts across all aspects of the trade lifecycle. Trade-related solutions such as model back-testing, portfolio mark-to-market and compliance depend on a high degree of data quality, where accuracy is vital to determining outcomes.

Tick data is often derived from many sources; there are 13 major exchanges in the U.S. alone, and across Asia there are 15 different markets with different technologies, cost structures, regulations and cultures. That’s a natural barrier to algorithmic trading, and it creates unique challenges for trading efficiently across them, recognizing their differences in both market microstructure and regulation. The determinants of price discovery, volume and trading patterns define a structure unique to each market, asset class and geography, influenced by participants and current regulation. Collecting and analyzing data across this wide geography demands sophisticated tools and systems as firms look to discover the unique determinants of transaction costs, prices and trading behavior to devise smarter trading models.

Data has to be scrubbed clean by applying reference data: cancels/corrections, corporate actions (splits, dividends) and exchange calendars.  Crossing borders means global symbologies, requiring symbol maps and currency conversions.
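A corporate-action adjustment is a concrete example of that scrubbing. The sketch below back-adjusts a price history for a 2-for-1 split so the series stays continuous; the split date, ratio and raw prices are illustrative assumptions.

```python
# Sketch: back-adjusting a daily price history for a stock split,
# one of the reference-data corrections applied during scrubbing.

raw = [("2012-06-01", 80.0), ("2012-06-04", 82.0),
       ("2012-06-05", 41.5), ("2012-06-06", 42.0)]  # split effective 06-05

def adjust_for_split(series, split_date, ratio):
    """Divide pre-split prices by the ratio so the history is continuous.

    ISO-format date strings compare correctly as plain strings.
    """
    return [(d, p / ratio if d < split_date else p) for d, p in series]

adjusted = adjust_for_split(raw, "2012-06-05", 2.0)
print(adjusted)  # pre-split prices become 40.0 and 41.0
```

Without this correction, a backtest would see an apparent 50 percent overnight crash on the split date and draw entirely the wrong conclusion.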

For risk management, data quality goes one step further for accuracy: statistical scrubbing algorithms. Analytically cleansed prices ensure that chosen ‘snaps’ are within tolerance levels (e.g. within 0.5 standard deviations of the last 2 minutes) when used for risk compliance and calculating yields.
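That tolerance check is simple to express. The sketch below flags a snap that falls outside 0.5 standard deviations of the trailing window’s mean; the window contents and sample prices are illustrative assumptions.

```python
# Sketch of the statistical scrub described above: accept a price
# 'snap' only if it is within a std-dev tolerance of the trailing window.

import statistics

def snap_ok(window, snap, tolerance=0.5):
    """True if the snap is within `tolerance` std devs of the window mean."""
    mu = statistics.mean(window)
    sigma = statistics.pstdev(window)  # population std dev of the window
    return abs(snap - mu) <= tolerance * sigma

last_two_minutes = [100.0, 100.1, 99.9, 100.05, 99.95]
print(snap_ok(last_two_minutes, 100.02))  # within tolerance
print(snap_ok(last_two_minutes, 101.50))  # flagged as an outlier
```

A rejected snap would typically be replaced by the previous good value or an interpolation rather than silently dropped, so downstream yield and compliance calculations always have a price to work with.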

Measuring data quality for TCA is about bringing together disparate data sources. TCA measures broker performance, identifies outliers and measures trades against benchmarks. It looks at intra-day execution efficiencies by measuring and monitoring executions against those benchmarks, such as arrival price and market price. The ability to accurately derive these benchmark prices at the precise time of trade execution is a cornerstone of understanding trade performance, and it demands high-quality data and its reliable capture and storage.
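As a minimal illustration of one such benchmark comparison, the sketch below computes a buy order’s implementation shortfall against its arrival price, in basis points. The fill prices, sizes and arrival price are illustrative assumptions.

```python
# Sketch: implementation shortfall of a buy order versus its
# arrival-price benchmark, expressed in basis points.

def shortfall_bps(fills, arrival_price):
    """Size-weighted average fill price versus arrival, in bps (buy side)."""
    total_size = sum(sz for _, sz in fills)
    avg_px = sum(px * sz for px, sz in fills) / total_size
    return (avg_px - arrival_price) / arrival_price * 10_000

fills = [(50.02, 300), (50.04, 500), (50.05, 200)]  # (price, shares)
print(round(shortfall_bps(fills, arrival_price=50.00), 2))  # bps of slippage
```

Everything here hinges on the arrival price being captured accurately at the moment the order hit the market; a benchmark derived from stale or unscrubbed data makes the whole measurement meaningless.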

Will high latency or poor quality data affect your business?

Firms are capturing data for back-testing new and evolving trade models in the never-ending hunt for alpha, for risk management to monitor and control exposure, and for transaction cost analysis to measure trade performance.

Market data comes in many shapes, sizes and encodings. It continually changes and requires corrections and an occasional tweak. The challenge of achieving timely data quality in this data dump is dealing with the vagaries of multiple data sources and managing a sea of reference data for …

  •  Mapping ticker symbols across a global universe of exchanges and geographies
  •  Tying indices to their constituents
  •  Maintaining tick-level granularity
  •  Ingesting cancellations and corrections
  •  Inserting corporate-action price and symbol changes
  •  Detecting gaps (missing data) in history
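The last item in the list above, gap detection, reduces to comparing what was captured against an exchange calendar. The calendar and the captured dates below are illustrative assumptions.

```python
# Sketch: detecting gaps in a captured daily history by comparing it
# against the exchange calendar of expected trading days.

expected = ["2013-01-02", "2013-01-03", "2013-01-04",
            "2013-01-07", "2013-01-08"]            # exchange trading days
captured = {"2013-01-02", "2013-01-03", "2013-01-07", "2013-01-08"}

# Any expected trading day absent from the capture is a history gap.
missing = [day for day in expected if day not in captured]
print(missing)  # history gap on 2013-01-04
```

The same pattern extends to intra-day data, where the calendar becomes a grid of expected bars or an expected minimum tick rate per session.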

Any and all of these factors are vital to data quality and the science of quantitative trade modeling.  As Paul Rowady of the TABB Group notes, “Reference data is like the glue that binds disparate data together. Without it, data can take no shape … In short, big data is dumb data without some glue”.

Effective market data management is about linking disparate data sets under some common thread to tease out an intelligible answer, a diamond from a mountain of coal. It’s about finding cross-asset correlations or understanding how best to hedge a position to offset risk; these are the underpinnings of profitable trade models, portfolio management and transaction cost analysis.

Low latency is the ante to the game in the trading business; it is evolutionary, not revolutionary. The evolution of technology goes hand-in-hand with trade innovation. Firms have to closely monitor latency; it is a tax everyone must pay to play. The goal is good models and a deep understanding of the market through deep data research, coupled with infrastructure that does not constrain.
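Monitoring that latency tax usually means summarizing measured delays into percentiles rather than averages, since it is the tail that hurts. The sample latencies below are illustrative assumptions.

```python
# Sketch: summarizing measured tick-to-trade latencies (microseconds)
# into nearest-rank percentiles for monitoring.

def percentile(samples, p):
    """Nearest-rank percentile of a sample list (p in 0-100)."""
    ordered = sorted(samples)
    k = max(0, int(round(p / 100 * len(ordered))) - 1)
    return ordered[k]

latencies_us = [42, 45, 41, 44, 90, 43, 46, 44, 43, 250]
print(percentile(latencies_us, 50), percentile(latencies_us, 99))
```

The median looks comfortable here, but the 99th percentile exposes the outliers a mean would smooth over, which is exactly the behavior a latency budget has to account for.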

Data management is the centerpiece of that goal, touching all aspects of the trade lifecycle, from quant research, model design/development/back-test and deployment, to risk and cost analysis.

Uncertainty will create diversity as firms expand their businesses to be multi-asset and multi-broker. In doing so, they need to improve their visibility and manage trade costs through TCA solutions that accurately measure costs so orders can be executed more efficiently, profiling trader and broker behavior, looking at intra-day execution efficiencies and highlighting the impact of delays on strategy performance.

The quality of data ultimately defines an end game, that Holy Grail for sustaining the profitability of the business. Firms should not lose sight of that.

Once again thanks for reading.
Louis Lovas

For an occasional opinion or commentary on technology in Capital Markets, you can follow me on Twitter.

About Louis Lovas

Director of Solutions, OneMarketData, with over 20 years of experience in developing cutting edge solutions and a leading voice in technology and trends in the Capital Markets industry.
