Time Advantage to Analytical Inquiry: the Value of Visual Data Discovery

How much time do you need to receive, analyze and respond to market events in today’s world of fragmented liquidity?

Even if you have the fastest links to exchanges and dark pools, the precious time is spent not in data delivery but in data analytics. What matters most to strategy decision time is the immediacy of pricing analytics, whether drawn from a single source or many.

In the trading world, understanding data is a game changer. Firms are awash in data across a market structure fragmented into dozens of sources and data types. In the equities markets there are 13 major exchanges in the U.S. alone and 20 across Europe and Asia providing trades and book depth. True price discovery – a true picture of volume and trading patterns – can only be achieved by analysis across markets. The old saying that the whole is greater than the sum of the parts fits well here. And of course the point is to find trading opportunities, which lie in the immediacy of pricing data; time is of the essence.

By dramatically reducing the time to understanding, we provide a time advantage to spread trading, pairs, reversion, smart routing and risk analysis. Increasing algo sophistication implies more calculations and more analysis – in real time. This need not be diametrically opposed to latency goals.

Analysis is not just about the here and now and the blending of multiple venues; it is also about understanding the past. History is how we got to now; time is just a continuum. Data analysis reveals unique observations and patterns and the possibility of predicting future values. Only by comparing current values to past activity can we detect unusual behavior or market abnormalities. Monitoring raw market data shows us prices; history gives us benchmarks.
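To make that idea concrete, here is a minimal sketch (in Python with pandas, purely for illustration) of history-as-benchmark: each day’s traded volume is compared to its own trailing 20-day average to flag abnormal activity. The synthetic data, window and threshold are invented assumptions, not recommendations.

```python
# Minimal sketch: history as the benchmark. Compare each day's traded volume
# to its own trailing 20-day average to flag unusually active days.
# Synthetic data, 20-day window and 1.5x threshold are illustrative only.
import numpy as np
import pandas as pd

days = pd.bdate_range("2013-01-02", periods=120)
volume = pd.Series(np.random.randint(8_000_000, 12_000_000, len(days)),
                   index=days, dtype=float)
volume.iloc[-1] *= 2.5                      # simulate one abnormally busy day

benchmark = volume.rolling(20).mean()       # the historical benchmark
ratio = volume / benchmark
unusual = ratio[ratio > 1.5]                # days trading 50%+ above their norm
print(unusual.round(2))
```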

Given that basic problem scope, the solutions divide into three domains: managing scale for data capture and storage; analytics; and visualization. Of course they are not mutually exclusive but highly intertwined.

For high-precision market analytics to deliver real business value, it requires an enterprise data management platform able to deliver capture and query performance as well as data quality – ensuring prices reflect cancels, corrections, splits, dividends and symbologies across markets. The data architecture must also be able to easily conflate order books across markets – this is critical to the discovery process.
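As a rough illustration of what conflating books across markets involves, the sketch below merges top-of-book quotes from two hypothetical venues into a consolidated best bid/offer. It assumes pandas DataFrames with already-normalized symbology and simple time/venue/bid/ask columns; a production feed handler is of course far more involved.

```python
# Rough sketch: conflate top-of-book quotes from multiple venues into a
# consolidated best bid/offer. Assumes each venue feed is a pandas DataFrame
# with time/venue/bid/ask columns already mapped to a common symbology.
import pandas as pd

def consolidated_bbo(*venue_quotes: pd.DataFrame) -> pd.DataFrame:
    """Carry each venue's latest quote forward in time, then take the highest
    bid and lowest ask across venues at every update."""
    book = pd.concat(venue_quotes).sort_values("time")
    bids = book.pivot_table(index="time", columns="venue", values="bid",
                            aggfunc="last").ffill()
    asks = book.pivot_table(index="time", columns="venue", values="ask",
                            aggfunc="last").ffill()
    return pd.DataFrame({"best_bid": bids.max(axis=1),
                         "best_ask": asks.min(axis=1)})

# Toy usage: two hypothetical venues quoting the same instrument.
nyse = pd.DataFrame({"time": pd.to_datetime(["2013-07-01 09:30:00.100",
                                             "2013-07-01 09:30:00.300"]),
                     "venue": "NYSE", "bid": [100.00, 100.02], "ask": [100.05, 100.06]})
bats = pd.DataFrame({"time": pd.to_datetime(["2013-07-01 09:30:00.200"]),
                     "venue": "BATS", "bid": [100.01], "ask": [100.04]})
print(consolidated_bbo(nyse, bats))
```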

The third domain is visualization: fashioning analytical metrics into a human-readable format. Visual display across deep time allows users to see things they were not aware of; it simplifies comprehension of the data and promotes understanding.

The terabyte volumes of market data and order activity can be easily consumed and processed by high-speed data management and analytics. However, the single easiest way for our brains to interpret large amounts of information, communicate trends and identify anomalies is to create visualizations against the distilled, filtered and smoothed content.
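A small sketch of that distill-then-visualize step, assuming a pandas Series of trade prices and matplotlib for display: raw ticks are resampled into one-minute bars and overlaid with a smoothed moving average. The synthetic data and window lengths are illustrative only.

```python
# Minimal sketch: distill raw ticks into a smoothed, human-readable picture.
# Assumes a pandas Series of trade prices with a DatetimeIndex; pandas,
# numpy and matplotlib are assumed to be available.
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd

idx = pd.date_range("2013-07-01 09:30", periods=5000, freq="s")
trades = pd.Series(100 + np.random.randn(5000).cumsum() * 0.01, index=idx)

bars = trades.resample("1min").ohlc()        # distill ticks into 1-minute bars
smooth = bars["close"].rolling(10).mean()    # smooth with a 10-bar moving average

ax = bars["close"].plot(label="1-min close", alpha=0.5)
smooth.plot(ax=ax, label="10-bar moving average")
ax.set_ylabel("price")
ax.legend()
plt.show()
```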

This webinar recording demonstrates these ideas by looking at the U.S. equity markets across all major venues, calculating metrics based on market behavior, and visually analyzing trading abnormalities such as order book imbalance.

The right analytical content at the right time, delivered to the right people, creates the competitive advantage everyone seeks.

Once again thanks for reading.
Louis Lovas

For an occasional opinion or commentary on technology in Capital Markets you can follow me on  twitter, here.

Posted in Algorithmic Trading, Analytics, Big Data, Complex Event Processing, Equities, Foreign Exchange, Futures and Options, OneMarketData, OneQuantData, OneTick, Tick database, Transaction Cost Analysis

Industry Survey, Momentum Builds for Cloud Services across Capital Markets

What is it about cloud computing that causes such a disturbance in the force? The seduction of low-cost yet massive compute power creates overwhelming interest. Yet fear of a too-good-to-be-true offer creates an emotional tug-of-war that is not for the faint-hearted.

Momentum is building fast within the trading and investment business to benefit from the $18B cloud computing industry. In a recent OneMarketData survey, respondents overwhelmingly look to avail themselves of all that the cloud offers. Over 77 percent expect to jump headlong into public and/or private cloud platforms in 2014 – a weighty figure, made even more significant against the backdrop of current usage, which is disappointingly low.

Yet, despite its perceived benefits, the capital markets industry has been slow to adopt cloud services or decide what, if any, trading-related functions to migrate to a cloud platform. That is changing as market participants look to find alpha anywhere they can. The allure of cloud computing is lower technology costs and the improved profitability that comes from more in-depth algo testing.

A key characteristic of the cloud is rapid elasticity, which offers self-service, scalable compute power unheard of before cloud technology. Such pay-as-you-go scalability defines a new archetype for large-scale model back-testing. The increasing importance of software testing is a major focus in the post-Knight Capital era, and 63 percent of the survey respondents picked it as the primary use for the cloud.

Yet with all the fervor toward increased adoption, there are still barriers ahead. Financial firms face the same angst as any commercial entity – navigating the risks of externalizing infrastructure and the inherent loss of control in doing so. But the technology-heavy trading and investment business invites unique challenges for cloud platforms that are not so easily solved – performance, for one.

Performance has a direct impact on profitability for all market participants, not just High-Frequency Traders (HFT). Processing speed has an immediate impact on trade decisions, and increasing algo sophistication implies more calculations, more analysis and more processing time. This need not be diametrically opposed to latency goals. Fast processing time offers advantages to spread trading, pairs, reversion, smart routing and risk analysis. It is the difference between winning and being just another also-ran.

A move towards cloud signals a fundamental shift in how we handle information. Financial firms will move first with data-heavy decision-support functions – model back-testing, quantitative research and transaction cost analysis. The days of cloud-as-a-fad are over. It’s a game-changer creating a major paradigm shift in business initiatives with its vast computational power, storage and a wide variety of application solutions at a lower cost structure.

Click here for the survey report on the cloud computing trends in capital markets.

Once again thanks for reading.
Louis Lovas

For an occasional opinion or commentary on technology in Capital Markets you can follow me on  twitter, here.

Posted in Algorithmic Trading, Analytics, Big Data, Complex Event Processing, Equities, Foreign Exchange, Futures and Options, HFT, High Frequency Trading, OneMarketData, OneTick, Tick database, Transaction Cost Analysis

Are Cloud Services Emerging as a Key Factor in Global Markets Trading?

Cloud computing means a lot of different things to different people and is one of the most over-hyped technologies in all of computerdom. There are public, private, hybrid, infrastructure, software and platform-oriented cloud solutions. Crafty software vendors masquerade as cloud providers, but hosting an app on rack-mounted hardware for a monthly fee does not constitute a cloud-based solution. Gartner identified no fewer than thirty-seven cloud-based services and technologies in its 2012 Hype Cycle. It’s no wonder the cloud evokes mind-numbing analysis paralysis once you dig below the surface.

Yet momentum builds. According to researcher International Data Corporation (IDC), the cloud-enabled Software as a Service (SaaS) market grew 26 percent to become an $18B market in 2012. Cloud infrastructure deployment is predicted to grow to $11 billion by the end of next year. Cloud services are creating a major paradigm shift in business initiatives and IT technology, as they offer the potential to leverage vast computational power, storage and a wide variety of application solutions. The choices, pros and cons abound as the financial services industry examines private and public clouds, cost factors, solution alternatives and the impact on existing business models. As for a definition, this one holds up pretty well.

Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.

Despite its perceived benefits, the capital markets industry has been slow to adopt cloud services or decide what, if any, trading-related functions to migrate to a cloud platform. Yet the motivation is mounting, or so it seems. Market participants are witnessing a new normal defined by thinning margins, increasing competition and diminishing volumes. Extracting alpha out of a dwindling pot has been their raison d’être. The desire to capture alpha anywhere it can be found has caused an uptick in transaction cost analysis (TCA) cost controls, intensified strategy back-testing and cross-asset, cross-border trading.

Rik Turner from the research firm Ovum recently published a progress report on cloud adoption, citing increases due to improvements in security and overall cost-consciousness. But there is a bewildering array of cloud attributes, many of which present challenges to market participants, and due diligence is necessary to understand them. The business functions of trading and asset management center on data management, and the cloud injects complications for data ownership, entitlements, security, compliance and regulatory policies.

A key characteristic of the cloud is rapid elasticity, which offers compute power unheard of before cloud technology. Such on-demand scalability defines a new archetype for large-scale model back-testing. The increasing importance of software testing, for both profitability and robustness, is a major focus in the post-Knight Capital era. The cloud’s pay-per-use access to hundreds, even thousands, of CPU cores affords in-depth testing and optimization techniques that were once impractical.
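To illustrate why elastic, pay-per-use cores map so naturally onto back-testing, here is a toy parameter sweep fanned out across whatever cores are available. The “strategy” is a placeholder returning an invented P&L; the point is only the embarrassingly parallel shape of the workload.

```python
# Toy sketch of an embarrassingly parallel parameter sweep, the shape of work
# that elastic cloud capacity suits well. The "strategy" below is a
# placeholder returning an invented P&L, not a real back-test.
from concurrent.futures import ProcessPoolExecutor
from itertools import product

def backtest(params):
    """Placeholder back-test: returns (params, simulated P&L)."""
    fast, slow = params
    pnl = -abs(fast - slow) + slow * 0.1     # stand-in for a real simulation
    return params, pnl

if __name__ == "__main__":
    grid = [(f, s) for f, s in product(range(5, 50, 5), range(20, 200, 20)) if f < s]
    with ProcessPoolExecutor() as pool:      # scales out with available cores
        results = list(pool.map(backtest, grid))
    best_params, best_pnl = max(results, key=lambda r: r[1])
    print("best parameters:", best_params, "P&L:", round(best_pnl, 2))
```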

A move towards cloud signals a fundamental shift in how we handle information, yet at what cost? This is where OneMarketData, the makers of OneTick, a time-series tick database and Complex Event Processing (CEP) technology, are asking market participants to weigh in. We have put together a short questionnaire to solicit industry input on cloud computing. Is it a fad or a game-changer?

Click here for the survey on the cloud computing trends in capital markets.

Once again thanks for reading.
Louis Lovas

For an occasional opinion or commentary on technology in Capital Markets you can follow me on  twitter, here.

 

Posted in Algorithmic Trading, Analytics, Big Data, Complex Event Processing, Equities, Foreign Exchange, Futures and Options, OneTick, Tick database

Information Delivery Finds its Mark in Social Media

In today’s information age, we often take for granted that breaking news stories from across the globe will be delivered to our doorstep in an instant. We have come to expect a deluge of information from radio, television, newspapers and social media to provide not just journalistic coverage of political, business and society’s notable events but also a diversity of viewpoints in social commentary and opinions from the multitudes.

But this was not always the case. It began in the early part of the twentieth century, when the machinery of information began to change the rules. Major events from the sinking of the Titanic to Lindbergh’s crossing of the Atlantic catapulted the “information hunger” to titanic proportions. Marconi’s telegraph and the rotary press, which made possible continuous printing, were game changers delivering breaking news on the calamitous and the momentous. These technical innovations not only fed the insatiable frenzy for information, they created it – a frenzy that has led to the familiar face on the evening news and the paparazzi.

Wind the clock ahead to the present day and the narrative behind information delivery has found its mark in social media. The information flow from social networks such as Twitter, Facebook, blogs and many others not only shows trending topics but very often deep insight from the minds of experts and the experienced on a specific business, market trends, products or services. This represents a significant paradigm shift in information’s delivery, consumption and value. While social media can be more noise than signal, the modern enterprise now views Twitter as a strategic weapon, a tool for predictive insight to feed business objectives from customer satisfaction to product refinement.

The Hash Crash can be defined as an inflection point, motivating a “sit up and take notice” attitude towards social media among financial market participants – how much influence does social media have over the markets and what value lies within to derive alpha?

OneMarketData’s Social Media survey looks to answer that and many other related questions on using social media as the fodder for predictive analysis.

Social Media Survey Takeaways

The twenty-question survey reveals attitudes towards social media ranging from fear and uncertainty to a belief in the untapped potential.

  • Traditional news sources take a back seat to social media outlets for news and information delivery. And while the journalistic narrative may be the breaking scoop, just as important is the meaning behind the story, as expressed in the vast collection of commentary and opinions.
  • Social media has reached widespread use as a vehicle for conveying news, information and insight. But on the flip side, acceptance of social media as a credible, reliable source of actionable information has not happened. Within the financial industry there is still skepticism that social media is more noise than signal – the fear of false positives and security breaches looms large. The Hash Crash was only the first major breach, itself a predictor of future malicious hacks.
  • Yet, there is also acceptance of the inevitable – the information flow of social media does influence financial markets. And that will catapult efforts to leverage this data source for actionable insight, incorporating it into trade model design, back-testing, optimization and benchmarks. Increased competition, thinning margins and risk concerns will drive a think-outside-the-box attitude towards social media. By and large the challenge is distilling content into sentiment indicators through text pattern analysis of positive and negative commentary (a crude sketch follows this list). This will take shape in the analysis of captured history and real-time consumption of the social media information flow, advancing data management and complex event processing (CEP) technologies.
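As a purely illustrative example of that distillation step, the sketch below scores a handful of invented messages with naive keyword matching. The word lists, messages and scoring scale are assumptions; production-grade sentiment analysis involves far richer text analytics.

```python
# Crude illustration: distill messages into a net sentiment score by counting
# positive vs. negative terms. Word lists and sample messages are invented;
# real sentiment analysis uses far richer text analytics.
POSITIVE = {"beat", "upgrade", "bullish", "surge", "strong"}
NEGATIVE = {"miss", "downgrade", "bearish", "plunge", "weak"}

def sentiment(text: str) -> int:
    words = set(text.lower().split())
    return len(words & POSITIVE) - len(words & NEGATIVE)

messages = [
    "Earnings beat estimates, outlook strong",
    "Analyst downgrade follows weak guidance",
]
score = sum(sentiment(m) for m in messages)
print("net sentiment:", score)   # > 0 leans bullish, < 0 bearish on this toy scale
```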

The Future of Regulatory Oversight

As social media becomes more accepted as a tool in our marketplace, regulators will be watching closely. That began with the Securities and Exchange Commission’s (SEC) endorsement of the use of social media outlets to distribute material, non-public corporate information. Somewhat embarrassingly, and in an unlikely coincidence for the SEC, the Hash Crash occurred just three weeks later. This malevolent breach of a trusted news source was notoriously bold, but it is also the catalyst for social media outlets to tighten security. Safeguarding the information flow is essential to its credibility, an obvious indication that social media outlets will be increasingly driven to regulate themselves, monitoring usage and potential abuse.

Yet influential market participants seeking the spotlight have made it clear that social media will be under increased regulatory scrutiny in the coming years. It’s not so much that highly regarded, authoritative figures like Carl Icahn post tweets announcing their position in a high-profile business such as Apple; it’s a matter of who’s listening. The sheer number of social media consumers, each with their own voice, defines far greater market-moving potential than anything that came before – something Marconi himself could never have imagined.

The survey report can be downloaded here.
A video Interview on TABB Forum is available here.

Once again thanks for reading.
Louis Lovas

For an occasional opinion or commentary on technology in Capital Markets you can follow me on  twitter, here.

Posted in Algorithmic Trading, Analytics, Big Data, Complex Event Processing, Equities, Foreign Exchange, HFT Regulation, OneMarketData, OneQuantData, OneTick, Tick database

Industry Survey on the Impact of Social Media on Financial Markets

Commentary on social media outlets has reached peak levels. This year marks the seventh anniversary of Twitter, and tweet volume now exceeds 400 million messages per day. Social media steps over traditional outlets to offer faster news delivery sprinkled with opinions, commentary and perspectives on business activity and political events. One glaring example was the public outrage over Rolling Stone’s now infamous choice for a cover photo. If Wenner Media, the publisher of Rolling Stone magazine, were a publicly traded company, its stock price would have plummeted following the social media berating heaped upon it. Who knows, it might have been known as the Bash Crash.

Social media has challenged traditional publications as the future of information dissemination. The information flow is not directly targeted at reporting a news story in the classic sense; it takes artistic license to portray an emotion, the global psyche expressed in the voices of millions.

Leveraging Twitter and other social media sources for economic and company information to gain actionable insight has begun to accelerate. The idea of trading based on non-traditional information is not new, but interest in using social media sources of economic and business information has grown beyond mere curiosity. A few pioneers made bold attempts to filter and trade on social media sentiment, yet only the unsuccessful make the headlines. The market impact following a fake tweet from a malicious hack into the Associated Press (AP) Twitter account bears witness to the increasing influence of social media on financial markets, providing evidence that social media is more than chatter about nothing.

Yet the unknowns are just how strongly social media impacts the financial markets and who is using it. The unabashed commentary suggests automated trading ran wild following the Hash Crash, causing the 145-point decline in the Dow that same day. Such accusations are wild speculation at best, unfounded by empirical evidence. It is just as likely that the human emotional response to the malicious act described in the fake tweet caused the market pull-back. Yet it is decidedly difficult to know.

Who better to answer these engaging questions than the market participants themselves, those actively seeking to discover and exploit new market inefficiencies? To that end, OneMarketData decided to solicit input from industry participants on their use of social media for trading and investment decision making. We are in the data management and analytics business for quantitative finance, so we have a vested interest in understanding how social media fits in. Through a question-and-answer survey we have compiled the resulting statistics into the collective wisdom of participants across the global financial centers. The report looks to answer the question, “Is Social Media a Disruptive Force in Financial Markets?”  You can download the survey report here.

Social media, especially Twitter, transcends the old news agency reporting model. It renders artistic license to portray individual opinions, perspectives and commentary on events across the globe. It is a ‘new frontier’ and we are at the cusp – the tip of the iceberg – of understanding the human state of mind on a global scale yet at an intimate level. The yet-to-be-answered question is whether this will translate into actionable business insight.

Combining or correlating social media with traditional market data analytics and quantitative techniques creates enormous potential for understanding the dynamic nature of market behavior, and is a huge step towards that elusive holy grail of accurate predictive modeling. At this point, the new frontier of social media is just a set of raw materials waiting to be assembled, yet the possibilities are intriguing.

The survey report can be downloaded here.

Once again thanks for reading.
Louis Lovas

For an occasional opinion or commentary on technology in Capital Markets you can follow me on  twitter, here.

Posted in Algorithmic Trading, Analytics, Big Data, Complex Event Processing, Equities, Foreign Exchange, Futures and Options, HFT, HFT Regulation, High Frequency Trading, OneMarketData, OneQuantData, OneTick, Tick database

Understanding Complex Event Processing, it’s all about the Data

Through the first half of this year equity markets have soared on a sugar high; market indices regularly hit new highs, as exhibited by the Dow’s 18 percent rise and the S&P 500’s 12 percent gain over the end of last year. The debate rages on how long this bullish market will continue. There are numerous factors that could make this year different from the past three, ranging from the continuation of central bank easing policy to improved economic conditions. How do we know this? It’s all in the data, as the major economic indicators and market indices are tracked, scrutinized and compared to past results.

Yet the undercurrent of the equity market’s exuberance is a continued downward trend in volumes and trader-loving volatility. U.S. equity trading volume across all the major exchanges has dropped around 7 percent so far this calendar year, 2013. NYSE’s volume composite index (MVOLNYE) has been on a slow slide reaching all the way back into last year, down nearly 10 percent year over year. And while the VIX spiked above 20 in June, overall it too is at a six-year low. Again, how do we know this? It’s all in the data – or, more specifically, the analysis of the data over time.

For the professional trader, volumes are a reflection of money flows; achieving margins hinges on total volume and a sprinkle of volatility, all while maintaining an accurate audit trail of trading activity. With the third anniversary of the Flash Crash just behind us comes the crush of compliance, with regulatory actions cascading from Dodd-Frank, the Consolidated Audit Trail (CAT) and the repercussions of Knight Capital’s mishap in the SEC’s recently proposed Reg SCI (Regulation Systems Compliance and Integrity). We live under a cloud of market uncertainty, regulatory oversight and increasing competition. It is a new normal, a fait accompli that is shaping the future and forcing firms to elevate their game. And how do we know this? It’s all in the data.

The new normal may represent a dearth of market activity, but it also makes it imperative that firms recognize data’s intrinsic value and its impact on the bottom line. Sluggish reactions to dynamic markets lead to business missteps that can unknowingly result in risk-laden exposure. The challenge of the new normal in financial markets is the motivation to think outside the box in the hunt for alpha.

The disruptive power of innovation

Amid the cacophony of the algorithmic trading narrative unfolds the story of Complex Event Processing (CEP), a new breed of technology and a tool for understanding data. And understanding data is a game changer – one where quality is critical.

Data management takes center stage in the trade lifecycle chain, from market research through live trading and post-trade (TCA) analysis. Market data, whether years of captured history or a live stream, has been and will continue to be a primary business driver. CEP becomes an enabler to drive better business decisions through better data management and analysis.

CEP is a story of the disruptive power of innovation, and a nice segue to understanding data, specifically the temporal analysis of time-series data. It excels at exacting data consistency from trades, quotes, order books, executions, even news and social sentiment – consistency that can instill trader confidence, helping to ensure profit and minimize risk.

With so many liquidity sources, having a consistent and uniform data model across fragmented markets enables effective analysis for trade model design, statistical pattern analysis and understanding order book dynamics. This spans real-time, historical and contextual content – practically speaking, it’s hard to separate them. The efficacy of CEP, while commonly understood to be real-time analytics, is wholly dependent on precedent established in historical data. This is based on the simple premise that the past can be a rational predictor of the future. It starts with an understanding of what a time series is.

In techie-speak, time series refers to data that has an associated time sequence, a natural ordering to its content such as rates, prices, curves, dividend schedules, index compositions and so on. Time-series data is often of very high velocity; the UTP Quote Data Feed (UQDF) provides continuous time-stamped quotations from 13 U.S. market centers, representing hundreds of terabytes annually. The data’s temporal ordering allows for distinct analysis, revealing unique observations and patterns and the possibility of predicting future values. Time series are often called data streams, which represent infinite sequences (i.e. computation that does not assume the data has an end), or simply real-time data, such as intra-day trades. CEP is a temporally-sensitive programming paradigm designed for calculating and extracting meaningful statistics that are unique to and dependent on the data’s temporal nature. This includes not just the notion of duration and windows of time, but also temporal matching logic of a fuzzy nature, such as matching trade prices to the nearest or prevailing quote.
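The “fuzzy” temporal matching mentioned above – joining each trade to the quote prevailing at its timestamp – can be sketched with an as-of join. The example below uses pandas purely for illustration; column names and values are assumptions.

```python
# Minimal sketch of a temporal "as-of" join: match each trade to the quote
# prevailing at (or just before) its timestamp. Uses pandas for illustration;
# column names and values are assumptions. Both frames must be time-sorted.
import pandas as pd

trades = pd.DataFrame({
    "time": pd.to_datetime(["2013-07-01 09:30:00.105", "2013-07-01 09:30:00.350"]),
    "price": [100.02, 100.04],
    "size": [200, 100],
})
quotes = pd.DataFrame({
    "time": pd.to_datetime(["2013-07-01 09:30:00.100", "2013-07-01 09:30:00.300"]),
    "bid": [100.00, 100.01],
    "ask": [100.03, 100.05],
})

matched = pd.merge_asof(trades, quotes, on="time", direction="backward")
matched["mid"] = (matched["bid"] + matched["ask"]) / 2
print(matched[["time", "price", "bid", "ask", "mid"]])
```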

Consider the scenario where there is a need to understand historic price volatility to determine accurate statistical thresholds of future price movements. It’s not simply a matter of determining price spikes but discerning when they occur, for how long and when a high (or low) threshold is crossed. It is CEP’s intrinsic sense of time that makes it uniquely suited to analyzing time series to achieve data consistency, the foundation for accurate trade decisions. Consistency is also about eliminating anomalous and spurious conditions – bad ticks, if you will. The trick is distinguishing a bad tick from a good one. Historical precedent, ranging from the last millisecond to the previous year, provides the benchmark for the norm and the means to recognize deviations. CEP’s analytical effectiveness is relative to the depth of the data set: the further back you look, the more confidence you can have going forward. Of course this assumes that the future behaves like the past. This is the basis for back-testing algorithmic trading models.
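A minimal sketch of that idea, assuming a pandas Series of tick prices: a print is rejected when it falls more than a few rolling standard deviations from the rolling median of the trailing window. The two-minute window and three-sigma tolerance are illustrative, not prescriptions.

```python
# Minimal sketch: use history to separate bad ticks from good ones. A print
# is rejected when it sits more than k rolling standard deviations from the
# rolling median of the trailing window. Window and k are illustrative.
import numpy as np
import pandas as pd

def clean_ticks(prices: pd.Series, window: str = "2min", k: float = 3.0) -> pd.Series:
    med = prices.rolling(window).median()
    std = prices.rolling(window).std()
    bad = (prices - med).abs() > k * std
    return prices.mask(bad)                  # bad ticks become NaN for later handling

idx = pd.date_range("2013-07-01 09:30", periods=600, freq="s")
prices = pd.Series(100 + np.random.randn(600).cumsum() * 0.005, index=idx)
prices.iloc[300] = 90.0                      # an obviously spurious print
print(int(clean_ticks(prices).isna().sum()), "ticks rejected")
```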

It’s all about the Data, all in good time

Data can be an ally for back-testing, simulation, valuation, compliance, benchmarking and numerous other business critical decisions. It is the fodder for understanding the global economy and the markets. The natural temporal ordering of time series data draws analysis distinct from any other and has given rise to a whole field of study and discourse. For understanding complex event processing, it’s all in the data.

A revision of this article first appeared in Futures Magazine, July 2013

Once again thanks for reading.
Louis Lovas

For an occasional opinion or commentary on technology in Capital Markets you can follow me on  twitter, here.

Posted in Algorithmic Trading, Analytics, Big Data, Complex Event Processing, Equities, Foreign Exchange, Futures and Options, HFT, HFT Regulation, High Frequency Trading, OneMarketData, OneTick, Tick database

Data management – Speed, Quality or Volume

Recently Getco announced two grim figures: net income fell 82 percent and revenue declined 41 percent. Even against the backdrop of the January Effect, which has created a sugar high for the S&P 500 – a winning streak just shy of its all-time high of 1565.15 set in 2007 – Getco’s losses are sobering news. And they coincide with the precipitous drop in stock-market volume coupled with erratic volatility. This streak of withering volumes and volatility has persisted through much of 2012 and is likely to continue well into 2013. It’s looking like (un)lucky 13 is a fait accompli. Just saying.

A new normal of uncertainty pervades the markets. The combination of economic turmoil and regulatory overhang may seem like the end of the world and the demise of proprietary trading. However, there will always be winners and losers. This new normal defines a benchmark, a barometer that will create a separation between those that choose to embrace the new reality and leverage the tools and technology to ensure their success, and those that will indict a scapegoat as they sink into oblivion.

Data takes center stage in conquering the new normal. An in-depth understanding of market behaviors and structure, and optimal price transparency, must be achieved for defining and calibrating trade models and knowing trade performance. A key to sustainable success is to learn what you do not know; successful companies constantly challenge assumptions, looking for validation. While the squeeze is on because of market uncertainty, the desire to survive will push firms to get a better handle on managing data and ensuring its quality. Here’s a short Q&A to that end.


Which is more important: to go faster, or to do more without getting slower?

What can be done in a microsecond?  It goes without saying that technology plays a critical role in the trade life cycle – the evolution of technology goes hand-in-hand with innovation in trading. The most important areas include:

  1. Fast access to market data. Quant strategy decision making is critically dependent on the immediacy of pricing data, often from multiple sources where, for example, order books need to be consolidated and possibly currency-converted (a sketch follows this list).
  2. Strategy execution performance. High-frequency strategies are typically first written in C++ for speed and then leverage a trade infrastructure, such as an analysis GUI working integrally with CEP. This provides the optimal combination for speed.
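On the first point, here is a toy sketch of the normalization step: quotes from venues quoting in different currencies are converted to a common currency before the best bid/offer is taken. Venues, prices and FX rates are all invented for illustration.

```python
# Toy sketch: convert quotes from venues quoting in different currencies into
# a common currency before taking the consolidated best bid/offer.
# Venues, prices and FX rates are invented for illustration.
FX_TO_USD = {"USD": 1.0, "EUR": 1.31, "JPY": 0.0102}   # illustrative rates

quotes = [
    {"venue": "ARCA", "ccy": "USD", "bid": 100.00, "ask": 100.05},
    {"venue": "XETRA", "ccy": "EUR", "bid": 76.34, "ask": 76.39},
]

def to_usd(q):
    rate = FX_TO_USD[q["ccy"]]
    return {**q, "bid": q["bid"] * rate, "ask": q["ask"] * rate, "ccy": "USD"}

normalized = [to_usd(q) for q in quotes]
best_bid = max(normalized, key=lambda q: q["bid"])
best_ask = min(normalized, key=lambda q: q["ask"])
print(f'consolidated BBO: {best_bid["bid"]:.2f} ({best_bid["venue"]}) x '
      f'{best_ask["ask"]:.2f} ({best_ask["venue"]})')
```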

In general, this question is rooted in what lies behind quantitative trading and what exactly quantitative analysts do. They apply an empirically tested, rules-based approach to understanding markets, exploiting inefficiencies manifested by human behavior, geo-political events and market structure. The end result is model creation, validation and optimization to prove profitability. Ultimately, quants look to get the most out of mathematical models, which are increasing in complexity, by leveraging sophisticated tools to design and back-test strategies. We are in the midst of a negative divergence in the equity markets, where prices continue to climb even as volumes and volatility fall. When will momentum shift and a reversal occur? The Holy Grail is really to outsmart the crowd, and that does not imply being faster but smarter. In devising those smarter strategies quants will demand access to deeper global history and use observations from the past to characterize relationships, understand behaviors and explain future developments. This is the next algo battleground.

Can a company succeed without fast, high quality data?

All trading and business decisions today are critically dependent on time-sensitive data quality. The scope of data analysis deals with extracting effective value from large data sets in a timely manner, for trade-related decision management including econometric trade modeling, risk and cost analysis.

Achieving alpha and discovering new investment opportunities demands sophistication in tooling, analytics and data management.

New ideas for measuring data quality.

The need for high-quality data cuts across all aspects of the trade lifecycle. Trade-related solutions – model back-testing, portfolio mark-to-market and compliance – depend on a high degree of data quality, where accuracy is vital to determining outcomes.

Tick data is often derived from many sources; there are 13 major exchanges in the U.S. alone. Across Asia are 15 different markets with different technologies, cost structures, regulations and cultures.  That’s a natural barrier to algorithmic trading and it creates unique challenges for efficiently trading across them – recognizing their differences in both market microstructure and regulation. The determinants of price discovery, volume, and trading patterns define the structure unique to a market, an asset class and geography influenced by participants and current regulation. The collection and analysis of data across the wide geography demands sophisticated tools and systems as firms look to discover the unique determinants of transaction costs, prices, and trading behavior to devise smarter trading models.

Data has to be scrubbed clean by applying reference data – cancels/corrections, corporate actions (splits, dividends) and exchange calendars. Crossing borders means global symbologies, requiring symbol maps and currency conversions.

For risk management, data quality goes one step further for accuracy – statistical scrubbing algorithms. Analytically cleansed prices ensure that chosen ‘snaps’ are within tolerance levels (e.g. 0.5 standard deviations of the last 2 minutes) when used for risk compliance and calculating yields.

Measuring data quality for TCA is about bringing together disparate data sources. TCA measures broker performance, identifies outliers and measures trades against benchmarks. It looks at intra-day execution efficiency by measuring and monitoring executions against those benchmarks – arrival price and market price. The ability to accurately derive these benchmark prices at the precise time of trade execution is a cornerstone of understanding trade performance. And it demands high-quality data, reliably captured and stored.
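A bare-bones sketch of those two benchmarks follows, with invented fills and an assumed arrival-price midpoint and interval VWAP; real TCA derives both from captured tick history at the precise execution times.

```python
# Bare-bones sketch of two TCA benchmarks: arrival-price slippage and
# slippage versus an interval VWAP. Fills, the arrival midpoint and the
# market VWAP are invented; real TCA derives them from captured tick history.
fills = [          # (price, shares) of child executions
    (100.04, 300),
    (100.06, 500),
    (100.05, 200),
]
arrival_mid = 100.02    # assumed quote midpoint when the parent order arrived
market_vwap = 100.045   # assumed interval VWAP from the consolidated tape

shares = sum(qty for _, qty in fills)
avg_px = sum(px * qty for px, qty in fills) / shares

slip_arrival_bps = (avg_px - arrival_mid) / arrival_mid * 1e4
slip_vwap_bps = (avg_px - market_vwap) / market_vwap * 1e4
print(f"avg fill {avg_px:.4f}: {slip_arrival_bps:+.1f} bps vs arrival, "
      f"{slip_vwap_bps:+.1f} bps vs VWAP")
```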

Will high latency or poor quality data affect your business?

Firms are capturing data to back-test new and evolving trade models in the never-ending hunt for alpha, for risk management to monitor and control exposure, and for transaction cost analysis to measure trade performance.

Market data comes in many shapes, sizes and encodings. It continually changes and requires corrections and the occasional tweak. The challenge of achieving timely data quality in this data dump is dealing with the vagaries of multiple data sources and managing a sea of reference data for:

  • Mapping ticker symbols across a global universe of exchanges and geographies
  • Tying indices to their constituents
  • Maintaining tick-level granularity
  • Ingesting cancellations and corrections
  • Applying corporate action price and symbol changes
  • Detecting gaps (missing data) in history – a simple check is sketched after this list
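On the last point, one simple quality check: flag inter-tick intervals far longer than is typical for the symbol. The synthetic data and 30-second threshold are assumptions for illustration.

```python
# One simple quality check: flag suspiciously long inter-tick intervals in
# captured intraday history. The synthetic data and the 30-second threshold
# are assumptions; liquid symbols warrant much tighter checks.
import numpy as np
import pandas as pd

idx = pd.date_range("2013-07-01 09:30", periods=1000, freq="s")
ticks = pd.Series(np.random.randn(1000), index=idx)
ticks = ticks.drop(ticks.index[400:460])     # simulate a one-minute outage

deltas = ticks.index.to_series().diff()
gaps = deltas[deltas > pd.Timedelta("30s")]
for gap_end, width in gaps.items():
    print(f"gap of {width} ending at {gap_end}")
```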

Any and all of these factors are vital to data quality and the science of quantitative trade modeling.  As Paul Rowady from the TABB Group mentions, “Reference data is like the glue that binds disparate data together. Without it, data can take no shape … In short, big data is dumb data without some glue”.

Effective market data management is about linking disparate data sets under some common thread to tease out an intelligible answer, a diamond from a mountain of coal. It’s about finding cross-asset correlations or understanding how best to hedge a position to offset risk; these are the underpinnings of profitable trade models, portfolio management and transaction cost analysis.

Low latency is the ante to the game in the trading business; it is evolutionary, not revolutionary. The evolution of technology goes hand-in-hand with trade innovation. Firms have to closely monitor latency; it is a tax everyone must pay to play. Good models and a deep understanding of the market through deep data research, coupled with infrastructure that does not constrain, are the goal.

Data management is the centerpiece of that goal touching all aspects of the trade lifecycle from quant research, model design/development/back-test and deployment to risk and cost analysis.

Uncertainty will create diversity as firms expand their business to be multi-asset and multi-broker. In doing so they need to improve their visibility and manage trade costs through TCA solutions that accurately measure costs so orders can be executed more efficiently – profiling trader and broker behavior, looking at intra-day execution efficiency and highlighting the impact of delays on strategy performance.

The quality of data ultimately defines an end game, that Holy Grail for sustaining the profitability of the business. Firms should not lose sight of that.

Once again thanks for reading.
Louis Lovas

For an occasional opinion or commentary on technology in Capital Markets you can follow me on  twitter, here.

Posted in Algorithmic Trading, Analytics, Big Data, Complex Event Processing, Equities, Foreign Exchange, Futures and Options, HFT, HFT Regulation, High Frequency Trading, OneTick, Tick database

The Fear and Wisdom of Convergence – Cloud, Big Data and Low Latency

Convergence, according to Wikipedia, is the merger of previously distinct technologies into a new form, requiring new theories, new products and new practices. Cloud computing, big data and low-latency technologies have begun to meld together within the financial industry.

Increasing competition and thinning margins are pushing the technology envelope in the hunt for alpha. This has manifested itself on many fronts, from increasing sophistication in the tools used to search for alpha, control costs and manage risk, to the confluence of the underlying infrastructure. The key enablers are big data and cloud deployments, where low latency is the ante to play the game.

I recently attended the A-Team Group’s Low-Latency Summit in New York where I was a panelist discussing this very topic. Along with an esteemed group of individuals from NYSE Euronext, Intel, Gnodal and others, we delved into the character of this evolving convergence.

Competitive advantages come from understanding data better

Trading firms operate in a fiercely competitive industry where success is measured by profit. They are constantly hunting for talent and technology to achieve that goal. Yet firms are ever threatened by fierce competition and the need to control costs. The side effect is increasing demand for deep data over longer time periods across a multiplicity of markets. This data dump is the fuel feeding automation technology and centers on two points:


1)  Managing scale – As firms look to capture and store more and more data from many sources across many asset classes it places enormous demands on IT infrastructure. The improvements in compute power and storage per dollar make the consumption both technically and economically possible. Cloud deployments provide advantages in managing the scale through higher levels of data protection and fault tolerance at a cost savings.


2) A solutions focus – This big data dump is the fuel that drives the engine across the entire trade life cycle. It begins with alpha discovery, moves to trading algorithm design, development and back-testing, and also includes cost and risk management.


It’s a well-known fact that equity volumes are falling across North American and Asian markets. That translates to thinning margins – and, paradoxically, to increased use of algorithms as firms chase a diminishing pot and hunt across asset classes for alpha. Algorithms, essential to the sustainability of the business, are born out of the mathematical ingenuity of quants and developers, and their complexity is accelerating. This has led to increased demand for data – across equities, futures and the options fire-hose – for model design and back-testing. And that increased demand feeds right back into managing scale.

Fear of the Cloud

The widespread adoption of cloud computing in commercial industry is well known and the benefits are many. But within capital markets, typically a leader in the adoption of new technology, there has been lackluster enthusiasm for the cloud – more noticeably in Europe and Asia than in the U.S. Our panel briefly explored this seemingly puzzling phenomenon. It can likely be summed up in a few barriers to adoption:

  1. Security is probably the biggest and most ubiquitous fear. In terms of data confidentiality, market data is a common, sharable resource but the analysis and algorithms are personal and private. Nevertheless, private cloud-based operating models are currently a better first choice than public or hybrid clouds. A recent InfoWeek article claims that clouds are more secure than corporate data centers.
  2. Performance – concerns about CPU sharing when cloud compute power is virtualized. What does that mean when a trading firm demands predictable latency?
  3. Regulatory and compliance concerns. Compliance regulations require that data not be intermixed with other data, such as on shared servers or databases.

These points are valid, but also debatable with well-founded rebuttals. In a private cloud, bare metal deployments can offset VM concerns. But as is often said, perception is reality, so until these perceived fears are overcome adoption will march slowly forward.

Converging on the future

The technical improvements of IT infrastructure – storage cost per dollar and multi-core compute power – have created the economic feasibility underpinning advancements in cloud and big data storage architectures. Commoditization has always applied to infrastructure; as cloud and big data converge, it is now possible to apply that ubiquitous notion to market data. Yet data’s business value will always remain, and will only get higher as firms add a personal touch through new algorithms.

Once again thanks for reading.
Louis Lovas

For an occasional opinion or commentary on technology in Capital Markets you can follow me on  twitter, here.

Posted in Algorithmic Trading, Analytics, Big Data, Complex Event Processing, Equities, Foreign Exchange, Futures and Options, HFT, OneTick, Tick database

Trading Smarter in Asia

I’ve just returned from my first trip to Singapore, speaking at TradeTech Asia and visiting with a handful of trading firms. I was duly impressed. Asia-Pacific is undergoing a sea change. The region’s penchant for all things ‘high-tech’ is a focal point for growth as it becomes a leader in the global economy. The Asian trading landscape is on the move, creating improvements in networking infrastructure and technology. Regional exchanges such as Singapore Exchange Ltd. (SGX) have deployed expanded services that can process transactions in 90 microseconds, making SGX’s the fastest matching engine in the world. The use of algorithms across Asia is expected to hit 26 percent within three years, and on the eve of Chi-X Australia’s first anniversary it grabbed nearly 9 percent of the Australian equities market, fostering a new era of fragmentation.

Firms have to be ready for the coming paradigm shift and rise above the fear that comes from change. Falling volumes pervade the region, arguably forcing the use of algorithms as firms chase a diminishing pot. The thinner volumes make an already arduous search for fairly priced liquidity even harder for institutional traders. When volume is low, no one can be sure whether the bid/offer prices represent real market value. Lower numbers of transactions massively affect volatility, since one large transaction can have a disproportionate impact on all participants. In such depleted markets firms use every available means in the hunt for alpha. Greg Lee, head of trading for Asia at Deutsche Bank, recently noted in Automated Trader, “Market participants are always looking for smarter ways to trade; alternative liquidity, blocks, smarter algorithms all come into play”.

Asia is spread out over a broad geographic and political expanse where each market is unique. The challenge is efficiently trading across them recognizing their differences in both market microstructure and regulation. The determinants of price discovery, volume, and trading patterns define the structure unique to a market, an asset class and geography influenced by participants and current regulation.  Trading smarter in the Asia-Pacific region begins with that recognition. The place is simply different from the west.

Trading smarter is a convergence of technology decisions – data management, build versus buy – and an understanding of key solutions for alpha, risk management and cost control.

Download the whitepaper Four Things to Consider to Trade Smarter in Asia.

Once again thanks for reading.
Louis Lovas

For an occasional opinion or commentary on technology in Capital Markets you can follow me on  twitter, here.

Posted in Algorithmic Trading, Analytics, Big Data, Complex Event Processing, Equities, Foreign Exchange, HFT

OneTick Awarded Best Big Data Provider, Buy-Side Technology

Big Data is a trendy new catchphrase in business and skyrocketed to critical mass in 2012. Google Trends indicates there has been an explosion in search terms such as ‘big data analytics’. The Big Data rocket is a bomb that explodes neatly to deliver next-generation business value. Big Data represents a formidable force in business today; Gartner expects big data to reach $34 billion in 2013 IT spend.

For the financial industry, a few key factors distinguish big data from other industries. To derive true business significance, it’s important to understand that it centers on time-sensitive data quality. Extracting business benefit from large data sets in a timely manner serves trade-related decision management, including econometric trade modeling, risk and cost analysis. Determining accurate measurements highlights the extent of large data sets, inclusive of structured tick data from disparate sources and across asset classes, and the related reference information. Steve Hamby, CTO of Orbis Technologies, describes the Five C’s of Big Data. The one most unique and relevant to finance is Celerity, referring to the high rate at which data is consumed and its time-sensitive relevance to business decisions. Critical to strategy decision time is the immediacy of pricing data, often from multiple sources where books need to be consolidated and possibly currency-converted. Likewise, accurate and reliable price analytics for measuring trade performance, otherwise known as transaction cost analysis (TCA) – average trade price, VWAP and arrival price – are only possible with high-quality data scrubbed clean by applying corporate actions, cancellations and corrections. The ability to accurately derive benchmark prices at the precise time of trade execution is a cornerstone of understanding trade performance.

Another of the C’s Steve mentions is Confidence. Firms demand confidence in the analytics derived from big data – those used in determining the profitability profile of new models, measuring trade performance and analyzing portfolio risk. For this, firms depend on precise, clean market data across all their tradable markets and an accurate recording of order history.

OneMarketData was recently named Best Big Data Provider at the Buy-Side Technology Awards 2012 for its OneTick product. OneTick is an enterprise solution capable of capturing, storing and analyzing big data across any asset class, derived analytic or news event – even the massive Options OPRA feed, that 4-million-message-per-second fire hose. OneTick is a focused solution for financial big data, providing the scalable database, the analytic functions and the user tools to uncover the narrative in the big data dump. The ability to tell a story with the data is what elevates it over raw hardware storage and computational power. The story is germane to an industry; it creates relevance and monetizes the data for a business. Mark Beyer, research vice president at Gartner, states, “despite the hype, big data is not a distinct, stand-alone market, but it represents an industry-wide market force which must be addressed in products, practices and solution delivery”. Business is not aiming for do-it-yourself (DIY) big data solutions. Firms don’t want to be pioneers with vendors either; the competitive pressures demand fit-for-purpose solutions for buy-side and sell-side firms alike. OneTick delivers on that story, as many OneMarketData customers have already discovered. The ideal case is to view historical and real-time activity as a single time continuum: what happened yesterday, last week or last year is simply an extension of what is occurring today and what may occur in the future. This is where the melding of a time-series ‘tick’ database and complex event processing (CEP), coupled with focused financial analytics, creates a differentiating story.

Across the world’s financial centers, countless firms ranging from marketplace data providers like SIX Financial, sell-side institutions like Société Générale, Invesco Asset Management and quantitative trading firms like Tyler Capital have recognized the broad and in-depth value of OneTick. Whether researching market microstructure or trade performance, devising new quantitative trade models across the varied landscape of today’s cross-border/cross-asset environment, or working in the different aspects of high-frequency, stochastic and fragmented markets, OneTick is the complete solution for optimizing trading decisions and discovering that diamond within the big data mountain.

Once again thanks for reading.
Louis Lovas

For an occasional opinion or commentary on technology in Capital Markets you can follow me on  twitter, here.

Posted in Algorithmic Trading, Analytics, Big Data, Complex Event Processing, Equities, OneMarketData, OneTick, Tick database