The Fear and Wisdom of Convergence – Cloud, Big Data and Low Latency

Convergence, according to Wikipedia, is the merger of previously distinct technologies into a new form, requiring new theories, new products, and new practices. Within the financial industry, cloud computing, big data and low-latency technologies have begun to meld into just such a form.

Increasing competition and thinning margins are pushing the technology envelope in the hunt for alpha. This manifests itself on many fronts, from more sophisticated tools for finding alpha, to controlling costs and managing risk, to the confluence of the underlying infrastructure. Big data and cloud deployments are the key enablers, and low latency is the ante to play the game.

I recently attended the A-Team Group’s Low-Latency Summit in New York, where I was a panelist discussing this very topic. Along with an esteemed group of individuals from NYSE Euronext, Intel, Gnodal and others, we delved into the character of this evolving convergence.

Competitive advantages come from understanding data better

Trading firms operate in a fiercely competitive industry where success is measured by profit, and they are constantly hunting for the talent and technology to achieve it. Yet they must do so under constant competitive threat while controlling costs. The side effect is an increasing demand for deep data over longer time periods across a multiplicity of markets. That data is the fuel feeding trading automation, and exploiting it centers on two points:


1) Managing scale – As firms look to capture and store more and more data, from many sources and across many asset classes, they place enormous demands on IT infrastructure. Improvements in compute power and storage per dollar make that consumption both technically and economically possible, and cloud deployments help manage the scale, offering higher levels of data protection and fault tolerance at a cost savings. A minimal sketch of one common storage pattern follows this list.


2) A solutions focus – That big data store drives the engine across the entire trade life cycle. It begins with alpha discovery, moves to trading-algorithm design, development and back-testing, and extends to cost and risk management.
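On the managing-scale point above, one widely used pattern is to partition captured ticks by date and symbol, so back-tests scan only the slices they need. Below is a minimal sketch, assuming pandas with pyarrow is available; the store_ticks helper, the tick_store path and the field names are hypothetical, for illustration only.

```python
# A minimal sketch of date/symbol-partitioned tick storage, assuming
# pandas and pyarrow are installed; field names are hypothetical.
import pandas as pd

def store_ticks(ticks: pd.DataFrame, root: str = "tick_store") -> None:
    """Append a batch of ticks to a Parquet dataset partitioned by
    trade date and symbol, so queries read only the slices they need."""
    ticks = ticks.copy()
    ticks["date"] = ticks["timestamp"].dt.date.astype(str)
    ticks.to_parquet(root, partition_cols=["date", "symbol"])

# Example: a tiny batch of trades across two symbols.
batch = pd.DataFrame({
    "timestamp": pd.to_datetime(["2012-10-01 09:30:00.001",
                                 "2012-10-01 09:30:00.002"]),
    "symbol": ["AAPL", "ES"],
    "price": [665.15, 1440.25],
    "size": [100, 5],
})
store_ticks(batch)
```

Partitioning this way keeps per-query I/O roughly proportional to the date range and symbols requested, which is what makes multi-year, multi-asset back-tests economically feasible.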


Equity volumes are falling across North American and Asian markets. That translates to thinning margins and, paradoxically, to increased use of algorithms, as firms chase a diminishing pot and hunt across asset classes for alpha. Algorithms, essential to the sustainability of the business, are born of the mathematical ingenuity of quants and developers, and their complexity is accelerating. This has led to increased demand for data – across equities, futures and the options fire-hose – for model design and back-testing, a demand that feeds right back into managing scale. A minimal back-test sketch follows.
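To make the back-testing step concrete, here is a minimal sketch of a moving-average crossover test over historical daily prices. It assumes prices arrive as a pandas Series; the rule, the parameters and the backtest_ma_cross name are illustrative, not any particular firm's strategy.

```python
# A minimal back-test sketch, assuming daily closing prices in a
# pandas Series; the crossover rule and parameters are illustrative.
import pandas as pd

def backtest_ma_cross(prices: pd.Series, fast: int = 10, slow: int = 50) -> pd.Series:
    """Go long when the fast moving average is above the slow one;
    return the strategy's daily return series."""
    fast_ma = prices.rolling(fast).mean()
    slow_ma = prices.rolling(slow).mean()
    # Trade on the next bar to avoid look-ahead bias.
    position = (fast_ma > slow_ma).astype(int).shift(1).fillna(0)
    daily_returns = prices.pct_change().fillna(0)
    return position * daily_returns

# pnl = backtest_ma_cross(close_prices)  # close_prices: your historical data
# print((1 + pnl).prod() - 1)            # cumulative strategy return
```

Even this toy example shows why data demand compounds: every added market, parameter sweep or asset class multiplies the history the back-test must consume.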

Fear of the Cloud

The widespread adoption of cloud computing across commercial industry is well known, and its benefits are many. But within capital markets, typically a leader in adopting new technology, enthusiasm for the cloud has been lackluster – more noticeably in Europe and Asia than in the U.S. Our panel briefly explored this seemingly puzzling phenomenon. It can likely be summed up in a few barriers to adoption:

  1. Security is probably the biggest and most ubiquitous fear. In terms of data confidentiality, market data is a common, sharable resource, but the analysis and algorithms built on it are personal and private. Consequently, private cloud operating models are currently a better first choice than public or hybrid clouds – even though a recent InfoWeek article claims that clouds are more secure than corporate data centers.
  2. Performance. CPU sharing is a concern when cloud compute power is virtualized. What does that mean for a trading firm that demands predictable latency? (A measurement sketch follows below.)
  3. Regulatory and compliance. Some regulations require that a firm’s data not be intermixed with others’ data, for example on shared servers or databases.

These points are valid but debatable, with well-founded rebuttals. In a private cloud, for example, bare-metal deployments can offset virtualization concerns. But as is often said, perception is reality; until these perceived fears are overcome, adoption will march slowly forward.
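One way to move the latency debate from perception to measurement is to profile timer jitter on both platforms. The rough sketch below (standard-library Python 3.7+, illustrative only) samples a high-resolution clock in a tight loop and prints tail percentiles; run it on bare metal and on a VM side by side to see how much predictability virtualization actually costs.

```python
# A rough sketch for quantifying the "predictable latency" concern:
# sample a high-resolution timer in a tight loop, then inspect the
# tail of the inter-sample deltas. Compare bare metal vs. a VM.
import time

def jitter_profile(samples: int = 1_000_000) -> None:
    deltas = []
    prev = time.perf_counter_ns()
    for _ in range(samples):
        now = time.perf_counter_ns()
        deltas.append(now - prev)
        prev = now
    deltas.sort()
    # Median tells you the typical cost; the far tail tells you
    # how badly scheduling noise can intrude.
    for pct in (0.50, 0.99, 0.9999):
        print(f"p{pct * 100:g}: {deltas[int(pct * (samples - 1))]} ns")

jitter_profile()
```

For a trading workload the p99.99 tail matters far more than the median; a VM that looks fine on average can still be disqualifying at the tail.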

Converging on the future

Technical improvements in IT infrastructure – more storage per dollar and multi-core compute power – have created the economic feasibility underpinning cloud and big data storage architectures. Commoditization has always applied to infrastructure; as cloud and big data converge, that same ubiquitous notion can now be applied to market data itself. Yet data’s business value will remain, and will only rise, as firms add a personal touch through new algorithms.

Once again thanks for reading.
Louis Lovas

For an occasional opinion or commentary on technology in Capital Markets you can follow me on Twitter, here.

About Louis Lovas

Director of Solutions, OneMarketData, with over 20 years of experience in developing cutting-edge solutions and a leading voice in technology and trends in the Capital Markets industry.