Sage advice is in ample supply when it comes to trends, whether in politics, fashion or business. You often hear “don’t follow trends, start them” or “don’t yield to trends and fads”. Heeding such counsel is of course prudent. Yet sifting through the endless stream of news, company announcements, marketing messages, acquisitions, regulatory changes and political posturing to draw any conclusion about where an industry is heading is like gazing into a crystal ball. Distinguishing sound business trends from mere fashion is vital to gauging the macro-level pulse of an industry.
Trends are sometimes driven by a bottom-up groundswell, while others have a triggering condition such as a regulatory change. Being a trendsetter might be a fashionable position to attain, but for the pragmatist the ideal is to spot emerging opportunities early enough to capitalize on them. Recently I’ve noticed a number of recurring themes, coming changes driving new opportunities in Capital Markets. I’ve outlined a handful of them below.
1) The SEC’s 15c3-5 Market Access Rule and CFTC’s advisory recommendations for DMA will rekindle the latency wars.
Just around the corner looms the SEC’s Market Access Rule banning naked access. No longer will participating firms have unfettered direct market access while brokers (virtually) look the other way as orders flow into the market with few or no checks and balances. I’m sure it was a profitable enterprise for the broker community to offer this direct channel to those willing to pay a little extra.
Brokers compete on the range of services they offer. They attract clients’ order flow by offering better fill rates, better prices, increased liquidity, and so on. The SEC’s rule 15c3-5, which mandates pre-trade risk checks, does not really inhibit the level of service brokers can provide, but it does ensure everyone pays a latency tax for the credit-limit and order-constraint (quantity and price) checks brokers must enforce.
As a result, a groundswell is occurring. Pre-trade risk is fast becoming the next latency battleground. While some are scrambling, others such as Morgan Stanley are announcing microsecond-level latencies. I am sure others will follow with revamped pre-trade risk modules as they leverage multi-core hardware to achieve parallelism in their architectures. A renewed emphasis on FPGA-based hardware acceleration has also surfaced. FPGA technology has been readily available for a number of years; its success has primarily come in appliance-oriented products for ticker plants and messaging, such as Exegy and Solace, where it is an embedded component. An Aite report on Capital Markets technology spending puts FPGA low on the list of IT spend for infrastructure investments. I think this is primarily due to the fear, uncertainty and doubt surrounding the direct use of non-commodity hardware. From an IT manager’s viewpoint, a series of difficult questions arises regarding FPGA: “complex, non-standard development, handling long-term maintenance, support, diagnosing failures”, plus a lack of experienced talent to hire. Challenging questions, and likely the reason for its lackluster uptake.
Yet the opportunity is laid bare for the enterprising vendor: pre-trade risk can be modularized as a software/hardware appliance combo, providing the same appliance-like packaging that has made Exegy and Solace successful in a commodity-driven world.
The SEC’s 15c3-5 Market Access Rule is just the tip of the iceberg. In the commodities markets, the CFTC advisory committee has recommended pre-trade risk regulation extending from trading firm to broker to exchange, covering credit limit checks, fat-finger checks (quantity limits/price collars), execution throttles, and position and PnL limits.
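To make those checks concrete, here is a minimal sketch of the kind of pre-trade gate the rules contemplate: a fat-finger quantity limit, a price collar around the last trade, and a running credit-limit check. The limit values and function names are purely illustrative (no real broker or exchange API is implied); a production gate is where the latency battle described above is actually fought.

```python
from dataclasses import dataclass

@dataclass
class Order:
    symbol: str
    side: str       # "buy" or "sell"
    qty: int
    price: float

# Illustrative thresholds only; real values come from a broker's risk policy
MAX_ORDER_QTY = 10_000       # fat-finger quantity limit
PRICE_COLLAR_PCT = 0.05      # reject orders more than 5% from last trade
CREDIT_LIMIT = 1_000_000.0   # max open notional exposure per client

def pre_trade_checks(order: Order, last_price: float,
                     open_exposure: float) -> tuple[bool, str]:
    """Run the order through the three gates; return (accepted, reason)."""
    if order.qty <= 0 or order.qty > MAX_ORDER_QTY:
        return False, "fat-finger quantity check failed"
    low = last_price * (1 - PRICE_COLLAR_PCT)
    high = last_price * (1 + PRICE_COLLAR_PCT)
    if not (low <= order.price <= high):
        return False, "price collar check failed"
    if open_exposure + order.qty * order.price > CREDIT_LIMIT:
        return False, "credit limit check failed"
    return True, "accepted"
```

Each gate is a handful of comparisons, which is exactly why the race is about shaving the microseconds around them, whether in software or in an FPGA pipeline.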
While the latency wars have always cut across many components of trading infrastructure, from market data and messaging backbones to the algos themselves, regulation has triggered a laser-focused emphasis on pre-trade risk. We’ll continue to see new and exciting developments in this area.
2) A desire to service the burgeoning Hedge Fund industry with more complete end-to-end solutions will drive more consolidation
The Hedge Fund business is booming. The statistics show that in 2010 alone, over four hundred new funds opened their doors. I guess you could say it’s fashionable to start a hedge fund these days. But there are some valid reasons why so many are appearing. One in particular is the Volcker Rule, the edict in Dodd-Frank that limits proprietary trading. As a result, numerous spin-off firms, usually led by a star trader, are entering the fray. These market geniuses generally have little trouble raising capital for their shiny new firms. Yet those same statistics show that over three hundred funds closed their doors in 2010. So one has to ask: why?
The dynamics and scope of running a successful fund extend far beyond the prop trading desk at any bank. To replicate the stardom they achieved in the past, managers carry a dual burden: satisfying a client base and the investment capital it provides, plus the added chore of choosing and successfully deploying a wide range of technology to enable that success. Technology plays a critical role in the trade lifecycle, and its evolution goes hand-in-hand with innovations in trading. Unfortunately the statistics bear out that firms are ill equipped to rise to this challenge. While most are excellent money managers, few have the depth of skills in technology, or recognize the need to hire appropriately to bring the two together: hedge fund guys ≠ software guys.
Yet every failure is an opportunity in disguise, and this has started to occur. The solution to this technology-soaked problem is not to train money managers in all its vastness, but to provide packaged one-stop solutions that mediate the complexities, covering all aspects of the trade lifecycle from quantitative research and algo development/deployment through risk to post-trade clearing, settlement and TCA. A solution ‘stack’, so to speak. Such Hedge Fund ‘in-a-box’ solutions allow those expert money managers to continue to fine-tune their expertise with client investors and capital. A fund’s team of quants can likewise focus on their core competency: alpha discovery.
Bloomberg recognized this opportunity not long ago, announcing HBOX, its Hedge Fund Tool Box. More recently, Wedbush acquired Lime Brokerage, consolidating execution, market data, pre-trade risk and clearing services. Enterprising vendors are bringing complementary technologies together as ‘stacked’ solutions, offered with deployment choices including on-premise for those funds still wanting to manage the infrastructure, or hosted on a (fashionable) cloud network. I would expect to see more M&A announcements as vendors recognize the opportunity in consolidation.
3) Increased emphasis on managing risk will drive demand for serviceable historical data.
“History is a Source of Strength”, a simple yet profound comment from author and historian David McCullough. As a historian, Mr. McCullough was speaking of the historical and social sciences, and how a deep understanding of them can provide the guidance, wisdom and confidence to lead into the future. Portfolio managers and quant researchers also look to history, in the form of historical market and fundamental data, to instill a sense of confidence in their decisions as they balance portfolios and evaluate and backtest quant models.
With the razor-thin margins of HFT in jeopardy due to lower volumes, looming regulation and competitive strategies from participants like Deutsche Bank with its new Stealth execution algo, there is an uptick in the use of quantitative models. A fundamental requirement behind designing, building and testing quant models is deep history. Paul Rowady of the TABB Group describes it this way: “… ultra-high speed trading in the US has reached a saturation point, the financial markets are entering a new era of quantitative research, strategy development…”
Firms will want to ensure the robustness and correctness of their quant strategies. Backtesting with accurate, clean historical market data imbues that sense of confidence, helping ensure the next rogue algo is not their own. Backtesting can validate trading ideas, catch coding errors, measure exposure and slippage, determine conformance to new regulations and judge market impact, in both the micro and macro sense. Yet to do so requires ready access to volumes of historical market data and reference information (i.e. corporate actions) to ensure price/volume accuracy through time. Portfolio managers likewise leverage historical data to analyze incremental risk, looking for correlations between asset classes, performing VaR calculations and measuring overall exposure.
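As a toy illustration of one such history-hungry calculation, the sketch below computes one-day historical-simulation VaR by reading a quantile off an empirical return series. The return figures are invented for illustration, not real market data; the point is simply that the quality of the answer is only as good as the history behind it.

```python
import math

def historical_var(returns: list[float], confidence: float = 0.95) -> float:
    """One-day historical-simulation VaR: the loss (as a positive number)
    not exceeded with the given confidence, read off the empirical
    distribution of past returns."""
    losses = sorted(-r for r in returns)           # losses, ascending
    idx = math.ceil(confidence * len(losses)) - 1  # empirical quantile index
    return losses[idx]

# Invented daily returns for illustration only
rets = [0.01, -0.02, 0.005, -0.015, 0.02, -0.03, 0.0, 0.01, -0.01, 0.004]
var95 = historical_var(rets, 0.95)   # 0.03, i.e. a 3% one-day loss
```

With only ten observations the 95% quantile is just the worst day in the sample, which is exactly why researchers want years of clean, corporate-action-adjusted history rather than a sliver of it.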
Demand for historical data spans tick-by-tick data, closing prices, reference data/corporate actions and fundamentals. According to a recent Aite report on IT spend in Capital Markets, reference data will get significantly increased funding in 2011. Yet there are significant challenges: most firms end up spending incalculable amounts of time processing and organizing data simply to make it serviceable for their backtesting and risk analysis needs. As Paul Rowady says, “… the biggest challenges facing quantitative researchers are data management and the need for a single, unified, storage solution capable of meeting future requirements.” The opportunity is laid bare for the enterprising soul.
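Making history “serviceable” often comes down to exactly this kind of grunt work, for example back-adjusting a price series for stock splits so prices are comparable through time. A minimal sketch, with hypothetical dates and ratios:

```python
def adjust_for_splits(prices, splits):
    """Back-adjust closes for stock splits so the series is comparable
    through time. `prices` is a chronological list of (date, close);
    `splits` maps a date to its ratio (2.0 for a 2-for-1 split taking
    effect at that date's open). Dates and ratios are hypothetical."""
    factor = 1.0
    adjusted = []
    for date, close in reversed(prices):   # walk newest to oldest
        adjusted.append((date, close / factor))
        if date in splits:                 # all earlier closes get divided
            factor *= splits[date]
    return list(reversed(adjusted))

# A 2-for-1 split effective 2011-03-01 halves all earlier closes
series = [("2011-02-27", 100.0), ("2011-02-28", 102.0), ("2011-03-01", 51.0)]
adjusted = adjust_for_splits(series, {"2011-03-01": 2.0})
# adjusted -> [("2011-02-27", 50.0), ("2011-02-28", 51.0), ("2011-03-01", 51.0)]
```

Trivial for one split on one symbol; multiplied across thousands of symbols, dividends, mergers and symbol changes, it is the data-management burden Rowady describes.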
Once again thanks for reading.
For an occasional opinion or commentary on technology in Capital Markets you can follow me on twitter, here.