The evolution of OneTickCloud

OneTickCloud is the industry’s premier solution for on-demand access to global tick data and analytics. Deployed in a secure data center, OneTickCloud provides firms the ability to aggregate, normalize and analyze large volumes of data, including Morningstar global tick data, using OneMarketData’s market-leading enterprise data management software, OneTick.

OneTick Cloud is a managed service delivering normalized, cleansed exchange and OTC data and analytics to support backtesting, algo development, transaction cost analysis, technical studies and charting applications.

OneTick Cloud is a securely hosted service providing managed data and analytics across global equities and futures tick history, with on-demand analytics tools for creating custom datasets.

OneTick provides 10 years of normalized tick data for US markets, 45 years of end-of-day data for US equities, 8 years of corporate actions data, and a trove of real-time and historical data from over 120 global markets.

OneTickCloud is accessible over the internet with JSON- or CSV-formatted extracts. Access the data by FTP, by Web API or through the OneTick GUI. It offers a subset of the most powerful data analytics available in our OneTick software platform.
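
The post does not spell out the request format, but conceptually a Web API extract is just an authenticated HTTP call that returns CSV or JSON. A minimal Python sketch, with the endpoint, parameter names and credentials all invented for illustration:

```python
import requests  # third-party HTTP client

# Hypothetical sketch only: the endpoint, parameter names and auth scheme
# below are assumptions for illustration, not the documented OneTickCloud API.
BASE_URL = "https://onetickcloud.example.com/webapi/query"  # placeholder URL

params = {
    "symbol": "AAPL",
    "db": "US_TED",                # assumed database name
    "start": "20160901093000000",  # assumed timestamp format
    "end": "20160901160000000",
    "format": "csv",               # the service offers CSV or JSON extracts
}

resp = requests.get(BASE_URL, params=params, auth=("user", "api_key"))
resp.raise_for_status()

# Each line of the CSV extract is one tick record; print the first few.
for line in resp.text.splitlines()[:5]:
    print(line)
```

The same extract could equally be scheduled and picked up by FTP; the on-demand call above is simply the lowest-friction way to pull a custom dataset into an application.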

A year after the service's 2015 release, join us for our September 29 webinar, “The Evolution of OneTickCloud,” where we will discuss how it has grown and matured.


REGISTER HERE

In this webinar, OneMarketData’s Louis Lovas and Jeff Banker will provide an overview of their company and its powerful hosted solutions. DASH Financial’s Ben Locke will then provide insight into the OneTickCloud customer experience, and how his team has utilized the product.

OneTickCloud Architecture includes Web Queries, Web On-Demand, Web Scheduled Queries and Desktop OneTick. Join our webinar on September 29 for more details on what OneTickCloud can do, and what it can do for you!


Posted in Big Data, OneMarketData, OneTick, Uncategorized

Finding Alpha in Transaction Cost Analysis

Technology is driving a sweeping transformation in trading styles as the accelerating use of algorithms creates a more competitive environment for all market participants. Tighter spreads, diminished liquidity and increased volatility are redefining global markets and thinning margins.

This translates into increased awareness of trading costs as participants look to squeeze alpha out of a diminishing pot. Whether you’re an asset manager, an institutional investor, a quant researcher or an executing broker, sought-after cost controls create the incentive to invest in advanced technology for Transaction Cost Analysis, or ‘TCA’. Essentially a collection of comparisons between market benchmarks and traded prices that determine whether the spread between them was high or low at the time of the order, TCA can generate alpha by exposing, and ideally lowering, the cost at which you buy and sell. Results from the analysis are used to fine-tune the trading process, compare venues, and provide clients with the reports and dashboards they need.

Benchmarks provide the formal metrics of market conditions to quantify costs. They offer baseline measurements throughout market history, across the trading day and at the time of execution. They include price metrics such as open, high, low, close and volume-weighted-average-price as well as liquidity metrics of the order book.
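
A benchmark such as VWAP is straightforward to compute from the underlying trades. A minimal Python sketch with toy data, not the OneTick TCA implementation:

```python
from dataclasses import dataclass
from typing import Iterable

@dataclass
class Trade:
    price: float
    size: int

def vwap(trades: Iterable[Trade]) -> float:
    """Volume-weighted average price over the trades in a benchmark interval."""
    trades = list(trades)
    volume = sum(t.size for t in trades)
    if volume == 0:
        raise ValueError("no volume traded in the interval")
    return sum(t.price * t.size for t in trades) / volume

# Three prints in the interval: 100 @ 10.00, 300 @ 10.02, 100 @ 10.01
print(vwap([Trade(10.00, 100), Trade(10.02, 300), Trade(10.01, 100)]))  # 10.014
```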

OneTick TCA, an enterprise platform, offers a set of market benchmarks across global equity and futures markets. This starts with tick-by-tick trade, quote and order book history and boasts over 20 price and volume metrics. Along with global benchmarks, OneTick TCA also includes a collection of best-practice methodologies to determine the effectiveness of your trading.

OneTick TCA provides the tools to …

  • Spot outliers in your trades by measuring slippage against price benchmarks such as VWAP, Bars and Beta (see the sketch after this list)
  • Measure participation rates by venue, and across venues, against stated goals
  • Compare performance against volume benchmarks to determine how effectively visible liquidity is captured at each price level
  • Measure market impact, or the cost of demanding liquidity, by understanding quote fade across venues
  • Quantify the opportunity cost of not trading as a justification to tune algo aggressiveness
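
As a toy illustration of the first bullet, slippage against a VWAP benchmark can be expressed in basis points and screened for outliers. This is a generic sketch with invented data and an arbitrary threshold, not the methodology OneTick TCA ships with:

```python
import statistics

def slippage_bps(exec_price: float, benchmark: float, side: str) -> float:
    """Signed slippage versus a benchmark, in basis points.
    Positive means the fill was worse than the benchmark for that side."""
    sign = 1.0 if side == "BUY" else -1.0
    return sign * (exec_price - benchmark) / benchmark * 1e4

# fills: (order id, side, executed price, interval VWAP benchmark) -- toy data
fills = [
    ("A1", "BUY", 10.02, 10.014),
    ("A2", "SELL", 10.00, 10.014),
    ("A3", "BUY", 10.25, 10.014),   # suspiciously expensive fill
]

costs = {oid: slippage_bps(px, bench, side) for oid, side, px, bench in fills}
mean = statistics.mean(costs.values())
stdev = statistics.pstdev(costs.values())

# Toy rule: flag fills more than one standard deviation above the average cost.
outliers = [oid for oid, c in costs.items() if c > mean + stdev]
print(outliers)  # ['A3']
```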

OneTick TCA is a hosted platform built upon OneTick’s time-series database, analytics and market history to service the demanding and varied needs of both buy- and sell-side institutions. To learn more about OneTick TCA, download a product sheet or request a demo today.

Once again thanks for reading.
Louis Lovas

For an occasional opinion or commentary on technology in Capital Markets you can follow me on  twitter, here.

Posted in Algorithmic Trading, Analytics, Big Data, Tick database, Transaction Cost Analysis, Uncategorized

OneTick Webinar on leveraging Hadoop MapReduce with OneTick Analytics

OneTick Map-Reduce is a Hadoop-based solution combining OneTick’s analytical engine with the MapReduce computational model, which can be used to perform distributed computations over large volumes of financial tick data. As a distributed tick data management system, OneTick’s internal architecture supports databases that are spread across multiple physical machines. This architecture, designed for distributed parallel processing, improves query performance: the typical OneTick query is easily parallelizable at logical boundaries (e.g. running the same query analytics across a large symbol universe) and can be processed on separate physical machines.
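
That “parallelizable at logical boundaries” point can be illustrated in plain Python: the same per-symbol analytic runs independently for each symbol, so the universe can simply be partitioned across workers. The sketch below uses local processes and a placeholder analytic purely for illustration; in OneTick Map-Reduce the split happens across cluster nodes rather than local processes.

```python
from concurrent.futures import ProcessPoolExecutor

def per_symbol_analytic(symbol: str) -> tuple:
    """Stand-in for a per-symbol query (e.g. build daily bars or a VWAP).
    In OneTick Map-Reduce the same boundary applies, except each symbol's
    data and computation live on a node of the Hadoop cluster."""
    result = float(len(symbol))  # placeholder computation, not a real analytic
    return symbol, result

symbol_universe = ["AAPL", "MSFT", "IBM", "GOOG", "JPM"]

if __name__ == "__main__":
    # Because the per-symbol work is independent, the universe can simply be
    # partitioned across workers and the partial results collected at the end.
    with ProcessPoolExecutor(max_workers=4) as pool:
        results = dict(pool.map(per_symbol_analytic, symbol_universe))
    print(results)
```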

On April 26, 2016 we broadcast a very successful webinar on how OneTick’s large collection of built-in analytical functions and query design can leverage the Hadoop middleware framework for large-scale parallel processing. You can watch the recording at this link.


OneTick Map-Reduce dynamically distributes data (stored in OneTick historical archives) and computation across the nodes of the cluster using a combination of a distributed file system (HDFS) and the MapReduce computational framework.

  • OneTick archives are stored on a distributed file system (e.g. HDFS with Amazon S3 as a backup). The distributed file system serves as an abstraction layer providing shared access — physically the data resides on different nodes of the cluster. The distributed file system is also responsible for balancing disk utilization and minimizing the network bandwidth.
  • Hadoop’s MapReduce daemons are responsible for distributing the query across the nodes of the cluster, by taking into account the locality of the queried data.
  • The distributed OneTick query is an analytical process that semantically defines a user’s business function. OneTick query analytics are designed specifically for that purpose.
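
The webinar covers the OneTick specifics, but the general division of labour above maps onto the classic Hadoop Streaming pattern: a mapper emits key/value pairs per record and a reducer aggregates per key. A generic sketch (a per-symbol volume total with an assumed input layout, not the OneTick EP implementation):

```python
#!/usr/bin/env python
"""Toy Hadoop Streaming job: total traded volume per symbol.

Everything here (file layout, paths, field order) is assumed for illustration;
it is not the OneTick Map-Reduce implementation. A typical invocation would be:
  hadoop jar hadoop-streaming.jar \
      -input /ticks/trades.csv -output /out/volume \
      -mapper "volume.py map" -reducer "volume.py reduce"
"""
import sys

def mapper():
    # Assumed input layout: symbol,timestamp,price,size per line.
    for line in sys.stdin:
        symbol, _ts, _price, size = line.rstrip("\n").split(",")
        print(f"{symbol}\t{size}")          # emit key<TAB>value

def reducer():
    # Hadoop delivers the mapper output grouped and sorted by key.
    current, total = None, 0
    for line in sys.stdin:
        symbol, size = line.rstrip("\n").split("\t")
        if symbol != current:
            if current is not None:
                print(f"{current}\t{total}")
            current, total = symbol, 0
        total += int(size)
    if current is not None:
        print(f"{current}\t{total}")

if __name__ == "__main__":
    mapper() if sys.argv[1:] == ["map"] else reducer()
```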

OneTick Analytics

OneTick provides a large collection of built-in analytical functions which are applied to streams of historical or real-time data. These functions, referred to as Event Processors (EPs), are a set of business and generic processors that are semantically assembled in a query and ultimately define its logical, time-series result set. Event Processors include aggregations, filters, transformers, joins and unions, statistical and finance-specific functions, order book management, sorting and ranking, and input and output functions.

The OneTick Map-Reduce design allows an easy switch between different data representation and job dispatching models, affording support for both an internal model and external models. Users define their “map” and “reduce” operations in this restricted computational model and the framework takes care of the parallelization.

Once again thanks for reading.
Louis Lovas

For an occasional opinion or commentary on technology in Capital Markets you can follow me on  twitter, here.

Posted in Algorithmic Trading, Analytics, Big Data, Cloud Computing, Complex Event Processing, Equities, OneMarketData, OneTick, Tick database, Transaction Cost Analysis, Uncategorized

Computational Models in OneTick and Hadoop

Cloud computing means a lot of different things to different people. There are public, private and hybrid models, and the variations are endless. A key characteristic of the cloud is rapid elasticity, which offers compute power unheard of in prior infrastructures. Such parallelized scalability allows previously intractable problems to become tractable. There are two key underlying components behind this: a computational model and distributed data.

OneTick Map-Reduce is a Hadoop-based solution combining OneTick’s analytical engine with the MapReduce computational model, which can be used to perform distributed computations over large volumes of financial tick data. As a distributed tick data management system, OneTick’s internal architecture supports databases that are spread across multiple physical machines. This architecture, designed for distributed parallel processing, improves query performance: the typical OneTick query is easily parallelizable at logical boundaries (e.g. running the same query analytics across a large symbol universe) and can be processed on separate physical machines.

OneTick Map-Reduce offers a solution to leverage elastic computation by dynamically distributing both data and computation across a Hadoop cluster using a combination of a distributed file system (HDFS) and a computational framework called MapReduce.

OneTick Map-Reduce dynamically distributes data (stored in OneTick historical archives) and analytics across the nodes using a combination of distributed file system (HDFS) and the MapReduce computational framework.

  • OneTick archives are stored on a distributed file system (e.g. HDFS with Amazon S3 as a backup). The distributed file system serves as an abstraction layer providing shared access — physically the data resides on different nodes of the cluster. The distributed file system is also responsible for balancing disk utilization and minimizing the network bandwidth.
  • Hadoop’s MapReduce daemons are responsible for distributing the query across the nodes of the cluster, by taking into account the locality of the queried data.
  • The distributed OneTick query is an analytical process that semantically defines a user’s business function. OneTick query analytics are designed specifically for that purpose.

OneTick Analytics

OneTick provides a large collection of built-in analytical functions which are applied to streams of historical or real-time data. These functions, referred to as Event Processors (EPs), are a set of business and generic processors that are semantically assembled and ultimately define the logical, time-series result set of a query. Event Processors include aggregations, filters, transformers, joins and unions, statistical and finance-specific functions, order book management, sorting and ranking, and input and output functions. Also included is a reference data architecture for managing security identifiers, holiday calendars and corporate action information. Together these allow time-series tick streams originating from any of the OneTick storage sources (archive, in-memory or real-time) to be filtered, reduced and/or enriched into the business logic supporting a wide variety of use cases:

  • Quantitative Research
  • Algorithmic, low-touch and program trading
  • Firm-wide profit / loss monitoring
  • Real-time transaction cost analysis
  • Statistical arbitrage and market making
  • Regulatory compliance and surveillance

OneTick, Hadoop and Spark

Spark and Hadoop are middleware frameworks that facilitate parallel processing of data, whereas MapReduce is a computational model. These components provide a platform for distributed computation and, combined with HDFS, offer distributed data access as well. HDFS is (by definition) the file system part of Hadoop, and Spark can use HDFS as an input data source. Yet neither Hadoop nor Spark provides targeted, business-oriented functions to support the use-case solutions mentioned above. Furthermore, those trade-related solutions depend on the cleansed, normalized, high-quality data available in OneTick data management, either by itself or integrated into Hadoop.

OneTick has its own very efficient mechanisms for parallelization of computations (e.g. concurrent processing of symbol sets across a load-balanced group of tick servers, client-side and server-side database partitioning with concurrent partition access, and splitting queries locally into multiple execution threads). OneTick also supports Hadoop as an alternative mechanism for parallelizing computations. The OneTick Map-Reduce design allows an easy switch between different data representation and job dispatching models, affording support for both an internal model and external models (Hadoop, Spark, etc.).

The idea is that you start with a collection of data items and apply map and reduce operations to this collection (as in functional programming). Map operations transform existing items into new items, and reduce operations group multiple items into a single aggregated item. The computation must be stateless so that it is easier to parallelize: each transformation creates a new collection rather than manipulating the existing one. Users define their “map” and “reduce” operations in this restricted computational model and the framework takes care of the parallelization.

How does this translate to OneTick’s data model?

  • Data items <-> OneTick time series
  • Map operations <-> OneTick transformer EPs
  • Reduce operations <-> OneTick merge/join EPs
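
In plain Python terms that correspondence is ordinary functional composition over a collection of ticks, where each step produces a new collection rather than mutating the old one. This is a conceptual sketch only; in OneTick the same idea is expressed by wiring EPs into a query graph:

```python
from functools import reduce

# A tiny "time series": (timestamp, symbol, price, size) tuples.
ticks = [
    (1, "AAPL", 100.0, 200),
    (2, "AAPL", 100.5, 100),
    (3, "MSFT", 50.0, 300),
]

# Map: transform each item into a new item (the role of a transformer EP).
notional = map(lambda t: (t[1], t[2] * t[3]), ticks)

# Reduce: fold many items into one aggregated item (the role of a merge/join EP).
# Note that each step builds a new collection; nothing is mutated in place.
per_symbol = reduce(
    lambda acc, kv: {**acc, kv[0]: acc.get(kv[0], 0.0) + kv[1]},
    notional,
    {},
)

print(per_symbol)  # {'AAPL': 30050.0, 'MSFT': 15000.0}
```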


Spark is similar to Hadoop, yet it overcomes Hadoop’s long job-startup times. Like OneTick’s own dispatching model, Spark appears more suitable for interactive data processing. Nonetheless, both are suitable for large batch-processing tasks, which is why OneTick integrates with them as complementary technologies.

[Figure: OneTick architecture]
Once again thanks for reading.
Louis Lovas

For an occasional opinion or commentary on technology in Capital Markets you can follow me on  twitter, here.

Posted in Algorithmic Trading, Analytics, Big Data, Cloud Computing, OneMarketData, Tick database

Not Your Granddad’s Spoof

A recent job posting by a major investment bank reads: “Basic qualifications: PhD in Computer Science, specializing in … machine learning … with extensive knowledge of big data technologies … and experience with predictive modeling, natural language processing, and simulation.”  Quantitative trading, right?  Or global risk modeling? Maybe electricity market forecasting? No, Compliance. Specifically, surveillance analytics. Twenty years ago, even ten, the Quants landing on Wall Street with their freshly minted PhDs from Stanford and MIT would have laughed. Twenty years ago, even ten, surveillance was a dreary back office affair, something that somebody in a cheap labor state did on a mainframe, if they hadn’t been laid off yet. Since Dodd-Frank, since the Flash Crash, since MiFID and MAD, since that darned book, compliance surveillance is front office with a capital ‘F’. On the trading floors of New York and Chicago, and on quieter desks from Greenwich to Boston, trading supervisors are reviewing surveillance reports and consulting real-time surveillance monitors as though their bonus checks depend on it—because they do.
Here’s why. The regulators have grown more aggressive, and grown sharper teeth—they’re now empowered to prosecute on the basis of ‘disruptive practice’ rather than ‘intent’. Staffs are larger and regulatory actions more frequent. Most importantly, money penalties have grown dramatically. Enforcement groups seem to vie with each other for bragging rights.

The regulators are better equipped now, too. They have to be, in order to analyze an ocean of market data. The CFTC’s trade surveillance system, for example, had gathered over 160 terabytes as of June 2014, and that has likely passed 200 TB by now. Regulators are turning to sophisticated analysis to ferret out patterns of misconduct and detect market stability risks. In 2015, Scott Baugues, Deputy Chief Economist at the SEC, wrote that several SEC departments use machine learning techniques to identify likely misconduct.

The old spoofs and other tricks are now easily spotted. New ones are appearing, but they’re being learned by smart analytics. Manipulate on the CME in one contract, and take on ICE in a correlated one? Nope, a regulator’s cross-market surveillance can see it. Collude in setting a fix? Software that learns social media relationships may detect it.

So what should the compliance chief who wants to protect her firm from a business-busting fine do? The first thing is to wake up to how dramatically the surveillance landscape has changed—it’s not your granddad’s spoof anymore.
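
As a toy example of the kind of pattern a surveillance analytic looks for, consider classic spoofing: large orders placed on one side of the book and cancelled within seconds while the same participant executes on the other side. The sketch below flags that pattern with deliberately naive, invented thresholds; production surveillance platforms use far richer order-book features and, as noted above, machine learning:

```python
from dataclasses import dataclass

@dataclass
class OrderEvent:
    trader: str
    side: str        # "BUY" or "SELL"
    action: str      # "NEW", "CANCEL", "FILL"
    size: int
    ts: float        # seconds

def spoofing_candidates(events, min_size=1000, max_rest_secs=2.0):
    """Flag traders who cancel large resting orders within seconds and
    trade on the opposite side. Thresholds are arbitrary toy values."""
    flagged = set()
    new_orders = {}  # (trader, side) -> timestamp of last large NEW
    for e in sorted(events, key=lambda e: e.ts):
        key = (e.trader, e.side)
        if e.action == "NEW" and e.size >= min_size:
            new_orders[key] = e.ts
        elif e.action == "CANCEL" and key in new_orders:
            if e.ts - new_orders[key] <= max_rest_secs:
                flagged.add((e.trader, e.side, "fast large cancel"))
    # Cross-check: did a flagged trader fill on the opposite side?
    fill_keys = {(e.trader, e.side) for e in events if e.action == "FILL"}
    opposite = {"BUY": "SELL", "SELL": "BUY"}
    return [t for t, side, _ in flagged if (t, opposite[side]) in fill_keys]

events = [
    OrderEvent("T1", "BUY", "NEW", 5000, 0.0),    # large bid layered in
    OrderEvent("T1", "SELL", "FILL", 200, 0.5),   # sells into the move
    OrderEvent("T1", "BUY", "CANCEL", 5000, 1.0)  # bid pulled quickly
]
print(spoofing_candidates(events))  # ['T1']
```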



References:
Job posting: http://www.goldmansachs.com/a/data/jobs/27931.html
CFTC Enforcements and penalties are from the CFTC Enforcement Division annual results archived on the CFTC Press Room web site.
CFTC Surveillance collects 160 TB: https://fcw.com/articles/2014/06/03/cftc-mulls-retooling-market-surveillance.aspx
Scott Baugues comment: http://cfe.columbia.edu/files/seasieor/center-financial-engineering/presentations/MachineLearningSECRiskAssessment030615public.pdf

Thanks for reading.
Dermot Harriss

Posted in Algorithmic Trading, Analytics, Big Data, Cloud Computing, HFT Regulation, Tick database, Trade Surveillance

OneMarketData enhances OneTick Cloud global content platform with Tick Data, Inc.

OneMarketData has acquired Virginia-based Tick Data, Inc., a trusted and leading provider of historical intraday exchange time series data. Tick Data will be a wholly owned subsidiary of OneMarketData and will continue to operate from its Virginia offices under the current management team.

Data is the resource for better trade decisions, better cost controls and improving compliance and surveillance. As a consequence, key factors emerge for large-scale data management.  Those include infrastructure, data quality and timely access.  This acquisition is focused on enhancing the OneTick Cloud platform:

  • Tick Data enables OneMarketData to expand and expedite our Cloud Solutions platform, which provides the market with a broad set of data and analytics
  • The Tick Data, Inc. acquisition allows OneMarketData to expand the content platform to address the $1.5B market opportunity
  • OneMarketData has seen significant interest in our OneTickCloud platform
  • The OneTickCloud platform enables Tick Data customers to greatly expand their ability to analyze exchange content for backtesting, algo development, research and compliance

Learn more about this acquisition and how OneMarketData’s OneTickCloud and Tick Data’s content address those challenges, providing the tools and services for managing and analyzing data more effectively. Click here

OneTickCloud leverages the capabilities of OneTick to offer the analysis of cleansed, normalized history across domestic and international markets.  OneTick is a leading solution for managing market data – its capture, storage and analysis for use in quantitative research, back-testing, TCA, trade surveillance and many other areas in the trade life cycle.

Once again thanks for reading.
Louis Lovas

For an occasional opinion or commentary on technology in Capital Markets you can follow me on twitter, here.

Posted in Uncategorized

OneTickCloud: Simplifying Access to Normalized Tick Data for Sophisticated Analysis

As I’m sure most of you have experienced, there is an ever-increasing need for more data… from domestic markets, international markets, capturing your own order flow, news, and more recently sentiment from social media. Hype around big data aside, improving your understanding of markets, their microstructure and the impact of your own trading can be a game changer.

Data is the resource for better trade decisions, better cost controls and improving compliance and surveillance. As a consequence, key factors emerge for large-scale data management.  Those include infrastructure, data quality and timely access.

Learn how OneMarketData’s new OneTickCloud looks to address those challenges, providing the tools and services for managing and analyzing data more effectively, in this webinar recorded from an earlier broadcast.

OneTickCloud leverages the capabilities of OneTick to offer the analysis of cleansed, normalized history across domestic and international markets.  OneTick is a leading solution for managing market data – its capture, storage and analysis for use in quantitative research, back-testing, TCA, trade surveillance and many other areas in the trade life cycle.

OneMarketData offers the tools for quant researchers to extract alpha, better manage risk and achieve confidence in model design.

OneTickCloud is a securely hosted service providing managed data and analytics across global equities and futures markets. It offers deep history of tick-by-tick and end-of-day prices. It includes reference data, split and dividend adjustment factors, name changes, earnings announcements and calendars.   You can think of OneTickCloud as providing …

  • Global content – normalized and cleansed across markets and geographies
  • A range of analytical query tools, from a self-service web application for assembling and organizing the content as you need it, to more sophisticated desktop tooling for custom analysis
  • And lastly, easy access from your own application, whether via immediate on-demand access or, more traditionally, via file downloads.

And you only pay for what you need.  To learn more watch the webinar recording.

Once again thanks for reading.
Louis Lovas

For an occasional opinion or commentary on technology in Capital Markets you can follow me on  twitter, here.

Posted in Algorithmic Trading, Analytics, Big Data, Cloud Computing, Complex Event Processing, Equities, Foreign Exchange, Futures and Options, HFT Regulation, High Frequency Trading, OneMarketData, OneQuantData, OneTick, Tick database, Transaction Cost Analysis

Using the cloud to reduce complexity and lower costs for accessing and analyzing tick data

It’s easy to talk about utilizing market data to improve your business, but the mechanics of actually acquiring high quality, normalized, cleansed data in the format that you need have always posed a significant challenge – in human capital, IT infrastructure and managing a vast array of sources.

OneTickCloud takes the headaches out of sourcing tick data. In addition to lowering your costs, it increases your team’s effectiveness, from the IT staff to quantitative analysts, so they can focus on their real jobs – backtesting trading algorithms, multi-factor model development, portfolio analysis, pre- and post-trade analysis, and TCA. On top of that, OneTickCloud provides a range of custom analytics tools, from straightforward queries you can build in seconds from our web interface to complex, multi-dimensional queries utilizing the full power of OneTick analytics.

In this upcoming webinar, I will show you how OneMarketData’s expert data team has built a hosted cloud platform of cleansed, normalized market data from a variety of market centers and geographies all offered through our secure, high performance cloud platform. I will demonstrate how easy it is to build queries using our web-deployed user interface, and then how to invoke those queries for scheduled or immediate download.

OneTickCloud is an easy, fast, and cost-effective way to gain on-demand access to vast amounts of clean tick history and analytics using a zero footprint client. For more demanding analytical sophistication, the tools are available in the complete OneTick time-series tick database engine and query language.

Register Now: http://bit.ly/1d35Qf4
WEBINAR DETAILS
DATE: Tuesday, June 9
TIME: 10:30AM New York / 3:30PM London

Once again thanks for reading.
Louis Lovas

For an occasional opinion or commentary on technology in Capital Markets you can follow me on twitter, here.

 

Posted in Algorithmic Trading, Analytics, Big Data, Cloud Computing, Complex Event Processing, Equities, Foreign Exchange, Futures and Options, High Frequency Trading, OneMarketData, OneQuantData, OneTick, Tick database, Transaction Cost Analysis

OneTickCLOUD, Managed Services across Global Markets

The seduction of low-cost yet massive compute power creates overwhelming interest in cloud technology. Momentum is building fast within the trading and investment business to benefit from the $18B cloud computing industry. In a survey conducted by OneMarketData, respondents overwhelmingly look to avail themselves of all that the cloud offers: over 77 percent expect to jump headlong into cloud platforms. The allure of cloud computing is the lower technology costs and improved profitability that come from the enhanced capabilities enabled by greater compute power and access to a storehouse of deep market history. Quant research for alpha discovery, algo development and back-testing are all augmented by what the cloud can immediately dish up.

A key characteristic of the cloud is rapid elasticity which offers scalable compute power and voluminous storage providing immediate access to deep market history. Such pay-as-you-go scalability defines a new archetype for the front-office trade lifecycle chain.

OneMarketData has recognized this technological shift and embraced cloud computing with the launch of the new OneTickCLOUD™, now online.

OneTickCLOUD is a securely hosted managed data and analytics service supporting global equities and futures tick history, reference data, and adjustment factors. It offers a suite of pre-defined analytics along with tooling for your own custom analytics. Web access is on-demand, flexible and convenient, available when you want it.

OneTickCLOUD enables users to focus exclusively on their analytics and eliminate the challenges associated with data collection, alignment and extraction. No local hardware, software or data licenses are necessary. You have immediate access to 5 years of tick history and 20 years of end-of-day data, along with real-time collection from US equities and futures markets. Outside the US, over 10 years of tick data is available from over 120 global markets, including market depth.

Knowing that one size does not fit all, OneTickCLOUD offers personalized access to suit your requirements for content and analytics. The three-tiered offering is outlined in this Product Matrix.

A move towards cloud signals a fundamental shift in how we handle information. Financial firms will move first with data-heavy decision-support functions – model back-testing, quantitative research and transaction cost analysis. The days of cloud-as-a-fad are over. It’s a game-changer promising a major paradigm shift in business initiatives with its vast computational power, storage and a wide variety of application solutions at a lower cost structure. OneTickCLOUD delivers on that promise.

Once again thanks for reading.
Louis Lovas

For an occasional opinion or commentary on technology in Capital Markets you can follow me on  twitter, here.

Posted in Algorithmic Trading, Analytics, Big Data, Cloud Computing, Complex Event Processing, Equities, Foreign Exchange, OneMarketData, OneTick, Tick database

Social Sandbox – Where Social Media Meets Market Data

In today’s information age, we often take for granted that breaking news from across the globe will be delivered to our doorstep in an instant. The evolution of the internet has replaced radio, television and print journalism as the primary information source. With social media we have come to expect, and often rely on, economic, political and market news colored by views and personal perspectives. This creates a far-reaching influence that has touched our personal and professional lives. A decade ago, I doubt anyone would have predicted that businesses like LinkedIn, Facebook and Twitter would top the IPO market. Yet here we are.

With the chatter on Twitter, Facebook and other social media outlets reaching peak levels, it makes you wonder just how strongly social media impacts financial markets. OneMarketData has just released a product that gives market participants the opportunity to find out for themselves.

In conjunction with Datawatch and Social Market Analytics, OneMarketData presents the Social Sandbox, a hosted solution that empowers quants and traders to research and measure the value of social media sentiment against market activity. It can be used to discover correlations and patterns using a wide range of analytics against global equity data and social sentiment scores. The pre-built solution is powered by OneTick, the leading platform for time-series analytics and storage.

The Social Sandbox is the very first platform to integrate deep market history and analytics to measure the impact of social media. Such analysis requires the alignment and integration of exchange and reference data, in addition to the social sentiment indicators. The platform offers a set of sentiment studies including “Top Score by Social Sentiment,” “Sentiment to Market Benchmarks” (e.g. open, returns, etc.), a Portfolio Study and many others.

The advantages for improving your trading performance include the ability to:

  • Evaluate social media signals and determine a fit within your investment or trading strategy
  • Use studies to identify relationships between score characteristics and returns over custom time periods
  • Create custom indicator calculations such as beta, momentum, earnings growth, historical volatility, and average daily volume (see the sketch after this list)
  • Conduct discriminating Portfolio Performance and Valuation Analysis
  • Perform comprehensive Quant Research, Strategy Development and thorough back-testing through bull/bear and other market-impacting conditions all from the same OneTick platform
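
The indicator calculations mentioned in the list above are standard formulas. A minimal sketch of two of them, annualised historical volatility and beta, using toy price series; this is illustrative only, not the Social Sandbox code:

```python
import math

def returns(prices):
    """Simple daily log returns from a close-price series."""
    return [math.log(p1 / p0) for p0, p1 in zip(prices, prices[1:])]

def hist_vol(prices, trading_days=252):
    """Annualised historical volatility from daily closes."""
    r = returns(prices)
    mean = sum(r) / len(r)
    var = sum((x - mean) ** 2 for x in r) / (len(r) - 1)
    return math.sqrt(var) * math.sqrt(trading_days)

def beta(stock_prices, index_prices):
    """Beta of a stock versus an index, from aligned daily closes."""
    rs, ri = returns(stock_prices), returns(index_prices)
    mean_s, mean_i = sum(rs) / len(rs), sum(ri) / len(ri)
    cov = sum((a - mean_s) * (b - mean_i) for a, b in zip(rs, ri)) / (len(rs) - 1)
    var_i = sum((b - mean_i) ** 2 for b in ri) / (len(ri) - 1)
    return cov / var_i

stock = [100.0, 101.0, 100.5, 102.0, 103.0]   # toy daily closes
index = [2000.0, 2010.0, 2005.0, 2020.0, 2035.0]
print(round(hist_vol(stock), 3), round(beta(stock, index), 3))
```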

The Information Transformation

We are in the midst of a revolutionary change in global news dissemination and consumption through social media, one so subtle we can hardly see it happening before our very eyes. The technology driving this is an ever-evolving proposition; what we know today will be replaced with something new in the future – history tells us that. Social media, especially Twitter, transcends the old news agency reporting model. It grants artistic license to portray individual opinions, perspectives and commentary on events across the globe. OneMarketData’s Social Sandbox provides the utility, convenience and in-depth analytics to stay ahead of the curve and improve your investment performance.

Once again thanks for reading.
Louis Lovas

For an occasional opinion or commentary on technology in Capital Markets you can follow me on  twitter, here.

Posted in Algorithmic Trading, Analytics, Big Data, Complex Event Processing, Equities, Foreign Exchange, Futures and Options, High Frequency Trading, OneMarketData, OneQuantData, OneTick, Tick database, Transaction Cost Analysis