OneMarketData

    OneMarketData is a leading provider of tick data management solutions for the financial industry


    What are some of the main drivers for investments in data management technology?

    Fear, uncertainty and doubt pervade the market, as witnessed by falling volumes, subdued volatility and unsettled regulatory policy. Trading firms operate in a fiercely competitive industry where success is measured by profit. Shifting competition, the struggle to understand and contain trading costs, rapidly advancing technology and the demands of managing risk tear at the fabric of every trading firm. Together these pressures drive the need to diversify and adapt to uncertain market mechanics, the narrative of the new normal in today’s financial markets.

    The business is faced with a pervasive wave of thinning margins, induced by volatile risk-on/risk-off sentiment driven by geo-political events and central bank policy. Quants and portfolio managers look to capitalize on the market’s resulting gyrations, exploiting inefficiencies arising from human behavior and market structure alike.

    Quantitative analysis is the main tool of the trade, whether for price discovery in execution strategies or market statistics for systematic alpha models. The quest for new and revised models – their design, back-testing and optimization for profitability – is never-ending. Consequently, there is a voracious appetite for data. This data is the fuel feeding the engine for automation and innovation alike.

    Data management is vital for driving the business, and it centers on time-sensitive data quality: accurate measurement and analysis of structured tick data across disparate sources and a diversity of asset classes. The related reference information – corporate actions, exchange calendars and global symbologies – is at the root of this quality.

    What are some of the aspects of OneTick that address these investment trends? Are there specific data management issues associated with quantitative and systematic trading and trade performance analysis, and how does OneTick address them?

    OneTick excels at extracting effective value from large data sets in a timely manner for trade-related decision management including econometric trade modeling, risk and cost analysis.

    A financial practitioner’s worst fear is spending more time processing and scrubbing data than analyzing it. The challenge of achieving timely data quality lies in dealing with the vagaries of financial data: multiple data sources, mapping ticker symbols across a global universe, tying indices to their constituents, ingesting cancellations and corrections, and applying corporate actions, price changes and symbol changes. Firms are using OneTick to capture data for back-testing new and evolving trade models in the never-ending hunt for alpha, and for risk management and custom transaction cost analysis.
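
    To make that scrubbing work concrete, here is a minimal, generic Python sketch (pandas assumed) of two of the steps described above: dropping cancelled or corrected prints and back-adjusting prices for a stock split. The column names, condition codes and split table are hypothetical illustrations, not OneTick structures.

```python
import pandas as pd

def scrub_ticks(ticks: pd.DataFrame, splits: pd.DataFrame) -> pd.DataFrame:
    """ticks: columns [timestamp, symbol, price, size, cond];
    splits: columns [symbol, ex_date, ratio] (e.g. 2.0 for a 2-for-1 split)."""
    # Drop cancelled and corrected prints flagged in the condition code.
    clean = ticks[~ticks["cond"].isin({"CANCEL", "CORRECTION"})].copy()

    # Back-adjust prices (and scale sizes) before each split's ex-date so
    # pre-split history remains comparable with post-split prints.
    for _, s in splits.iterrows():
        before = (clean["symbol"] == s["symbol"]) & (clean["timestamp"] < s["ex_date"])
        clean.loc[before, "price"] /= s["ratio"]
        clean.loc[before, "size"] *= s["ratio"]
    return clean
```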

    But much more is needed beyond the capture and quality of the massive fire-hose volumes in financial markets: the analytical tools to make sense of this data, in both real-time and historical processing. OneTick includes a wealth of analytics functions and a visual modeling tool to easily assemble the semantic logic for the algorithms behind trading, risk and cost management.

    Please elaborate on the relationship between CEP and tick data management.  Are there ways for aligning CEP technology with market data infrastructures and historical data management that can improve trading and risk management operations?

    The primary focus of tick data management is the capture and quality of time-series market data. This data provides the historical context for the analysis typically associated with real-time complex event processing (CEP), a foundational technology for analytical pattern detection.

    The conclusions drawn from CEP pattern analysis are always based on historical precedent. Consider the scenarios behind the need to understand historic price volatility: it is vital to determine statistical thresholds for future price movements, which helps both trade models and transaction cost analysis. An example may involve comparing current market activity to historic volumes, prices and volatility within trade execution logic. Or trading logic may compare the live markets to historic benchmarks that include sector and index movements. Intra-day activity is weighed against historic trends to gauge volatility and smooth outliers.
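
    As a rough illustration of deriving such a statistical threshold from history, the Python sketch below (NumPy assumed) turns past closing prices into a volatility band and flags a live price that breaches it. The three-sigma default and the inputs are illustrative assumptions, not a OneTick function.

```python
import numpy as np

def volatility_threshold(past_closes: np.ndarray, n_sigma: float = 3.0) -> float:
    """Scale the standard deviation of daily log returns into a move threshold."""
    returns = np.diff(np.log(past_closes))
    return float(n_sigma * returns.std())

def is_outlier(last_close: float, live_price: float, threshold: float) -> bool:
    """True if today's move exceeds the historically derived band."""
    return bool(abs(np.log(live_price / last_close)) > threshold)
```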

    As a result, CEP and tick data management are co-dependent technologies. The ideal case for CEP analysis is to view historical time series and real-time streaming data as a single time continuum. What happened yesterday, last week or last month is simply an extension of what is occurring today and what may occur in the future.
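
    A toy Python sketch of that continuum idea: a single analytic, here a running VWAP, consumes one stream in which historical ticks are simply followed by live ones, and never needs to know which side of “now” a tick came from. The tick shape and the two source iterables are assumptions for illustration.

```python
from itertools import chain
from typing import Iterable, Iterator

def running_vwap(ticks: Iterable[dict]) -> Iterator[float]:
    """Yield a running VWAP over ticks shaped like {"price": ..., "size": ...}."""
    notional = volume = 0.0
    for t in ticks:
        notional += t["price"] * t["size"]
        volume += t["size"]
        yield notional / volume

# Usage: replay history first, then let the live stream continue the series.
# for vwap in running_vwap(chain(historical_ticks, live_ticks)):
#     ...
```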

    Can CEP, as a developer productivity tool, let firms focus on strategy logic and move rapidly from prototyping directly to production? What are the benefits and pitfalls of this paradigm?

    Continually changing market conditions stress algorithm profitability. Consequently, the quest to revise and tune models is never-ending. Firms demand tools to configure trading algorithms – for both execution and systematic alpha seeking – that support a range of trading styles. OneTick’s CEP offers this customization through visual modeling of the semantic logic and run-time parameterization, vital for rapid redeployment.

    CEP is a low-latency trade infrastructure, and visual modeling provides an approachable tool for traders, quants and other non-programmers. Many CEP vendors provide a graphical modeling tool for constructing strategies for trading and quantitative research. These tools share the same objective: to shorten the time from idea to deployment. But not all vendors’ tools are created equal; some incur a penalty for that abbreviated development cycle due to inefficient, often machine-generated, code.

    OneTick’s visual model is a directed graph that is natively executed by the OneTick server and maps one-to-one to the API that OneTick exposes. There is no intermediary step or mismatch caused by code generation. The result is latency comparable to traditional development in C++.
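
    For readers unfamiliar with this modeling style, the following is a generic Python sketch of a directed graph of event-processing nodes, in which each node transforms a tick stream and the assembled graph is the strategy. The Node class and the example operators are invented purely for illustration; they are not OneTick’s event processors or API.

```python
from typing import Callable, Iterable, Iterator

class Node:
    """One event processor: a function from a tick stream to a tick stream."""
    def __init__(self, fn: Callable[[Iterator[dict]], Iterator[dict]]):
        self.fn = fn

    def __rshift__(self, other: "Node") -> "Node":
        # The >> operator wires this node's output to the next node's input.
        return Node(lambda stream: other.fn(self.fn(stream)))

    def run(self, source: Iterable[dict]) -> list:
        return list(self.fn(iter(source)))

# Example operators: keep trade events, then emit a two-tick average price.
trades_only = Node(lambda s: (t for t in s if t.get("type") == "TRADE"))

def pairwise_avg(stream):
    prev = None
    for t in stream:
        if prev is not None:
            yield {"avg_price": (prev["price"] + t["price"]) / 2}
        prev = t

graph = trades_only >> Node(pairwise_avg)   # a two-node directed graph
```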

    When it comes to data management, how can firms prepare for coming regulatory changes, in particular around the systems and controls for automated trading?

    The Knight-mare on Wall Street has exposed a latent fear of failure. Knight’s code bug reignited warnings of flash-crash-style market mayhem. Firms want confidence that their algos will turn a profit, and are equally fearful of becoming headline news as the latest rogue algo to wreak havoc. Trading firms do not test algorithms out of altruism; they do it to ensure robustness, accuracy and profitability. Given how core this is to the sustainability of the business, investment is heavy. At the heart of it is a robust data management platform that captures data in real time and loads history.

    History can represent normal market activity, highly volatile conditions, bubbles and even crash periods. The vital measures of an algorithm’s profitability and stability are hidden in the “what-if” conditions of market history.
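
    As a hypothetical illustration of those what-if conditions, the Python sketch below (pandas assumed, with a datetime-indexed daily price series) replays one toy strategy over labeled slices of history and compares the outcomes. The regime windows and the strategy itself are placeholders chosen for illustration, not a OneMarketData methodology.

```python
import pandas as pd

# Illustrative regime windows for the "what-if" replay.
REGIMES = {
    "calm":          ("2005-01-01", "2006-12-31"),
    "credit crisis": ("2008-09-01", "2009-03-31"),
    "flash crash":   ("2010-05-01", "2010-05-31"),
}

def backtest(prices: pd.Series) -> float:
    """Toy strategy: hold only when price is above its 20-period moving average."""
    signal = (prices > prices.rolling(20).mean()).shift(1, fill_value=False)
    returns = prices.pct_change().fillna(0.0)
    return float((signal.astype(float) * returns).add(1.0).prod() - 1.0)

def stress_test(prices: pd.Series) -> dict:
    """Cumulative return of the same strategy across each historical regime."""
    return {name: backtest(prices.loc[start:end])
            for name, (start, end) in REGIMES.items()}
```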

    On the flip side, firms are diversifying into global markets, creating greater complexity in cross-border currencies, valuations and accounting standards and requiring improved data accuracy across all business functions, from discovery to risk, compliance and reporting.

    In addition to the Large Trader Rule, Dodd-Frank will impose new reporting requirements on any financial institution with assets above $50 billion. This will place greater strain on data infrastructures to provide the needed transparency into risk exposure and overall financial health.

    Likewise, the SEC’s Consolidated Audit Trail (CAT), which is reaching the final stages of design, will impose new trade reporting requirements.  Firms need to adapt and plan for this eventuality so data management does not devolve into crisis management.

    Regulation is driving an imperative for data quality, and that quality only comes through robust data management.

    Does low latency factor into a firm’s data management decisions? What evolutionary steps serve as focal points for profitable trading as firms look to take the strategic view?

    Algorithmic, or high-frequency, trading has come under fire as a root cause of market turmoil.

    Regulators may feel pressure to curb the practice, but high-speed, low-latency trading is here to stay, an inevitability that will continue to gain ground. The tools of the trade, from multi-core hardware and sophisticated software to co-location, cannot be un-invented.

    It’s a well-known fact that liquidity attracts liquidity, and fast access to market data is of critical importance to strategy decision time. The immediacy of pricing data improves overall decision time, from best execution to spread and pairs trading. A competitive advantage comes from understanding data better; fast access means faster decisions. The key enablers are effective data management and cloud deployments; low latency is simply the ante to play the game.

    Increasing sophistication in the tools used to search for alpha, control costs and manage risk increases the demand for deep data over longer time periods across a multiplicity of markets. This data is the fuel feeding the evolution of data management, which centers on two points:

    1) Managing scale – As firms look to capture and store more and more data from numerous sources across many asset classes, they place enormous demands on IT infrastructure. Improvements in compute power and storage per dollar make that consumption both technically and economically possible, and hardware and storage have long been subject to commoditization. Cloud deployments can provide further advantages in managing scale, through higher levels of data protection and fault tolerance.

    2) Algorithms are still king – Leveraging this big data is what drives profitability. Algorithms are born of the mathematical ingenuity of quants and become the lifeblood of trading firms. Profitable algorithms are part genius, part inspiration and perspiration, and their complexity is accelerating. Leveraging the best of high-performance, scalable compute power fulfills the demanding needs of quantitative analysts and ultimately defines the end game, that Holy Grail of profitability.

    This Q&A has been sponsored by OneMarketData
