FLASH FRIDAY is a weekly content series looking at the past, present and future of capital markets trading and technology. FLASH FRIDAY is sponsored by Instinet, a Nomura company.
In February 1998, a Traders article on the market-data industry stated that the business was undergoing a paradigm shift, “moving like the rest of Wall Street from terminals and boxes to systems that are open, fast and more malleable”. Tom Norby, Head of Nasdaq trading at Portland-based Black & Co., cited in the article, said that “when he is selecting a market-data system, his top priority is the unique information the service provides”.
Fast forward to 2022: we look into the current state of the market-data industry, outline traders’ needs, see how they choose their market-data providers, and provide an outlook for the industry.
According to Jesse Forster, Senior Analyst, Market Structure & Technology at Coalition Greenwich, while the overall trend is secular growth and maturation of newer entrants and platform types, there’s a lot happening on the regulatory front that needs to work itself out over the coming months. “Thankfully, while still a significant portion of their budgets, the existing market provides a wide range of options to suit the disparate needs among brokers and their traders,” he said.
Forster said that traders need to be able to interpret the data and expediently act on it, rather than obtaining, cleaning and normalizing data in addition to supporting the platform. “Choosing the appropriate market data provider will allow for this most effectively and efficiently,” he said.
Whether direct from an exchange, via established and integrated financial technology platforms, upstart aggregators, or broker-dealers, each trading firm needs to do its own cost-benefit analysis to determine its ideal consumption method and provider, Forster said.
“Like best execution, what’s most appropriate for one may not be for another,” he argued.
David Aferiat, Co-Founder and Managing Partner of Trade Ideas, added: “We see clear needs for the market from our perch of helping active investors make better decisions in the US markets: rebuild trust in markets by all participants, which starts with transparency; acknowledge practices that impact retail; and bring new asset classes into more liquidity and retail/institutional participation.”
Christine Short, VP of Research, Wall Street Horizon, stressed that market data is “more important than ever in a downturn”, saying that during these volatile times investors can’t afford to miss important information on the stocks and options that they cover. “Market data providers with a reputation for accuracy and a deep history tend to come into focus during bear markets as a foundation for the investment decision process; having inaccurate data could be very costly.”
She said that traders need robust, accurate data that can be easily incorporated into existing strategies, delivered in the format they desire, and that’s a proven source of alpha. “They need primary sourced data, with strict adherence to compliance. They need updates throughout the day and a very repeatable process. They need clean, well-defined data. They need subject matter experts who can explain the data being provided and illustrate use cases,” Short said.
“What active traders demand, more than returns even, is time back from advanced tools capable of assessing, in real time, millions of data points to identify trends and leverage strategies over- or under-performing under constantly evolving, volatile market conditions,” Aferiat added.
From a risk perspective, Stuart Smith, Co-Head Business Development at Acadia, said that the use of market data has changed significantly over the last 10 years.
As failures in pre-2008 risk models were identified, they were often attributed to risk systems that used overly simplified views of market data, ignored important risk factors, or assumed liquidity in price factors that was not always present, he said.
The industry has brought very significant efforts to bear to resolve both of these issues, he added.
“We are seeing a much more ‘joined up’ approach to procurement and use of market data, where common repositories are shared across the front and middle office to ensure that the middle office risk calculations are consistent with the front office,” he said.
“Regulations such as FRTB have put a particular spotlight onto this issue, with the Profit and Loss Attribution test showing just how challenging it is to achieve strong alignment,” he said.
At the same time, many areas of software have pivoted from installed solutions to cloud-delivered services, Smith added.
While financial services face specific challenges in moving in this direction, the same transition is occurring, albeit at a slower pace, he stressed.
According to Smith, drivers such as UMR (Uncleared Margin Rules), where central service providers have dominated the later roll-out phases, show that this transition is well underway.
“It has become standard for those solutions to come with an embedded market data model, reducing the integration costs for clients onboarding onto those services, and is a strong feature of the new class of Risk as a Service providers,” he said.
“This transition has enlarged the market for data providers to sell into where firms will often now pay for data which they use on-premise, and also pay embedded costs for data within cloud services. This is traded off against the efficiency of the services they are purchasing and the power of network models validating a single set of data,” Smith commented.
Choosing market-data providers & costs
Experts agree that some of the largest and most established financial technology platforms are also the most popular and widely used market data providers.
As has consistently been the case over time, large vendors dominate the industry, according to Jack Garceau, CTO, JonesTrading. He added that niche players are being introduced to cover additional venues and asset classes such as crypto. “There is a bit of an inverse proportion, if not a paradox, in that while execution and commission costs continue to compress, market data costs continue to rise,” he said.
“This is likely directly tied to the importance of market data to industry participants. As a result, there is increasing hybrid use of the large and niche models to reduce overhead and gain efficiency,” Garceau said.
Short argued that many traders look for coverage that is both deep and wide, with the number of stocks covered and the data points on those stocks being very important. She added that global coverage is becoming increasingly important.
“Traders rely on data suppliers that have a reputation for accuracy and diligence,” she said.
Traders tend to use a market data terminal such as Bloomberg, FactSet, Refinitiv, Activ, ICE, Xignite or S&P Global, according to Short and Garceau.
“They use this along with other bespoke data sets such as Wall Street Horizon to get an edge on competitors,” Short said.
She added that responsive and supportive customer service also ranks high on the list of must-haves, which can be vetted during the data testing process.
“Some dedicate in-house vetting teams; some outsource this task. Some look to the data supplier to be the expert on their own data but also helpful in advising where else to obtain data,” Short said.
As trading decisions are always information-based, according to Garceau, “fast and accurate data is essential for all participants”. He said that the data feeds themselves need to power trade cost analytics through the entire order life cycle (pre-trade, working orders, post-trade), adding that traders are increasingly incorporating market data through the adoption of AI. “From this, traders gain a deeper understanding of how and where orders should be managed,” he said.
When choosing market-data providers, the most important decision criteria are reliability, latency and scalability, Garceau said.
“Obviously, covering the regions and asset classes that the trader is in is essential. Less obvious is the ability to power multiple applications. And finally cost,” he noted.
Forster added that ideally traders and their firms are choosing market data providers based on their actual usage needs balanced by budgets and resources.
He said that some may choose to consume exchange data feeds, incurring server, connectivity, development and other costs, in exchange for more flexibility and control with potentially lower latency.
Others may onboard market data aggregators who have preexisting servers and connectivity in place and available via API for quick setup, he added.
Forster further said that some will have a mix, depending on asset class, geography or usage (think historical/reference for TCA vs low-latency real-time for more active trading). Cyber security, front and center of every CEO and risk officer’s mind, customer & technical support, and overall reliability are also important factors.
“Whichever delivery method(s) they choose, traders should be mindful that – like most other things in life – you get what you pay for, and quality costs. Lower-cost solutions often lack the overall quality, flexibility and scalability generally deemed appropriate for institutional trading. Cost-effective delivery is more desirable than low-cost delivery. Good, fast, cheap – pick two,” he stressed.
Short said that even before the current bear market, investment firms have been upgrading their infrastructure and data purchases in order to compete in an environment where alpha decays quickly and risk is always looming. “Volatile markets have only accelerated that process as it’s harder to get a leg up on one’s peers,” she said.
In 1998, according to the Tower Group, a Boston-based research firm, firms spent more than 15% of their total technology budgets on market-data services and trading technologies. In 2022, according to recent Coalition Greenwich research, over 90% of firms are planning to increase funding of ESG data over the next three years. “Without appropriate systems, processes and procedures, ESG-related information can become noise that does not translate into actionable data,” commented Forster.
As discussed in Coalition Greenwich’s “Alt Data for Investing: Not So Alternative Anymore”, while alt data (including ESG) originally was procured separately from traditional market data, vendors have expanded into this market and are developing offerings and marketplaces tailored to this new segment.
“Thus, while many asset owners and asset managers consider their current investment process and research setup sufficient, increasing market complexity is producing new urgency for a more modernized architecture and approach to systems,” Forster said.
To stay competitive, firms must continue to invest in fast, reliable data, commented Garceau.
“Co-location, faster telco, and the use of satellite delivery continue to drive investment,” he said.
Given its expanding use in trading, analytics and compliance, Garceau said that the market data industry continues to mature and innovate as trading becomes more nuanced.
“On the higher level, we can expect consolidation through merging venues. And we anticipate that costs will continue to increase as demand remains high,” he said.
As Coalition Greenwich first flagged in their “Top Market Structure Trends to Watch in 2022” report, the market data industry was set to evolve quickly this year. “It certainly has, and we believe it will continue to do so in the coming year as well,” Forster said.
He added that unique decentralized finance (DeFi) challengers continue to gain traction, and the demand for ESG and alt data continue to increase each month.
A recent Coalition Greenwich report showed cloud services are becoming ubiquitous for market data delivery by institutions: 93% of the exchanges, trading systems and data providers interviewed offer cloud-based data and services, while 67% of the sell-side consumes cloud-deployed data and 88% intend to consume more.
“While institutions are widely adopting the cloud for market-data distribution and consumption, our research also points to further demand for public cloud,” Forster said.
“Exchanges, trading systems and data providers are prioritizing public cloud for internal data transformation and insights, though additional public cloud use cases are also forthcoming,” he added.
Forster also mentioned that the SEC will play a large role in (re)shaping the market data industry as it works through the new SIP and competing consolidator plans.
“One thing we can count on though is the continuous evolution amongst the various participants and factions as new technologies and offerings arise to meet the ever-growing demand of the institutional investment community,” he said.
The broader adoption of new types of data across the investment industry bodes well for the data providers, commented Short.
While it has taken longer for certain players to get up to speed, specifically on the fundamental side, that is starting to happen and will likely accelerate if the current bear market holds, she said.
“For data providers the bar is just going to get higher as buyers demand clean and accurate data, and as competition for market share increases,” she noted.
Short added that specialized data sets must increasingly be studied by independent academics alongside the data providers’ own research. “Identifying use cases and/or strategies for alpha or risk must be demonstrably shown in order for investment firms to determine if the data is applicable to their own research,” she said.
Data providers must also offer ongoing access for both content and technical questions or suggestions, she added.
“Investment firms don’t want to fact check every detail once they have determined the data works for them. They also require and must be assured of no interruptions in accessing the data,” concluded Short.