FLASHBACK FRIDAY: Differences in Data Delivery Are Fair

It's amazing what a difference just a few years can make.

Back in 2012, the equities market was debating just what was fair when it came to data feeds, delivery, the Securities Information Processors (SIPs) and the exchanges that ran them. Back then, market data feed speed wasn't as crucial to as many participants as it is today. The exchanges offered the SIP to everyone, and it was adequate for many. For those who required data faster, proprietary data feeds were available at nominal costs, which at the time seemed justified.

Fast forward to 2018. The debate over data delivery and cost has escalated. Brokers and the buy-side claim the SIPs are outdated and in need of modernization. Many say they are forced to purchase higher-tier data from the exchanges in order to stay competitive. And the exchanges, aware of this, keep charging ever-increasing amounts for the upgraded data feeds.

The exchanges, whose business model has moved from making money on trading to charging for data and other related technology, are accused of charging too much for the services they offer. In a recent Traders Magazine poll, 89 percent of those surveyed said the exchanges were not justified in charging what they do for data when compared to the costs associated with generating it. The exchanges contend that they face increasing costs in meeting the market's ever-increasing appetite for more data, delivered more quickly.

So, what's to be done? Who is right? Who is wrong? Who gets hurt? Who benefits? How did the market get here?

The Securities and Exchange Commission recently stepped into the heated debate, holding two days of meetings and panels at which the buy- and sell-side, along with the exchanges and others, took a closer look into data delivery, generation and costs. This followed the regulator's decision to set aside NYSE's and Nasdaq's depth-of-book fees as of October 16.

The fees falling under the judgment are those that Nasdaq and NYSE Arca put in place on September 15, 2010, and November 9, 2010, respectively.

"The Commission's decision rests on the exchange operators' failure to meet their burden to demonstrate that the fees are fair and reasonable and not unreasonably discriminatory," wrote the authors of the decision. "We do not, by our findings here, conclude that the fees are not fair and reasonable. Rather, the factual record submitted and the theories based on the record put forward by the exchanges are insufficient to support a finding that the fees at issue meet the statutory test."

The decision marks a change from the 2016 decision by Administrative Law Judge Brenda Murray, who ruled that there was a competitive environment between Nasdaq's and NYSE's depth-of-book offerings and that SIFMA failed to prove that depth-of-book data is a need rather than a want.

"Today we also remanded hundreds of similar requests seeking to increase the price of information necessary to participate in America's stock markets," noted SEC Commissioner Robert Jackson in a prepared statement. "The exchanges will now have the opportunity to review those requests under the standards articulated in today's landmark opinion, which requires exchanges to be prepared to show us that any price increases are justified by competition rather than the exchanges' market power."

“This pragmatic ruling by the SEC indicates increasing recognition by policymakers that the fee structure for proprietary market data products is broken,” added Melissa MacGregor, managing director and associate general counsel at SIFMA, in a prepared statement. “As noted in the unanimous decision, the exchanges fail to meet their burden to demonstrate that the fees are fair and reasonable and not unreasonably discriminatory as required under current law. Today’s decision should prompt further examination of policy reforms to ensure the efficiency of public market data feeds and fairness of fees.”

At the two-day SEC confab on fees, Oliver Albers, Global Head of Strategic Partnerships for Nasdaq Global Information Services, noted that 97% of equity trades occur at or within the National Best Bid and Offer (NBBO), which the SIP displays. The SIP has seen vast improvements over the past few decades, said Albers, who cited sizable reductions in latency, a more than 20-fold increase in message-traffic capacity, and 96% lower costs for Main Street investors.

"There are many market participants, all with different data needs," Albers said at the roundtable. "We provide choice to make it possible for all investors to consume data."

As a key takeaway on the topic, Albers said a simplistic view of core vs. non-core, slow vs. fast, or public vs. private is misleading.

Matt Billings, Managing Director of Market Data Strategy at TD Ameritrade, noted that access to quality market data is critical for retail investors. The SIP can provide adequate top-of-book price quotes, but using the SIP entails hurdles such as a cumbersome onboarding process and high overhead costs.

Billings disputed some industry data showing huge drops in SIP prices over the years, as those figures likely do not factor in connectivity and other indirect costs. "At what point do retail investors move away from SIP to private data feeds?" Billings asked.

Exchanges have additional costs associated with the SIP beyond just producing data, according to Michael Blaugrund, Head of Equities at NYSE. Exchanges haven't disclosed the direct cost of the SIP due to concern about misperception of the number, he said.

Charles Schwab Corp.'s view is that the SEC should mandate that exchanges provide depth-of-book prices on the SIP feed, not just top of book. Exchanges won't do this voluntarily because it would cannibalize their proprietary data products.

That's according to Jeff Brown, SVP of Legislative and Regulatory Affairs for Schwab, who likened the SIP to a car designed in the 1980s. Would anyone buy a 1980s car?

"As a retail investment firm, we don't think SIP is all you need," Brown said. It is valuable, but it can be upgraded and made relevant again by adding depth of book.

An institutional buy-side perspective was offered by Simon Emrich of Norges Bank Investment Management. Emrich said the firm's traders and risk managers still utilize the SIP feed, but use cases for SIP data have decreased substantially.

"Traders need to see the full book of data to make the right decisions, not just top of book," Emrich said. And with regard to transaction cost analysis (TCA) that takes place post-trade, using SIP is no longer sufficient to evaluate best-execution obligations.

Stay tuned as this debate continues…

The following article was first published in the November 2012 edition of Traders Magazine.

Differences in Data Delivery Are Fair

By Peter Chapman

You get what you pay for.

In September, the Securities and Exchange Commission fined NYSE Euronext $5 million for disseminating New York Stock Exchange market data to paying customers before it went out over the public feed. In the wake of that event, exchange executives noted that the public feed is naturally slower and that those wanting their data faster expect to pay up.

“The speed of the SIPs has increased tremendously,” Ed Provost, chief business development officer at the Chicago Board Options Exchange, said at the annual conference of the Security Traders Association in Washington. “But there’s still a difference between the direct feeds and the SIPs, and obviously, people are willing to pay for that. They think it’s worth it.”

NYSE Euronext and Nasdaq OMX Group are designated as the Securities Information Processors, or SIPs, which aggregate all data from exchanges and disseminate it to brokers.

Most of the exchanges also offer their own data on a proprietary basis to their customers. These feeds cost more, but the data arrives sooner. The latency differential is estimated at between 500 microseconds and a millisecond.

The vast majority of broker-dealers (an estimated 80 percent, including the largest) pay for proprietary data feeds.

“A variety of firms take the direct feeds,” noted Joe Mecane, an NYSE Euronext executive vice president, at the confab. “It’s not just the proprietary [trading] firms. All the major sellside participants take the direct feeds for one purpose or another. It’s largely a matter of choice for the industry.”

Exchange executives noted that the latency differential between public and private feeds is due to the number of “hops,” or aggregation stages necessary when consolidating data for the public feed. “The processing time alone is going to add some degree of latency to the system,” said Eric Noll, a Nasdaq executive vice president in charge of transaction services.

Noll added that it is not just speed that makes the proprietary data more attractive to some players.

“The direct feeds are much richer,” he said at the STA conference. “There’s much, much more information. And it’s not just to the advantage of the prop trading firms, but to all broker-dealers who take those direct feeds. They want that richer data set.”

In any event, the 20 percent of brokerage firms not receiving the proprietary feeds are not necessarily at a disadvantage, according to one exchange exec.

“The other 20 percent may not have a need for [the proprietary feeds],” noted Gary Katz, president and chief executive of the International Securities Exchange. “They’re not running automated systems that generate orders.”

In fact, Katz said, “they may be feeding websites used by retail customers and they don’t want to pay the extra money for that sub-millisecond advantage. We are helping them by maintaining a lower cost base. It really depends on the need for the data and what it’s being used for.”