How to Avoid Market Data Mayhem

Traders Magazine Online News, July 6, 2017

Jay Patani

On Monday evening, in a bizarre turn of events, Apple's stock appeared to be down 14 percent, Google plummeted by 86 percent and Microsoft shot up 79 percent. The one thing in common? They were all trading at $123.47.

The actual prices of the stocks, listed on the Nasdaq stock exchange, remained unaffected and no trades could be made at this price. Nonetheless, this kind of incident is a red flag moment, signaling that market data processes are not being adequately monitored.

What happened? A Nasdaq spokesperson quickly stated that it "is investigating the improper use of test data distributed by the UTP [unlisted trading privileges] and consumed by third parties." This suggests the data providers may bear much of the blame for the incident.

Wherever the fault lies and whoever is to blame, this mishap reminds us how interconnected systems are in the financial world. One small pricing mistake at an exchange can ripple out of control to market participants across the world.

Nor is this an isolated incident. There have been numerous trading glitches in recent years. In August 2013, Nasdaq ground to a halt for three hours after distributing the wrong price quotes on its stocks. In July 2015, a technical glitch closed the New York Stock Exchange for nearly four hours.

Luckily, Nasdaq's mishap doesn't seem to have caused immediate damage. However, events like these always erode reputation. In electronic trading, access to market data is crucial: the integrity, quality and latency of prices can make or break trading strategies. Though not the sexiest part of trading, adequate monitoring and analytics on market data can optimise clients' execution performance and prevent embarrassing mistakes.

Exchanges and their data providers should routinely assess the quality of the data feeds and be able to answer the following questions at any given time:

  • Are any of my market data sources slow or running out of sequence?
  • What is the latency of my pricing feeds?
  • Have there been any inexplicable price spikes or plummets?
  • Are there any gaps in the pricing?
  • Are there any incongruities between pricing across different channels?
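The first three checks above can run continuously against a live feed. As a minimal sketch (the `Tick` structure, field names and thresholds here are illustrative assumptions, not any exchange's actual feed format), a per-feed monitor might look like this:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Tick:
    """One quote from a pricing feed (hypothetical structure)."""
    seq: int          # feed sequence number
    send_ts: float    # timestamp stamped by the source, in seconds
    recv_ts: float    # timestamp when we received the tick, in seconds
    price: float

class FeedMonitor:
    """Flags slow, out-of-sequence, or wildly moving ticks on one feed."""

    def __init__(self, max_latency: float = 0.5, max_move: float = 0.10):
        self.max_latency = max_latency  # seconds before a tick counts as slow
        self.max_move = max_move        # fractional move that counts as a spike
        self.last_seq: Optional[int] = None
        self.last_price: Optional[float] = None

    def check(self, tick: Tick) -> list:
        alerts = []
        # Out-of-sequence or gapped messages
        if self.last_seq is not None and tick.seq != self.last_seq + 1:
            alerts.append(f"sequence gap: expected {self.last_seq + 1}, got {tick.seq}")
        # Latency of the pricing feed
        latency = tick.recv_ts - tick.send_ts
        if latency > self.max_latency:
            alerts.append(f"slow tick: latency {latency:.3f}s")
        # Inexplicable price spikes or plummets
        if self.last_price is not None:
            move = abs(tick.price - self.last_price) / self.last_price
            if move > self.max_move:
                alerts.append(f"price spike: {move:.1%} move")
        self.last_seq = tick.seq
        self.last_price = tick.price
        return alerts
```

Feeding it a healthy tick followed by a gapped, delayed tick at a suspicious price would raise all three alerts at once, which is roughly the signature of an incident like the one described above.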

If an exchange has real-time visibility into these questions, it is in a good position. However, firms that only measure the quality of the delivery process may miss larger irregularities. Everything may be fine in the pipeline distributing prices, but what if the prices themselves suddenly started to go crazy?


Firms therefore also need to analyse the content of the market data feeds to spot abnormal price fluctuations and unexpected differences between the bid and offer prices. Using real-time processing, firms can let continuous queries do all the hard work instead of relying on manual post-trade processing.
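A content-level check of this kind can be sketched in a few lines. Assuming a trusted reference price is available (for example, a rolling median of recent trades), and treating all thresholds as illustrative rather than production values:

```python
def quote_alerts(bid: float, ask: float, ref_price: float,
                 max_spread_frac: float = 0.05,
                 max_deviation: float = 0.10) -> list:
    """Sanity-check the content of a single bid/offer quote.

    ref_price is a hypothetical trusted reference, e.g. a rolling
    median of recent trade prices for the instrument.
    """
    alerts = []
    # A bid at or above the ask is a crossed/locked market
    if bid >= ask:
        alerts.append(f"crossed market: bid {bid} >= ask {ask}")
    # Unexpectedly wide bid-offer spread
    spread = (ask - bid) / ref_price
    if spread > max_spread_frac:
        alerts.append(f"abnormal spread: {spread:.1%} of reference price")
    # Mid-price far from the trusted reference
    mid = (bid + ask) / 2
    deviation = abs(mid - ref_price) / ref_price
    if deviation > max_deviation:
        alerts.append(f"mid {mid} deviates {deviation:.1%} from reference {ref_price}")
    return alerts
```

A quote of $123.40/$123.50 against a reference price near $150 would trip the deviation check immediately, even though the feed itself was delivered on time and in sequence.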

We still don’t know how Nasdaq’s test data made its way onto so many reputable news websites. It’s likely that at some point in the chain, human error was the cause.  But two things are for sure: this event was entirely avoidable, and stricter, more comprehensive monitoring processes are the solution.
