Word for Word
Traders Magazine, November 2011
Matt Lavicka is a managing director at Goldman Sachs. He recently spoke out at the Securities Industry & Financial Markets Association's annual market structure conference about the high costs to the industry of high-frequency traders flooding the market with quote data.
>> On the unfairness of the cost burden
There has been this faster and faster race to react quickly to quotes and pump more quotes into exchanges. There is no end in sight. There is a certain element in the industry that is producing a lot of messages and not necessarily bearing the costs that result from that production. Then there is a broader universe of market participants that must consume all these messages, that has to worry about upgrading their feeds to support the latest bandwidth, and that has to worry about latency. It's a big cost to the industry to consume these kinds of messages.
>> On a solution
We need to do something. There has to be a re-aligning of incentives and disincentives. There has to be a better throttle mechanism to deal with the ever-increasing market data message rates.
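Lavicka doesn't specify what such a throttle would look like. One common rate-limiting design that illustrates the idea is a token bucket, which allows short bursts of messages but caps the sustained rate; the class name, parameters, and rates below are purely illustrative, not anything proposed in the interview.

```python
import time

class TokenBucket:
    """Illustrative message-rate throttle: permits bursts up to
    `capacity` messages, refilling at `rate` tokens per second.
    Messages that arrive with no token available are rejected."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate          # tokens added per second
        self.capacity = capacity  # maximum burst size
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        """Return True if one message may pass, consuming a token."""
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False
```

A participant pumping quotes faster than the configured rate would see submissions rejected (or, in a fee-based variant, surcharged) once the burst allowance is exhausted.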
>> On the problem of private exchange feeds
The NBBO and the production of market data is supposed to be an industry utility. It is mandated by the SEC. Essentially, it is a monopoly. The industry is obliged to consume it. It has to pay whatever price is necessary to get that data. The problem has to do with the competitive nature of the exchanges. Exchanges used to be non-profits. And they were responsible for production of this data. Now they are for-profit and they have various private market data feeds. And we have this misalignment of the private market data versus the SIP data feeds, or the NBBO. These issues will get in the way of us solving this. There needs to be the right incentives set up. It will take the SEC to make that happen.
>> On the threat to a limit up/limit down mechanism
Putting in limit up/limit down is crucial, but it can't happen without recognizing the importance of market data. If limit up/limit down gets behind... It's the old "garbage in, garbage out" problem because of an overwhelming amount of market data being produced. The NBBO and the last trade are [supposed to be] accurate and a reflection of where the market is trading. But in fact that is not always the case. We should put more effort into detecting and making sure the industry has the ability to detect when that is not the case. That way we won't have these issues about who withdrew from the market and who can't get to... Why are the market participants doing things like having to ping consolidators? And checking one versus another? The originators of the trades should be recording time stamps and disseminating them so the consumers can actually see if there is latency in the data. There are a lot of important things we need to do to shore up the infrastructure to make sure the market data is accurate.
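The timestamp scheme Lavicka describes can be sketched simply: if the originator stamps each quote at publication and the consumer stamps it on arrival, the difference is the observed feed latency. The function names, the 50-millisecond threshold, and the clock-synchronization assumption below are illustrative, not part of his proposal.

```python
from datetime import datetime, timedelta, timezone

def feed_latency_ms(origin_ts: datetime, recv_ts: datetime) -> float:
    """One-way feed latency in milliseconds, computed from the
    originator's publication timestamp and the consumer's receive
    timestamp. Assumes both clocks are synchronized."""
    return (recv_ts - origin_ts).total_seconds() * 1000.0

def is_stale(origin_ts: datetime, recv_ts: datetime,
             threshold_ms: float = 50.0) -> bool:
    """Flag a quote as stale if it took longer than `threshold_ms`
    to arrive -- a signal the consolidated feed may be lagging."""
    return feed_latency_ms(origin_ts, recv_ts) > threshold_ms
```

With originator timestamps disseminated in the feed itself, a consumer could run this check per message instead of pinging consolidators and cross-checking one feed against another.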