Make It Work, Make It Right (fast, easy, secure), and Make It Last

In my comment to the SEC last year, I recommended requiring the exchanges to spin off their data and technology businesses. Doing so would limit their economies of scope and push them to refocus on economies of scale in the listing and trading business. The fairness of this mutually exclusive approach rests on trusting that the market can best determine the worth of these data and technology businesses.

It takes guts to implement such a revolutionary approach. So, in pursuit of a different method of solving the market data and access challenges, I developed a new suite of patent-pending solutions that addresses the latency and content differentials between the SIP’s core data and the exchanges’ proprietary feeds. I call this the “audio compression and time-lock encryption” approach. It would ensure that the SIP’s core data evolves along with the broader market ecosystem. The following describes how it works, why it is the right solution, and how it will support the sustainable development of capital markets.

First of all, in adopting Regulation NMS (per 70 FR 37567 in 2005), the Commission stated that “adopted Rule 603(a) prohibits an SRO or broker-dealer from transmitting data to a vendor or user any sooner than it transmits the data to a Network processor.” Then, in Order 67857 in 2012, the Commission stated that “exchanges have an obligation under Rule 603(a) to take reasonable steps to ensure—through system architecture, monitoring, or otherwise—that they release data relating to current best-priced quotations and trades through proprietary feeds no sooner than they release data to the Network Processor, including during periods of heavy trading.” I think these interpretations of 17 CFR §242.603(a) are incomplete and require clarification or appropriate updates.

As highlighted in TABB Group’s market data revenue analysis, “faster access” is not only about accelerating the SIP’s processing speed; there are also issues with the current aggregation distance.

“Transmitting or releasing data no sooner than to a Network processor (SIP)” describes only one aspect of the “fair and reasonable” and “not unreasonably discriminatory” principles required by Reg. NMS. It omits the fact that market data is highly valuable (it reflects the price discovery created by exchanges) and requires proper security protection. Hence, the secure delivery (in motion and at rest) and timely retrieval of data are equally important. So, instead of introducing greater competitive forces into the “dissemination” of core data, which would result in multiple best prices confusing the market, the SEC should mandate the use of time-lock encryption (see this for the general concept, and rest assured this is not another speed bump). It would allow proprietary feeds and SIP consolidated data to be “available” securely in synchronized time.

Time-lock encryption is a method of encrypting data such that it can only be decrypted after a certain deadline has passed. The goal is to protect data from being decrypted prematurely. There are various ways to build time-lock encryption for different protection requirements. The architecture needs precise time calibration against an independent atomic clock source, such as NIST’s. Moreover, we do not want to shift the bottleneck into an arms race of high-performance computers trying to decrypt data early. Hence, computational resources and the type of data content must also be considered in the design of a reliable encryption scheme.
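To make the concept concrete, here is a minimal Python sketch of one possible release model: the ciphertext circulates freely, while a key agent, checking an independent time source, refuses to hand out the decryption key until the embargo passes. The class names and the toy keystream cipher are hypothetical illustrations, not a specific scheme I am proposing.

```python
# Illustrative sketch of a time-lock release model: ciphertext circulates freely,
# but the symmetric key is held by a key agent that only releases it once an
# authoritative clock (e.g., an NIST-synchronized source) passes the deadline.
# The KeyAgent name and the toy keystream cipher are hypothetical.
import hashlib
import secrets
import time

def keystream_encrypt(key: bytes, data: bytes) -> bytes:
    """Toy SHA-256 counter-mode keystream; stands in for a real cipher."""
    out = bytearray()
    for offset in range(0, len(data), 32):
        pad = hashlib.sha256(key + offset.to_bytes(8, "big")).digest()
        chunk = data[offset:offset + 32]
        out.extend(b ^ p for b, p in zip(chunk, pad))
    return bytes(out)

class KeyAgent:
    """Holds the key and releases it only after the embargo time has passed."""
    def __init__(self, key: bytes, release_epoch: float):
        self._key = key
        self._release_epoch = release_epoch

    def request_key(self, trusted_now: float) -> bytes:
        # trusted_now should come from an independent, calibrated time source,
        # not from the requester's own clock.
        if trusted_now < self._release_epoch:
            raise PermissionError("embargoed: key not yet released")
        return self._key

# Usage: encrypt a quote update and embargo the key for 500 microseconds.
key = secrets.token_bytes(32)
agent = KeyAgent(key, release_epoch=time.time() + 0.0005)
ciphertext = keystream_encrypt(key, b"BID 100.01 x 300 | ASK 100.02 x 500")

time.sleep(0.001)                       # the deadline passes
plaintext = keystream_encrypt(agent.request_key(time.time()), ciphertext)
print(plaintext)
```

Releasing a key against a trusted deadline, rather than relying on a compute-bound puzzle, is one way to avoid the decryption arms race mentioned above.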

In terms of the SIP’s data content, I agree that depth-of-book information is needed. However, from a technical perspective, too much information will drag down the SIP’s processing time. Slowing the SIP’s processing would mean delaying the availability of proprietary feeds, if everything were tied to a synchronized decryption mode. Therefore, the right choice is to balance processing speed against content richness. Quantum computing may one day fully address the aggregation distance/location differential issues; however, it may inadvertently raise new concerns, such as the lack of an audit trail. For that reason, I looked for clues in the stories of MP3, Napster, and the iPod to see how the audio sector achieved this kind of optimization.

I found that MP3 is indeed a lossy compression format, yet human ears can barely distinguish it from lossless music. Lossy methods yield a substantially greater compression ratio (60% or more of the original stream) compared with traditional techniques (only 5-20%) that exploit statistical redundancy, Huffman coding, or other probabilistic methods to represent and compress market data.
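To illustrate why lossy schemes can reach far higher ratios, the toy Python comparison below compresses a synthetic price stream losslessly with zlib (LZ77 plus Huffman coding) and again after quantizing prices to whole ticks, the way MP3 discards detail the ear cannot hear. The data and the resulting percentages are generated for illustration and are not the figures cited above.

```python
# A toy comparison of lossless vs lossy compression on a synthetic quote stream.
import random
import struct
import zlib

random.seed(7)

# Synthetic mid-price path encoded as 8-byte doubles (the "raw feed").
prices = [100.0]
for _ in range(9999):
    prices.append(prices[-1] + random.gauss(0, 0.01))
raw = b"".join(struct.pack(">d", p) for p in prices)

# Lossless: zlib (LZ77 + Huffman coding) keeps every bit of the original.
lossless = zlib.compress(raw, level=9)

# Lossy: quantize each price to a 2-byte tick offset from the first price,
# deliberately discarding sub-tick precision before compressing.
tick = 0.01
quantized = b"".join(
    struct.pack(">h", round((p - prices[0]) / tick)) for p in prices
)
lossy = zlib.compress(quantized, level=9)

print(f"raw:      {len(raw):>7} bytes")
print(f"lossless: {len(lossless):>7} bytes ({1 - len(lossless)/len(raw):.0%} saved)")
print(f"lossy:    {len(lossy):>7} bytes ({1 - len(lossy)/len(raw):.0%} saved)")
```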

My patent-pending methods reduce data storage and boost the efficiency of data distribution, while also enabling the replication of depth-of-book information (relative strengths of bid/ask prices and the steepness of the price curve). I acknowledge that the buy-side also wants the SIP to include all the odd-lot details, despite some hidden costs for high-priced stocks. My response is: when we are in the midst of systemic reform, asking for too much or insisting on “complete” transparency may indeed be detrimental to price discovery and the sustainable development of a healthy market. We do want to strike the appropriate balance in order to avoid a “no fish can survive when the water is too clear” situation. Given that, we will preserve the richness of the content as best we can, while making the tool fast, easy, secure, and fit for the effective monitoring of trading activity.
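For illustration only, and not my patent-pending method itself, the generic sketch below collapses a five-level book into three descriptors that preserve the relative bid/ask strength and the steepness of the price curve; the names, level counts, and quotes are invented for the example.

```python
# A generic illustration of condensing a depth-of-book snapshot into a few
# descriptors: bid/ask imbalance and the steepness of the cumulative-depth
# curve on each side of the book.
from dataclasses import dataclass

@dataclass
class BookSummary:
    imbalance: float     # (bid depth - ask depth) / total depth, in [-1, 1]
    bid_slope: float     # shares added per dollar away from the best bid
    ask_slope: float     # shares added per dollar away from the best ask

def summarize(bids, asks) -> BookSummary:
    """bids/asks: lists of (price, size), best price first."""
    bid_depth = sum(size for _, size in bids)
    ask_depth = sum(size for _, size in asks)
    imbalance = (bid_depth - ask_depth) / (bid_depth + ask_depth)

    def slope(levels):
        # Cumulative size divided by the price span from the top of book.
        span = abs(levels[-1][0] - levels[0][0]) or 1e-9
        return sum(size for _, size in levels) / span

    return BookSummary(imbalance, slope(bids), slope(asks))

# Usage: five levels per side collapse into three numbers.
bids = [(100.00, 300), (99.99, 500), (99.98, 800), (99.97, 400), (99.96, 900)]
asks = [(100.01, 200), (100.02, 300), (100.03, 700), (100.04, 600), (100.05, 500)]
print(summarize(bids, asks))
```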

Since a thousand trades can occur within the +/-50 millisecond clock-offset tolerance allowed by the Consolidated Audit Trail (CAT), the inexactitude in trade sequencing would cause analytic results based on vector measurements or visualized heat maps to be erroneous. To overcome this inherent imprecision, my suite of patent-pending inventions applies a “music plagiarism detection” method that achieves higher tolerance of the unsynchronized-clock issue and is capable of recognizing patterns more quickly (in as little as 50 milliseconds, compared with the hours, days, or even months a trade review can take). Aside from the accelerated speed in deciphering what is going on in the market, it produces fewer false positives and false negatives than traditional techniques. It makes the implementation of preventive controls in real time possible, and there are other benefits such as ease of trade reconstruction, order book replay simulation, backstop assurance, case management capabilities, crowd computing methods, and more.
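As a generic illustration of the alignment idea (again, not the patented method itself), the sketch below slides one venue’s per-millisecond trade counts against another’s within a +/-50 ms window and picks the lag with the highest correlation, much as audio-matching tools align two recordings before comparing them. The venues, timestamps, and 37 ms skew are invented for the example.

```python
# Generic clock-skew-tolerant alignment: find the lag within +/-50 ms that
# maximizes the correlation between two per-millisecond event-count series.
def best_lag(series_a, series_b, max_lag_bins):
    """series_*: event counts per 1 ms bin; returns (lag, score)."""
    def score(lag):
        pairs = [
            (series_a[i], series_b[i + lag])
            for i in range(len(series_a))
            if 0 <= i + lag < len(series_b)
        ]
        return sum(x * y for x, y in pairs)

    return max(
        ((lag, score(lag)) for lag in range(-max_lag_bins, max_lag_bins + 1)),
        key=lambda t: t[1],
    )

# Usage: venue B's clock runs 37 ms behind venue A's.
venue_a = [0] * 200
venue_b = [0] * 200
for t in (10, 55, 90, 120, 150):
    venue_a[t] = 1
    venue_b[t + 37] = 1

print(best_lag(venue_a, venue_b, max_lag_bins=50))   # -> (37, 5)
```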

Last but not least, Thomas Peterffy’s handheld computers transformed and digitized the marketplace over 36 years ago. Now we must anticipate changes, consider how the equity market should behave, and advance the NMS so that it will last for the next 30 years in upholding our market integrity standards. Policymakers are now given two recommendations:

(a)     Ask the exchanges to spin off their data and technology businesses at fair value. This separation would replace the undesirable outcomes of a distorted economy of scope with efficiency gains in capital formation.

(b)    Mandate the use of the “audio compression and time-lock encryption” approach, so as to allow proprietary feeds and SIP consolidated data to be available securely in synchronized time.

I trust the SEC will wisely consider these two ideas, which are not necessarily mutually exclusive. I have no suggestion to change the SROs’ governance of the SIP NMS plans, because FINRA and the other SROs should own the consequences of a market they designed jointly with the SEC. I welcome any questions, feedback, or constructive criticism.

In the next and final article of the series, I’ll talk about ‘best interest’ and the ultimate goals of market structure design, so stay tuned.
