Clock Synchronization – A Challenge That Needs a New Solution

March 15, 2017 is the next milestone target date for the U.S. Securities and Exchange Commission's (SEC) Consolidated Audit Trail (CAT) project – the world's largest data repository. By then, Self-Regulatory Organizations (SROs) and Broker-dealers (BDs) must comply with Rule 6820 (the Rule) to synchronize their clocks to within milliseconds of the atomic clock of the National Institute of Standards and Technology (NIST). Sadly, the CAT's billion-dollar budget does not cover firms that want to upgrade their NTP (Network Time Protocol) servers to a GPS (Global Positioning System) time source for better clock accuracy. This paper aims to uncover a new solution to this clock synch challenge and minimize the industry's burden.

(1) Rationales for pushing extreme clock synch precision

Clock synch is a huge issue because the usefulness of CAT data depends on the ability to sequence trade activities on a play-by-play basis, unveiling the intelligence needed to address events like the flash crash. If one player's clock runs 50 milliseconds fast while another's runs 50 milliseconds slow relative to NIST and/or an exchange's server, a lot can happen (including unfair advantages) within that 100-millisecond total time difference. So people in risk analytics advocate for extreme precision in clock synch wherever possible.
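To make the arithmetic concrete, here is a minimal sketch (the event times and clock offsets are hypothetical, chosen only for illustration) of how two firms that each satisfy a 50-millisecond tolerance can still mis-order events that are 40 milliseconds apart:

```python
# Illustrative only: two clocks, each within a +/-50 ms tolerance of NIST,
# can disagree by up to 100 ms and reverse the apparent order of events.

TOLERANCE_MS = 50  # per-firm maximum divergence from NIST

def reported_time(true_time_ms: float, clock_offset_ms: float) -> float:
    """Timestamp a firm would record, given its clock's offset from NIST."""
    return true_time_ms + clock_offset_ms

# Firm A's clock runs 50 ms fast; Firm B's runs 50 ms slow.
event_a = reported_time(true_time_ms=1000.0, clock_offset_ms=+TOLERANCE_MS)
event_b = reported_time(true_time_ms=1040.0, clock_offset_ms=-TOLERANCE_MS)

# Event B truly happened 40 ms AFTER event A, yet its timestamp is earlier.
print(f"A reports {event_a} ms, B reports {event_b} ms")
print("Apparent order reversed:", event_b < event_a)  # True
```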

(2) Are the CAT clock synch requirements reasonable?

Per an empirical report – International Comparison of NTP Servers – most NTP servers' clocks seem fine synching with NIST, unless one really gets technical about the occasional out-of-synch causes. The 50-millisecond maximum divergence is not unattainable; it is already more lenient than Europe's MiFID II RTS 25 requirements.
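As a rough self-check against that 50-millisecond band, a firm could poll an NTP source and inspect the computed offset. A minimal sketch using the third-party ntplib package; the server hostname and single-poll approach are simplifications (a production check would poll repeatedly, use multiple servers, and handle failures):

```python
import ntplib  # third-party: pip install ntplib

MAX_DIVERGENCE_S = 0.050  # 50 ms tolerance under the Rule

def check_offset(server: str = "time.nist.gov") -> None:
    """Poll an NTP server once and report the local clock's offset."""
    response = ntplib.NTPClient().request(server, version=3, timeout=2)
    offset_ms = response.offset * 1000.0  # seconds -> milliseconds
    within = abs(response.offset) <= MAX_DIVERGENCE_S
    print(f"offset from {server}: {offset_ms:+.3f} ms "
          f"({'within' if within else 'OUTSIDE'} 50 ms tolerance)")

if __name__ == "__main__":
    check_offset()
```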

The only trouble I have is with manual orders. The prescribed one-second tolerance limit may be artificial, because it probably takes more than a second to measure and record a time manually. Yet one should consider whether a longer tolerance limit would introduce too much noise and/or overly distorted signals for market surveillance and manipulation detection purposes. The bottom line is: any timestamp granularity requirement must be reasonably enforceable, or else the standard would be meaningless.

(3) Out-of-synch causes and how strict the enforcement should be

I am not aware of any SEC guidance on whether firms should adjust their business clocks for cumulative leap-second differences. Other out-of-synch causes include non-reciprocity of the paths taken by NTP packets. Given that capital markets commonly use sophisticated routing algorithms to control packet lag at the router and on the next leg of the route (e.g., IEX's 350-microsecond delay / speed bump), delays can differ significantly in each direction, and I foresee challenges for SEC analysts in catering for all the routing nuances proliferated by Reg NMS. On top of that, there are unknowns such as traffic congestion and Ethernet link availability during severe conditions (e.g., cyberattack, power outage). It will therefore be hard for the SEC / plan processor to strictly enforce clock synch requirements under such circumstances. In fact, I think SROs should be held to a higher standard: they should establish a system of traceability to NIST, and its compliance should be reviewed at least annually.
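To see why path non-reciprocity matters, recall that the standard NTP offset calculation assumes the outbound and return trips take equal time; any asymmetry biases the computed offset by half the difference. A minimal sketch with hypothetical timestamps:

```python
def ntp_offset_and_delay(t0, t1, t2, t3):
    """Standard NTP clock math.
    t0: client send, t1: server receive, t2: server send, t3: client receive.
    The offset formula assumes symmetric paths; asymmetry biases the result
    by half the one-way difference."""
    offset = ((t1 - t0) + (t2 - t3)) / 2.0
    round_trip = (t3 - t0) - (t2 - t1)
    return offset, round_trip

# Hypothetical: client and server clocks actually agree (true offset = 0),
# but the outbound path takes 30 ms while the return path takes only 10 ms.
t0 = 0.0          # client sends request
t1 = t0 + 0.030   # server receives after 30 ms outbound
t2 = t1 + 0.001   # server replies 1 ms later
t3 = t2 + 0.010   # client receives after 10 ms return

offset, rtt = ntp_offset_and_delay(t0, t1, t2, t3)
print(f"computed offset: {offset*1000:+.1f} ms (true offset is 0 ms)")
print(f"round trip: {rtt*1000:.1f} ms")  # bias = (30 - 10) / 2 = 10 ms
```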

(4) What it's like to achieve sub-microsecond performance

Per FCA paper #16, I concur that the merits of HFT in a market depend on its character. In my opinion, Thesys Technologies (the winner of the CAT bid and an affiliate of HFT firm Tradeworx), which built the MIDAS system for the SEC, is a creditable example of using ultra-fast technology. As for other HFT firms: could someone use HFT for a manipulative just-in-time process (i.e., sell first and then buy back at lightning speed)? Would the research on quantum computing by Goldman Sachs, RBS, CME, and Guggenheim be an effective way to cope with flash crashes and their consequences? No one knows at this point whether a well-intended tech development may inadvertently become a threat to financial stability.

Prior to the approval of CAT, I said the CAT was bad because I despise locking valuable data in a centralized vault, and because it lacks real-time scrutiny of the markets. Regulatory access to CAT data at T+5 days simply won't be fast enough to catch rogue traders who operate in nanoseconds. Should regulators slow down the industry's pursuit of speed? That is like asking why the U.S. doesn't have freeways with no speed limit (i.e., a moot point).

(5) Is it necessary to perfect the exactitude of clock synch?

It is not wrong for the SEC and other regulators around the world to encourage SROs and BDs to improve their risk management capabilities. Risk and compliance functions need to match up with the super-fast trading algorithms by equipping themselves with much faster surveillance tools. However, we live in a less-than-ideal world, and not all firms have a GPS time source. The challenge of sequencing CAT-reportable events is all about the likelihood of irregular activities being detected within a time interval. Prevailing practices that work by determining the shape of a pattern carry lots of modeling error, and they are a resource-draining exercise.

Since measuring vectors graphically is such an error-prone process, can we stop doing it? If we could find a much faster way to do onset detection, would it enable preventive risk controls in real time rather than after-the-fact investigations? If such an alternate process wouldn't introduce too much noise and/or overly distorted signals for market surveillance and manipulation detection purposes, does that mean there is no necessity to perfect the exactitude of clock synch?
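For intuition, onset detection (a concept borrowed from audio signal processing) flags moments where activity suddenly jumps relative to its recent baseline, caring about local bursts rather than globally aligned timestamps. A minimal, illustrative sketch over per-interval event counts; the window and threshold factor are arbitrary choices, not calibrated values or the author's actual method:

```python
from collections import deque

def detect_onsets(activity, window=20, factor=3.0):
    """Flag indices where activity jumps well above its recent average.
    activity: per-interval event counts (or volumes); window and factor
    are arbitrary illustrative parameters."""
    recent = deque(maxlen=window)
    onsets = []
    for i, x in enumerate(activity):
        baseline = sum(recent) / len(recent) if recent else 0.0
        if len(recent) == window and x > factor * max(baseline, 1.0):
            onsets.append(i)  # sudden burst relative to trailing baseline
        recent.append(x)
    return onsets

# Quiet background with a burst starting at index 30.
stream = [5] * 30 + [60, 55, 6, 5] + [5] * 16
print(detect_onsets(stream))  # -> [30, 31]
```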

In our case, we found a new solution inspired by concepts from music plagiarism. Imagine someone's singing is a bit out of tune, out of beat, and out of rhythm; you may still be able to recognize which song he or she is singing. That's the beauty of audio pattern recognition. Click here to read the full whitepaper and see how we eliminate the trouble of perfecting the exactitude of clock synch!
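The intuition behind the analogy: a robust audio-style fingerprint keys on the relative spacing of salient peaks rather than on absolute clock time, so a modest offset or jitter does not break recognition. A toy illustration only (the quantization and event times are invented; this is not Data Boiler's actual technique):

```python
def fingerprint(peak_times, quantum=0.1):
    """Toy fingerprint: quantize the gaps between consecutive salient
    peaks so that small timing jitter maps to the same code. Purely an
    illustrative analogy, not Data Boiler's actual technique."""
    gaps = [round((b - a) / quantum) for a, b in zip(peak_times, peak_times[1:])]
    return tuple(gaps)

# The same "melody" of activity bursts, seen through a clock that is
# offset by ~70 ms and slightly jittery.
reference = [1.00, 1.50, 2.30, 2.60, 3.90]
observed  = [1.07, 1.55, 2.38, 2.66, 3.95]

print(fingerprint(reference))  # (5, 8, 3, 13)
print(fingerprint(observed))   # same tuple -> recognized despite the skew
```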

Kelvin To is the Founder and President of Data Boiler Technologies.
