Quant Traders Focus on Quality Data

Timely data that is primary-sourced to meet strict compliance practices and updated throughout the day is much sought after in capital markets, according to Michael Raines, Director of Quant Data Solutions at Wall Street Horizon.


The data must be readily explainable – everything from the sourcing methodology, to the content itself, to access to the analysts and engineers who research, update and deliver it, he told Traders Magazine.

“Also, something that is overlooked and should be considered a given: are there detailed technical specifications provided that will explain all the content? Can trial data be readily provided – either history or live data?” he said.

On October 3, 2023, Wall Street Horizon (WSH), a provider of market-moving global corporate event data, expanded its historical data offering for quantitative trading and academic research.

Quantitative researchers can gain valuable insights from historical data to use in strategy development and model back-testing.

Wall Street Horizon’s historical event data covers 10,000 publicly traded companies worldwide, the majority archived from 2006.

Raines said that the historical record evolves from these publication practices and has to be archived exactly as it was published.

“The research will be meaningless if the history is back-filled after the event,” he stressed.  

“So does the history reflect what was known at the time of the event, and did the provider archive the history as they published it?” he questioned.

“This second point is important for sourcing what was known and when it was known, and, for compliance reasons, whether this information came from the company itself,” he added.
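
To make the point-in-time requirement concrete, here is a minimal sketch in Python with pandas; the archive layout and column names are assumptions for illustration, not Wall Street Horizon’s actual schema. Each version of a record carries the timestamp at which the provider published it, and a backtest queries the archive “as of” a moment in time, so later back-filled revisions cannot leak into the research.

```python
import pandas as pd

# Hypothetical point-in-time event archive: each row is one published
# version of an earnings-date record, stamped with when the provider
# published it. Column names are illustrative, not WSH's actual schema.
archive = pd.DataFrame(
    {
        "ticker": ["XYZ", "XYZ", "XYZ"],
        "earnings_date": ["2023-10-26", "2023-10-19", "2023-10-19"],
        "published_at": pd.to_datetime(
            ["2023-09-01", "2023-10-02", "2023-10-10"]
        ),
    }
)

def as_of(df: pd.DataFrame, when: str) -> pd.DataFrame:
    """Return, per ticker, the latest record published at or before `when`.

    A backtest that queries the archive this way only ever sees what was
    knowable at the time; back-filled corrections cannot leak in.
    """
    known = df[df["published_at"] <= pd.Timestamp(when)]
    return known.sort_values("published_at").groupby("ticker").tail(1)

# On 2023-09-15 the model should still believe earnings fall on 10-26;
# the 10-02 revision to 10-19 must be invisible to it.
print(as_of(archive, "2023-09-15"))
print(as_of(archive, "2023-10-05"))
```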

As for use cases, Raines said risk or trading strategies are developed from the quant’s own history of trades and practices – that is, the data they were using at the time the strategy was built, fundamental data as well as other bespoke data sets – which is then compared with the history of the data they are now researching.

“Would having known this data, alongside their own data, have improved their practices? And for that matter, by how much?” he asked.

“Because even if there is increased alpha, for example, is it enough of an increase to justify changing current strategies? Or, finally, was there flat or little additive improvement from the historical data being examined?” he added.
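
As a rough illustration of that judgment call, the sketch below compares the Sharpe ratio of a baseline backtest against the same strategy augmented with the new dataset. The return series here are simulated stand-ins and the 252-day annualization is an assumption; in practice both series would come from real backtests over the same window.

```python
import numpy as np

def annualized_sharpe(daily_returns: np.ndarray) -> float:
    """Annualized Sharpe ratio from daily strategy returns (risk-free ~ 0)."""
    return np.sqrt(252) * daily_returns.mean() / daily_returns.std(ddof=1)

# Simulated stand-ins for two backtests over the same window: the current
# strategy, and the same strategy augmented with the new event dataset.
rng = np.random.default_rng(0)
baseline = rng.normal(0.0004, 0.01, 750)
augmented = baseline + rng.normal(0.0001, 0.002, 750)

lift = annualized_sharpe(augmented) - annualized_sharpe(baseline)
print(f"Sharpe lift from the new dataset: {lift:+.2f}")
# A positive lift alone isn't decisive: it has to be large enough to
# justify the cost and risk of altering a live strategy.
```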

Raines said that researchers are constantly looking for clean, ready-to-examine data: “Too much time and too many resources can be spent on poorly organized data sets.”

The firm supplying the data must provide technical, content and reference assistance, working with the researchers as they conduct their investigations, he added.

Raines said that the researchers have to be supplied with all of the materials upfront so that they can readily understand what the data is and whether they want to pursue an investigation.

“If the documentation doesn’t speak for itself, then why bother working with the provider?” he argued.

Subsequently, the researchers must be allowed serious question-and-answer sessions with the provider, he added.

“If these two conditions cannot be met, it is a no go,” Raines said.  

Once into the investigation, the firm already has a rich store of news, charts, quotes and technical data that it uses, and a programmatic comparison between what it knew and what the additional datasets reveal can be applied across multiple data sets, he commented.
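
A hedged sketch of such a programmatic comparison, again assuming pandas and illustrative column names: the firm’s own archived signals are aligned with a newly acquired event history on a shared timeline via an as-of join, so what was known and what the new data adds sit side by side for evaluation.

```python
import pandas as pd

# The firm's own archived signal history (illustrative names and values).
own_signals = pd.DataFrame(
    {"ts": pd.to_datetime(["2023-10-01", "2023-10-08", "2023-10-15"]),
     "signal": [0.2, -0.1, 0.4]}
)
# The newly acquired event history under evaluation.
new_events = pd.DataFrame(
    {"ts": pd.to_datetime(["2023-10-03", "2023-10-12"]),
     "event_flag": [1, 1]}
)

# As-of join: attach the most recent event known at each signal timestamp.
# Rows with no prior event get NaN, i.e. "nothing new was known yet".
combined = pd.merge_asof(
    own_signals.sort_values("ts"),
    new_events.sort_values("ts"),
    on="ts",
)
print(combined)
```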