Data Evaluation Challenges Industry

Given volatile times and uncertainty, portfolio managers and risk managers are looking at a lot more sources of data more frequently, according to Elizabeth Pritchard, Founder of White Rock Data Solutions.

“We’re creatures fed by data so that we can understand the markets at a macro level,” she said at Wall Street Horizon’s Sept. 8 webinar “Data Minds: Watch for Bear Traps! Using Data-Driven Signals in a Down Market”.

The Wall Street Horizon panel was moderated by Steven Levine, Senior Market Analyst at Interactive Brokers, and it focused on incorporating data into trading and risk strategies in a bear market.  

According to Pritchard, it has been interesting to be part of the industry over the last five to ten years and see how it has systematized the search for, and evaluation of, data.

“This is a big problem for the industry to solve this evaluation of datasets because there’s such a proliferation of datasets, and it’s just too costly to try to evaluate them all,” she said.

Christine Short, VP of Research at Wall Street Horizon, said that data strategy is supposed to serve investment strategy.

“A lot of times, I may have an idea what I’m looking for, but there’s no vendors out there providing the quality of data,” she stressed.

Jeremy Payne, former Chief Product Officer at Canalyst, said the costs to onboard the data for the purposes of evaluation are nearly if not exactly the cost of onboarding the data for production purposes.

“People are put in a position where you have to buy it before you try it in a lot of cases,” he argued. 

He thinks that’s where the industry has an opportunity: data vendors can provide insight into their own datasets, even without new technology.

“Do the hard work, do the research, evaluate your own data in an objective way and make that information available as part of your marketing,” he said.

On the flip side, he added, there are technology opportunities: cloud-based solutions that will supplant the ubiquitous FTP delivery, which to this day is how most datasets, especially large ones, are transferred from one environment to another.

“Even in a cloud-based world, people still have to pick up the files, bring them to their environment, unpack them and load them,” he said.
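The manual cycle Payne describes can be sketched in a few lines. This is an illustrative example only, not from the panel: it simulates the "unpack and load" steps on a gzipped CSV drop of the kind a vendor might deliver over FTP or SFTP; the column names and tickers are hypothetical placeholders.

```python
import csv
import gzip
import io


def unpack_and_load(gz_bytes: bytes) -> list:
    """Unpack a gzipped CSV delivery and load its rows into memory."""
    with gzip.open(io.BytesIO(gz_bytes), mode="rt", newline="") as fh:
        return list(csv.DictReader(fh))


# Simulate a vendor file that would normally be picked up over FTP/SFTP:
payload = gzip.compress(b"ticker,close\nAAPL,175.10\nMSFT,310.55\n")
rows = unpack_and_load(payload)
print(rows[0]["ticker"])  # → AAPL
```

Every consuming firm repeating these pick-up, unpack, and load steps for every dataset is exactly the friction the cloud-based delivery models discussed on the panel aim to remove.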

There is a tremendous amount of data, but it’s not easy to connect a lot of these datasets, Payne said.

The data is out there, he said, adding that except for a small number of firms that have the capability and the internal technology resources, the markets are relying on the industry for these solutions.

“What we’re seeing in the last five years is a democratization of what was literally only done in a handful of firms and now being able to be done at more and more firms, because some of these frictions are starting to fall by the wayside,” he said.

According to Pritchard, there are “all these different models across the street”. 

She thinks low-code and no-code tools will continue to advance as FinTech continues to develop solutions that can be automated.

Pritchard said that the industry hasn’t fully matured and believes it will mature into software solutions that can be used very broadly across the industry. “We haven’t gotten there yet,” she stressed.

“We’re going to see much more democratization and we’re going to see even smaller firms be able to bring in more datasets, and evaluate more datasets much more efficiently at a lower cost,” she added.