Data Analytics in Focus

With Kevin O’Connor, Head of Analytics and Workflow Solutions, Virtu Financial

Describe Virtu in terms of its mission and its core clients.

Virtu Financial is focused on using advanced technology to deliver liquidity to global markets and innovative, transparent trading and analytics solutions to our clients. Virtu’s core clients are firms that participate in the global financial markets. In Virtu’s Analytics and Workflow Solutions business, the core client base is weighted towards the buy side, and end users are typically portfolio managers, traders and analysts. Our mission is to provide products and services that help clients interact with markets more efficiently so that they can mitigate risk and lower implementation costs across multiple asset classes.

Describe your role and responsibilities as Head of Analytics and Workflow Solutions.


The Analytics and Workflow Solutions teams are focused on helping clients become more efficient. I help manage our experienced product development and client services teams, which are tasked with developing and supporting our solutions. The integration of our analytics with our trading technology is a critical aspect of our product plan, and we are focused on aligning resources in support of our clients’ process automation and analysis goals across multiple asset classes.

Describe the evolution of Virtu’s Transaction Cost Analysis (TCA) product from inception to the present day.

Virtu has performed TCA in some form since 2000. The original ITG product was focused on customized solutions for trade measurement and modeling. With the acquisition of the Plexus Group in late 2005, the product expanded into more standardized implementation shortfall and peer analyses. While most TCA work was focused on post-trade equities, the groundwork was established for an expansion into other asset classes and other points of the investment lifecycle. FX was added in 2008, followed in 2010 by real-time measurement and the linking of pre-trade, real-time and post-trade analysis. Futures and fixed income coverage was built out over the last few years, so that clients looking to do compliance-focused trade oversight and process-improvement-style TCA could include a broader range of transaction types.

Now that the ability to estimate transaction costs and measure them in real time and historically is entrenched, the next logical step is to enable clients to access their data programmatically in support of trade automation, execution strategy development and customized analysis.

The concept of TCA is evolving into a secure repository of transaction data, market data, benchmarks, cost models, market metrics and other analytics across multiple asset classes, accessed via a flexible front-end and/or an API layer that supports standard and customized machine learning libraries. As the product continues to advance, so too does the service organization. Ten years ago, our client services team might have been focused on report generation for clients; nowadays they help clients write code as they seek to extract insights from their own data.

There is a lot of technology terminology out there. What exactly are “Big Data Analytics Utilities”?

To answer this, we need to first establish how Virtu defines big data. If you google big data you’ll come back with many different responses, but in our view, it’s quite simply the volume of data you have combined with the complexity of that data.

But just having the data isn’t going to win the race: you need to access, explore and analyze it.

This could mean taking in structured trade data and merging it with unstructured data contained in a free-form text file. It could also mean the intake of large amounts of trade data from both an OMS and an EMS. Virtu’s Big Data Analytics Utilities, which include Open Intell, Open Python and Open Technology for API, help subscribers ingest, normalize and analyze large, complex data sets in seconds.
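By way of illustration, the sketch below shows the general pattern of that ingest-and-merge step in Python: normalizing two structured trade feeds and joining free-form text notes onto them. The file names, column names and parsing rule are hypothetical examples, not Virtu’s actual pipeline.

```python
# Illustrative sketch only: merging structured OMS/EMS trade records with
# unstructured commentary from a free-form text file. File names, column
# names and the parsing rule are hypothetical, not Virtu's pipeline.
import re
import pandas as pd

# Structured trade data exported from an OMS and an EMS (hypothetical files).
oms = pd.read_csv("oms_trades.csv", parse_dates=["trade_date"])
ems = pd.read_csv("ems_fills.csv", parse_dates=["trade_date"])

# Normalize the two feeds to a common schema before combining them.
oms = oms.rename(columns={"ticker": "symbol", "qty": "quantity"})
ems = ems.rename(columns={"sym": "symbol", "filled_qty": "quantity"})
trades = pd.concat([oms, ems], ignore_index=True)

# Unstructured data: pull order IDs out of free-form notes so the text can
# be joined against the structured records.
notes = []
with open("broker_notes.txt") as fh:
    for line in fh:
        match = re.search(r"order\s+(\w+)", line, flags=re.IGNORECASE)
        if match:
            notes.append({"order_id": match.group(1), "note": line.strip()})
notes = pd.DataFrame(notes)

# Merge the commentary onto the normalized trade blotter.
enriched = trades.merge(notes, on="order_id", how="left")
print(enriched.head())
```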

What is Open Python?

Virtu’s Open Python product was inspired by a successful client webinar series we held last year, “Everyone Can Code Python”. Individuals from across the investment lifecycle participated over the course of seven weeks, and interest spanned portfolio managers, traders, analysts and operations. Post-series, we learned that while participants were eager to begin coding and automate their busywork, like manual Excel-based tasks, their intentions were stymied by internal IT issues: either they weren’t allowed to install Python, or their versions were outdated and they were struggling to upgrade. Open Python solves this issue. By leveraging Amazon Web Services, Virtu maintains a coding environment on behalf of subscribers and also provides coding support as they start migrating tasks to Python. Subscribers are empowered to ask ad hoc questions, and we help make sure they get the granular answers they seek.
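As a sense of what migrating Excel busywork to Python can look like, here is a minimal sketch under assumed inputs: reading a daily blotter workbook, summarizing it, and writing the result back out. The workbook name, sheet name and columns are hypothetical examples, not a Virtu template.

```python
# Minimal sketch of migrating a manual Excel task to Python: read a daily
# blotter workbook, summarize it, and write the result back out. The workbook
# name, sheet name and columns are hypothetical, not a Virtu template.
import pandas as pd

# Read the raw blotter from Excel (requires the openpyxl package for .xlsx).
blotter = pd.read_excel("daily_blotter.xlsx", sheet_name="trades")

# The kind of summary often rebuilt by hand each day:
# notional traded and average price per symbol and side.
blotter["notional"] = blotter["quantity"] * blotter["price"]
summary = (
    blotter.groupby(["symbol", "side"])
    .agg(total_notional=("notional", "sum"), avg_price=("price", "mean"))
    .reset_index()
)

# Write the summary to a new workbook so the spreadsheet workflow stays familiar.
with pd.ExcelWriter("daily_blotter_summary.xlsx") as writer:
    summary.to_excel(writer, sheet_name="summary", index=False)
```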

What is Open Intell?

Open Intell is similar to Open Python in that we’re deploying existing Virtu Analytics capabilities, such as high-powered servers and machine learning resources, to our customers.

About five years ago, we began to explore machine learning’s possible applications within TCA and integrated our first machine-learning-only model into Virtu’s Algo Wheel solution for performance-driven broker evaluation. Since then, we have leveraged machine learning to help identify trade clusters and patterns that humans would not be able to spot efficiently, if at all. One use case is using an algorithm to group similar accounts for a trade strategy optimization process. Pre-machine learning, this would have required a human to examine either the strategies in each account or the securities they traded.
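A toy sketch of what such account grouping could look like is shown below, using k-means clustering on hypothetical per-account features. The feature set and the choice of algorithm are illustrative only; they are not Virtu’s production model.

```python
# Toy sketch of grouping similar accounts by their trading characteristics.
# Features and the choice of k-means are illustrative, not Virtu's model.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical per-account features derived from historical trades.
accounts = pd.DataFrame({
    "account":        ["A", "B", "C", "D", "E", "F"],
    "avg_order_size": [5_000, 4_500, 120_000, 110_000, 900, 1_100],
    "pct_adv":        [0.5, 0.4, 8.0, 7.5, 0.1, 0.2],  # % of average daily volume
    "turnover":       [12, 10, 2, 3, 40, 35],          # annual portfolio turnover
})

# Scale the features so no single one dominates the distance metric.
features = accounts[["avg_order_size", "pct_adv", "turnover"]]
scaled = StandardScaler().fit_transform(features)

# Cluster the accounts; similar accounts can then share a trade strategy
# optimization process instead of being reviewed one by one.
accounts["cluster"] = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scaled)
print(accounts[["account", "cluster"]])
```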

While we advocate for the possibilities of machine learning, it is but one tool our consultants rely on to support clients. Depending on the case, a more traditional statistical technique, or even an observational analysis made by a subject matter expert, could be the better approach. At Virtu, it’s the optimal mix of tools and tactics, balanced against the desired result, that matters.

How is Virtu a market leader in data analytics, i.e., how are your offerings differentiated from other providers?

The answer to this question has probably changed a bit post-Virtu merger, as we have embraced the firm’s principles of efficiency and scale. An important lesson for us over the last 20+ years is that you can never anticipate where the next valuable data set will come from. Our Portal infrastructure was built around the premise that to be a successful analytics provider, we needed to quickly ingest data sources we had not interacted with previously and visualize them for our clients. The recent launch of our 606 Aggregation Service is one such example: a client came to us with a few sample XML reports, and within a month we returned an interactive dashboard.

Also, in support of Virtu’s commitment to transparency, we recently launched our Open Technology platform to facilitate data portability through API access. Subscribers have access to our models and metrics, and clients can access their own data for flexible in-house or third-party processing. We continue to build products that respond and adapt to an ever-changing landscape.
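The general pattern of API-based data portability is sketched below: pulling a client’s own records over HTTPS into a DataFrame for in-house or third-party processing. The URL, token and field names are purely hypothetical placeholders; they do not describe Virtu’s actual Open Technology endpoints.

```python
# Purely hypothetical sketch of API-based data portability: pull trade records
# over HTTPS and load them into a DataFrame for in-house processing. The host,
# token and field names are placeholders, not Virtu's Open Technology API.
import pandas as pd
import requests

BASE_URL = "https://api.example.com/v1"            # placeholder host
HEADERS = {"Authorization": "Bearer <API_TOKEN>"}  # placeholder credential

# Fetch the client's own executions for a date range.
resp = requests.get(
    f"{BASE_URL}/executions",
    params={"start": "2021-01-04", "end": "2021-01-08"},
    headers=HEADERS,
    timeout=30,
)
resp.raise_for_status()
executions = pd.DataFrame(resp.json())

# Hand the normalized data to any in-house or third-party tooling from here.
print(executions.head())
```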

What is the future of data analytics at Virtu?

We will continue to democratize and integrate our data-ingest and normalization tools internally, and we plan to make them available for buy-side consumption. Today we receive trade data almost exclusively; however, we predict a near-term future where portfolio managers and traders can use the very same tools we do. This means they will be able to ingest relevant data, normalize and combine it, and use our familiar and intuitive front-end to visualize and interpret the results. While we’re at it, why not extend this capability to other relevant internal groups that also rely on data, such as middle/back offices, compliance and finance? For us, that’s the advantage of our Big Data Analytics Utilities: empowering every part of an organization to make good decisions, quickly and independently.

Regardless of who you speak with in the analytics space, they will tell you the same thing: 80% of their time is spent cleaning up data. We think there is value in the experience we have gained doing just that over the last 20+ years.

Data Analytics in Focus first appeared in the Q1 2021 issue of GlobalTrading.