Algorithmic Trading Systems and Solutions – Q & A

The following Algorithm Q&A Special Report was crafted after conversations with the Buy and Sell sides of the Institutional Trading Community. This Report is not a re-hash of all things Algo, but rather the tough questions with candid answers from industry professionals. The purpose of the Report is to assist you in your product/solution purchasing decisions.

Q: What new type of algorithms is your firm working on right now? How important will customizing algorithms for clients be in the future?

Frank Brown, EdgeTrade:

EdgeTrade recently launched Sumo, a smart order execution algorithm that offers traders the ease of issuing single orders for aggressive, timely execution, while overcoming some of the inherent limitations of single market and limit orders. Sumo electronically micro-manages the process of determining how much of an order to send out at any one time for execution, and the destinations where these pieces should be sent. Customizing algorithms for clients is a cornerstone of EdgeTrade's business. Traders have different styles and trade for various reasons; they want algorithms that will support their approach. Traders are very open with us about their trading goals, as they know EdgeTrade's agency model means we will never take advantage of information they reveal to us.
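Sumo's internals are not public, but the decision it describes, how much of the parent order to release at once and where to send each piece, can be sketched. The toy Python loop below splits each child slice across venues in proportion to displayed liquidity; the venue names, displayed sizes and slice limits are all hypothetical:

```python
# Toy slice-and-route loop: decide how much of a parent order to release
# at once and where to send each piece. Sumo's actual logic is proprietary;
# all venue names, displayed sizes and slice limits here are hypothetical.
import random

venues = {"ECN_A": 1200, "ECN_B": 800, "NYSE": 2500}  # displayed size at the inside

def route_slice(remaining: int, max_slice: int = 2000):
    """Split one child slice across venues in proportion to displayed liquidity."""
    slice_qty = min(remaining, random.randint(max_slice // 2, max_slice))
    total = sum(venues.values())
    items = sorted(venues.items(), key=lambda kv: -kv[1])  # deepest venue first
    routes, left = [], slice_qty
    for i, (venue, displayed) in enumerate(items):
        # Last venue takes whatever is left so rounding never strands shares.
        qty = left if i == len(items) - 1 else min(left, round(slice_qty * displayed / total))
        if qty > 0:
            routes.append((venue, qty))
            left -= qty
    return routes

print(route_slice(10_000))  # e.g. [('NYSE', 933), ('ECN_A', 448), ('ECN_B', 299)]
```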

Carl Carrie, JP Morgan:

We're very excited about the launch of our algorithm for portfolios, TAO. TAO, short for Trading Algorithmic Optimizer, is the industry's first algorithmic tool that integrates an interactive module for pre-trade, post-trade and concurrent analytics with an Optimizer that generates changing algorithmic parameters as the markets move. TAO handles a variety of trading constraints and can cash-balance while managing risk and cost. To accomplish this, it uses a dynamically changing efficient trading frontier and a proprietary high-speed optimizer, bolstered by a high-performance distributed computing backend to ensure maximum speed and resiliency.
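TAO's model is proprietary, but the idea of an efficient trading frontier is commonly illustrated with an Almgren-Chriss-style trade-off: faster schedules incur more impact cost but less timing risk, and sweeping an urgency parameter traces out the frontier. A minimal numerical sketch, with every parameter value hypothetical:

```python
# Efficient-trading-frontier sketch (Almgren-Chriss flavor), evaluated
# numerically on a discretized exponential schedule. All numbers hypothetical.
import math

X, N = 100_000, 390          # shares to sell; 1-minute steps in the day
sigma, eta = 0.02, 2.5e-4    # per-step volatility ($/share); temporary-impact coeff

def schedule(kappa):
    """Holdings path x_k = X * sinh(kappa*(N-k)) / sinh(kappa*N)."""
    return [X * math.sinh(kappa * (N - k)) / math.sinh(kappa * N) for k in range(N + 1)]

def cost_and_risk(kappa):
    x = schedule(kappa)
    trades = [x[k] - x[k + 1] for k in range(N)]           # shares sold each step
    cost = sum(eta * v * v for v in trades)                # temporary impact cost
    var = sum((sigma * x[k + 1]) ** 2 for k in range(N))   # timing-risk variance
    return cost, math.sqrt(var)

for kappa in (0.002, 0.01, 0.05):   # higher kappa = more urgent schedule
    c, r = cost_and_risk(kappa)
    print(f"kappa={kappa}: impact=${c:,.0f}, timing risk=${r:,.0f}")
```

Each kappa value is one point on the frontier; an optimizer of the kind described would pick the point matching the trader's risk aversion and re-solve as markets move.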

We are also preparing to introduce a new family of algorithms that includes 2nd generation liquidity-seeking techniques. These would leverage dark books, crossing engines and more sophisticated micro-order submission models.

John Coulter, Vhayu:

Vhayu Velocity™ is a platform for algorithmic trading which simultaneously analyzes and stores financial market data to enable trading applications to achieve best execution with zero latency. CSFB, BofA, Goldman, and other industry leaders have a huge lead in providing algorithmic trading tools for distribution to their buy-side customers. They've spent decades of man-years and millions of dollars perfecting strategies and building the infrastructure capable of handling such massive amounts of tick data. Our algorithmic trading platform levels the playing field by giving brokers of all sizes an out-of-the-box solution without requiring an army of programmers. Velocity™ provides data feed handlers; a patented data store that allows data to be analyzed and stored in memory simultaneously; flexible APIs to write and test your own algorithms; a customizable VWAP trading engine; and an event-driven interface for publishing complex analytics or automated transactions out to trading applications.
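As a concrete example of the kind of computation such a platform must keep current tick by tick, here is a minimal running-VWAP calculation in Python; the ticks are made up, and Velocity's actual engine and APIs are not shown here:

```python
# Running VWAP over a tick stream, the kind of calculation a tick platform
# must keep current in real time. The ticks below are hypothetical.
def running_vwap(ticks):
    """Yield the volume-weighted average price after each (price, size) tick."""
    notional = volume = 0.0
    for price, size in ticks:
        notional += price * size
        volume += size
        yield notional / volume

ticks = [(20.01, 500), (20.02, 300), (20.00, 1200), (20.03, 400)]
for i, v in enumerate(running_vwap(ticks), 1):
    print(f"after tick {i}: VWAP = {v:.4f}")
```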

Brian Fagen, Morgan Stanley:

Our current development plan is focused in several different areas. First, we are developing algorithms that can access alternative liquidity pools, such as crossing networks, within the context of an optimal execution plan. Second, we are looking at new algorithms that are responsive to different types of price movements. Third, we are looking at algorithms specifically tailored to stocks with certain liquidity characteristics, such as small- and mid-cap stocks. Finally, we are looking at portfolio-level algorithms and how to tie the execution process and portfolio management closer together.

We believe that the customization of algorithms is best achieved through adjustable features and parameters to algorithms that have a defined execution goal. The development of a plethora of unique customized algorithms will create more confusion than benefit due to the inability to measure their effectiveness.

Tony Huck, ITG:

ITG Algorithms are moving forward on three major innovations in algorithmic trading. First, ITG's Dark Server is designed to seamlessly access pools of hidden liquidity within multiple ATSs and ECNs while protecting orders from being gamed. Second, a group of algorithmic capabilities helps traders and portfolio managers trade lists, baskets, or programs intelligently, using dollar imbalance and risk reduction while attempting to minimize market impact. Finally, other new capabilities will be introduced soon to combine price and quote predictions with opportunistic trading in ATSs and ECNs, delivering executions along a spectrum from passive to aggressive. Even now, the vast array of parameters available in ITG Algorithms provides wide de facto customization capabilities. Going forward, ITG Algorithms are progressing towards even more sophisticated methods to help prevent gaming of orders while maximizing fill rates.

Richard Johnson, Miletus Trading:

Our latest algorithm is called ROBE (Risk Optimized Basket Execution), which our traders have been using on the desk for four months; it is currently being integrated into our MultiVerse product so that clients can access it directly from their desktops. ROBE incorporates real-time models of market impact and trading risk to generate an optimal trading strategy for the portfolio as a whole. Unlike some of our competitors' portfolio algorithms, which use static or periodically updated trading schedules, ROBE uses an innovative mathematical approach to generate a dynamic optimal trading schedule that continually updates based on current market conditions.
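Miletus does not disclose ROBE's optimizer, but the control loop described here, re-solving the schedule for the remaining shares as conditions change, can be sketched. In this toy Python version the "optimizer" is a single hypothetical rule that trades faster when volatility is elevated:

```python
# Dynamic re-scheduling sketch: rather than fixing a trade schedule up front,
# re-solve for the remaining shares each interval as conditions change.
# ROBE's actual optimizer is proprietary; the urgency rule here is hypothetical.
import random

def target_qty(remaining, periods_left, vol_now, vol_base):
    """Trade faster when volatility (risk) is elevated, slower when calm."""
    urgency = min(2.0, max(0.5, vol_now / vol_base))
    return round(remaining / periods_left * urgency)

remaining, periods, vol_base = 50_000, 10, 0.02
for t in range(periods):
    vol_now = vol_base * random.uniform(0.5, 2.0)   # stand-in for a live estimate
    qty = remaining if t == periods - 1 else min(
        remaining, target_qty(remaining, periods - t, vol_now, vol_base))
    remaining -= qty
    print(f"period {t}: vol={vol_now:.4f}, trade {qty}, {remaining} left")
```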

Since Miletus was founded nearly two years ago, we have recognized that clients need algorithms that can be customized to suit their trading style. Our whole technology infrastructure has been designed for algorithmic trading and is highly scalable: if a client wants to customize an existing algorithm, we can usually do that in a few days; building a more complex algorithm from scratch may take a few weeks.

David Liles, Bernstein:

The most recent addition to the Bernstein product line is PortFall, our portfolio-level implementation shortfall algorithm. Using proprietary optimization techniques, PortFall continually updates an optimal execution path that attempts to minimize spread costs, liquidity impact, market risk and information dissemination for the entire portfolio. We take advantage of the hedging effects in the basket by using a full correlation matrix rather than a multi-factor approach, and we update the matrix daily using timescales relevant to the trading horizon. Clients may also specify additional constraints, including dollar/beta neutrality and index/ETF tracking-error minimization.
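To make the full-correlation-matrix point concrete: the quantity a portfolio-level shortfall algorithm trades off against impact cost is the risk of the unexecuted residual, and with a full matrix the long and short legs partially hedge each other. A minimal sketch with hypothetical positions, volatilities and correlations:

```python
# Portfolio residual risk from a full correlation matrix. Positions,
# vols and correlations are all hypothetical; PortFall's model is not shown.
import numpy as np

dollars = np.array([1_000_000, -800_000, 500_000])   # signed unexecuted $ positions
vols = np.array([0.018, 0.022, 0.030])               # daily volatilities
corr = np.array([[1.0, 0.6, 0.3],
                 [0.6, 1.0, 0.2],
                 [0.3, 0.2, 1.0]])
cov = np.outer(vols, vols) * corr                    # full covariance matrix
risk = float(np.sqrt(dollars @ cov @ dollars))       # 1-day $ risk of the residual
print(f"residual portfolio risk: ${risk:,.0f}/day")
# Note the hedge: the long and short legs partially offset through corr.
```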

Another recent addition is our Inline algorithm. In sharp contrast to the typical participation-based arrival price algorithm, Inline is designed to proactively implement a trading decision. The assumption is that the user would be willing to execute the entire order quickly if liquidity is available. Only real-time data is used and there is no pre-determined trading plan; participation rates are not capped and can be high when the situation is appropriate. Inline models market impact in real time and uses advanced statistical techniques to get smarter about limit order placement in a given stock as the order trades, in effect using feedback to discern the supply/demand imbalance in the stock at the time each order is placed.
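Bernstein's actual statistical techniques are proprietary, but one simple way to picture feedback-driven limit placement is a controller that tightens the limit offset when too few child orders fill and relaxes it when fills come easily. A toy sketch, with hypothetical target and step sizes:

```python
# Feedback-driven limit placement sketch: widen or tighten the limit offset
# based on realized fill rates. A stand-in for the statistical learning
# described above; target and step values are hypothetical.
def adjust_offset(offset_cents: float, fill_rate: float,
                  target: float = 0.7, step: float = 0.25) -> float:
    """If too few child orders fill, price more aggressively; if fills are easy, relax."""
    if fill_rate < target:
        return max(0.0, offset_cents - step)   # move toward (or through) the touch
    return offset_cents + step                 # passive side is working; back off

offset = 2.0   # cents behind the touch
for fills in (0.4, 0.5, 0.9, 0.8, 0.3):
    offset = adjust_offset(offset, fills)
    print(f"fill rate {fills:.0%} -> next offset {offset:.2f}c")
```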

Derek Morris, BNY Brokerage:

The goal is to generate performance from any single algorithm, not to race to produce more algorithms. We believe the market open and close offer fertile ground for a product that will help our clients improve their performance at those critical points in the trading day. This is the closest we care to come to offering generic "broad stroke" algorithms, which by their nature invite a faster rate of performance decay because they attempt to be a "catch all" for trading volume.

Our pure agency stance positions us to have the kinds of discussions with clients they might be uncomfortable having with brokers who also have a proprietary business. The objective is to wed our knowledge of how clients trade with our understanding of market microstructure and automation. We try to customize tactics that replicate the ways they prefer to execute various orders. This is where we think we can add more value to help them in their quest to achieve best execution and improve the efficiency of their increasing workload in a post-Reg NMS hybrid marketplace.

To disclose specific types of algorithms would be tipping our hand too much to those market participants who work diligently to reverse-engineer algorithmic behavior and adversely affect our clients' performance.

David Mortimer, Piper Jaffray:

Algorithms are designed to meet or exceed a client's execution benchmark. Piper Jaffray & Company's APT Group has delivered a full menu of algorithms designed to achieve this goal for the US equities markets. As new benchmarks, market structure changes, or potential trading strategies emerge, we will be at the forefront in developing solutions to meet our clients' needs. In fact, we have several clients who are measured by non-standard or proprietary execution benchmarks. In these instances, we have developed custom algorithms that achieve or exceed the targeted performance goal.

Andrew F. Silverman, Goldman Sachs:

Over the past few years, we've witnessed rapid adoption of single-stock algorithms centered on minimizing slippage around a benchmark (VWAP, Implementation Shortfall). We believe the market will increasingly demand higher-touch services such as portfolio trading algorithms and customized solutions.

It is common buy-side practice to package orders into portfolios (programs) to take advantage of decreased explicit (commissions) and implicit (market impact) costs through diversification. Until recently, buy-side desks have not had an algorithmic option. GSAT's PortX algorithm now offers true portfolio trading capabilities by addressing portfolio-level, rather than single-stock, risk and cost characteristics. Through pre-trade analysis, PortX allows the trader to define their preferences (e.g., is the main objective risk reduction or trading-cost reduction?), and it takes into account several factors to minimize the risk of the execution profile, such as cash, sector and/or beta balancing; basket constituent correlations; and volatility.
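PortX's constraint handling is not public, but cash balancing in particular is easy to illustrate: throttle whichever side of the basket runs ahead so buy and sell executions stay within a dollar band. A toy sketch, with a hypothetical band and fills:

```python
# Cash-balancing sketch: permit a fill only if it keeps the buy/sell dollar
# imbalance inside a band. Band and fills are hypothetical; PortX's real
# constraint handling is proprietary.
def allowed(side: str, qty: int, px: float, bought: float, sold: float,
            band: float = 250_000.0) -> bool:
    """Check whether this fill keeps |bought - sold| within the band."""
    imbalance = bought - sold + (qty * px if side == "buy" else -qty * px)
    return abs(imbalance) <= band

bought = sold = 0.0
for side, qty, px in [("buy", 5000, 40.0), ("sell", 2000, 55.0), ("buy", 5000, 40.0)]:
    if allowed(side, qty, px, bought, sold):
        if side == "buy":
            bought += qty * px
        else:
            sold += qty * px
        print(f"filled {side} {qty} @ {px}: imbalance ${bought - sold:,.0f}")
    else:
        print(f"held back {side} {qty} @ {px}: would breach cash band")
```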

Clients acknowledge that algorithms are useful tools, but they seek guidance on which algorithm or set of parameters is most appropriate for individual orders. GSAT's Navigator, an algorithm of algorithms, addresses this need. Navigator sits above the GSAT suite and serves as a customizable smart router, interpreting individual security and order characteristics, as well as the trader's view of short-term alpha, to intelligently route orders to a particular GSAT algorithm (Piccolo, 4CAST, Participate, VWAP or TWAP) based on suitability. Navigator is customizable and easy to install: routing takes place in the back end, eliminating order entry errors and the need for extensive OMS development work.
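The routing rules inside Navigator are proprietary; the sketch below only illustrates the "algorithm of algorithms" shape, mapping order traits to one of the GSAT strategy names mentioned above with made-up suitability rules:

```python
# "Algorithm of algorithms" sketch: map order traits to a strategy name.
# Strategy names come from the text above; the suitability rules are
# invented for illustration, not Navigator's actual logic.
def pick_strategy(pct_adv: float, alpha_view: str, has_vwap_benchmark: bool) -> str:
    if alpha_view == "urgent":
        return "Participate"      # chase available liquidity now
    if has_vwap_benchmark:
        return "VWAP"             # track the schedule benchmark
    if pct_adv > 0.10:
        return "4CAST"            # large vs. ADV: impact-sensitive strategy
    return "Piccolo"              # small, patient order

print(pick_strategy(0.02, "urgent", False))   # -> Participate
print(pick_strategy(0.15, "neutral", False))  # -> 4CAST
```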

John Wightkin, QSG:

At Quantitative Services Group (QSG) we do not create algorithms. Rather, we help our clients better understand the implementation of their algorithms. Through our patent-pending measures and the use of our tick-based metrics, we can provide our clients with unique insights into the success of the algorithms they employ. What we have seen through our analysis is the need for algorithms to continually evolve. Algorithms appear to lose their advantages over time due to competition. We also believe that the goal of the algorithm should complement the stock selection strategy. We have witnessed the dramatic impact that mismatches between the stock selection and execution strategy can have on overall execution costs. These unintended consequences point to the need to customize algorithms for a firm's specific investment strategy.

Jarrod Yuster, Merrill Lynch:

We are working on a broad set of initiatives that include:

* portfolio-level strategies

* multi-asset strategies

* enhanced global market strategies

Customization will be a key component of next-generation algorithms. At Merrill Lynch, we built our platform with customization in mind: our flexible architecture allows customization based on any of the 150+ algorithmic factors in our databases, which include criteria checks, trade scheduling, limit order pricing and routing options.

Q: Much has been said about reverse engineering. First, do you believe that reverse engineering is happening? Second, if so, how have you designed your algorithms so they cannot be reverse engineered?

Jarrod Yuster, Merrill Lynch:

Yes, there are participants who try to sniff out large institutional trades prior to their execution, regardless of whether those trades stem from an algorithm, a block, or a program trade. At Merrill Lynch, we focus on three areas to prevent our clients from trading in a predictable manner: 1) utilizing varying submission sizes and intervals; 2) using multiple exchange mnemonics and leveraging exchange-specific order types to conceal liquidity; and 3) rigorously analyzing performance results versus our pre-trade expectations to identify any patterns of underperformance.
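The first of those defenses is easy to picture: a deterministic slicer prints a repeating fingerprint on the tape, while jittered sizes and intervals do not. A toy sketch (the distributions are hypothetical, not Merrill Lynch's):

```python
# Randomized child-order sizes and timing, the first defense listed above.
# A fixed-size, fixed-interval slicer is trivially detectable; jitter breaks
# the pattern. All distributions here are hypothetical.
import random

def randomized_children(total: int, avg_size: int = 400):
    """Yield (delay_seconds, size) pairs with jittered size and spacing."""
    remaining = total
    while remaining > 0:
        size = min(remaining, random.randint(avg_size // 2, avg_size * 2))
        delay = random.expovariate(1 / 20)   # mean 20s between children
        remaining -= size
        yield round(delay, 1), size

for delay, size in randomized_children(3000):
    print(f"wait {delay:>5}s, send {size}")
```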

John Wightkin, QSG:

Because of our ability to disentangle the direct impact effects from the market movements associated with a series of executions, we can more clearly identify predatory trading. From our research, we have definitely seen excessive information leakage associated with certain types of algorithms, which points to the possibility of reverse engineering. To stay ahead of these "tape sniffers," algorithms will have to continue to evolve. Also, the users of these algorithms will have to employ the proper measurements and analytical systems to monitor their algorithms for the risk of reverse engineering.

Andrew F. Silverman, Goldman Sachs:

We are certain that reverse engineering occurs in the marketplace. Goldman Sachs continuously monitors its execution quality against a number of quantitative benchmarks. In order to remain competitive, our algorithms are frequently refined to enhance execution quality through improved order placement and randomization. Because the marketplace is a zero-sum game, our improving performance gives us some confidence that our algorithms are not being "gamed". Goldman's algorithms are extremely adaptive to changing market conditions, and an added layer of client-specific customization, based on our clients' trading styles and individual risk profile assessments, yields a multidimensional logic grid that results in effective randomization.

David Mortimer, Piper Jaffray:

We are certain that reverse engineering is happening, and we believe it will grow further as algorithmic trading, and the internalization methods that accompany it, gain greater market share. Reverse engineering is not a concern here at APT because of how we built the foundation of our business. All of our algorithms actually have several different components (algorithms) built into them. Two of those components determine, in milliseconds for each and every "child order" we send, an order placement strategy that is dependent on many market factors, including the market microstructure at that moment in time. Therefore, even in our VWAP algorithms, we employ price-predictive techniques that make each child order inherently unpredictable. Our fully price-predictive ESP Algorithm (Execution thru Statistical Prediction) works our clients' orders using a suite of statistical analysis methodologies, seeking to achieve best price over one hour, one day, or several days.
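Piper's predictors are proprietary, but one common flavor of short-horizon price prediction uses top-of-book order imbalance to choose between passive and aggressive child placement. An illustrative sketch, with a made-up threshold:

```python
# Imbalance-driven child placement: a common flavor of short-horizon price
# prediction, shown only as an illustration. The 0.3 threshold is made up;
# APT's actual predictive components are not shown here.
def place_child(side: str, bid_size: int, ask_size: int) -> str:
    imb = (bid_size - ask_size) / (bid_size + ask_size)
    if side == "buy" and imb > 0.3:
        return "cross the spread"    # book tilted against us; price likely to rise
    if side == "sell" and imb < -0.3:
        return "cross the spread"
    return "post at the touch"       # be patient; let the market come to us

print(place_child("buy", bid_size=9000, ask_size=2000))   # -> cross the spread
print(place_child("buy", bid_size=2000, ask_size=9000))   # -> post at the touch
```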

Derek Morris, BNY Brokerage:

Reverse engineering can manifest itself in many ways. In its most complex and difficult-to-detect form, market participants can discern and exploit repetitive patterns of "canned" algos using automated models that "learn as they go". This is the biggest challenge to so-called "canned" algos. There will always be a player sophisticated enough to exploit the market and profit at the expense of someone else's use of algorithmic technology. By developing customized algorithms with shorter shelf lives, we can make that task far more challenging. It is imperative, therefore, to have the tools available to monitor performance and, importantly, to recognize decaying performance as soon as possible.

For clients who take advantage of our DEx platform offerings, we offer an additional level of anonymity and security with our hosted access to all major liquidity destinations, in other words, true anonymous direct market access. We improve overall performance by connecting our Smart Plus order routing and market access gateway to their execution platform to optimize the search for liquidity.

David Liles, Bernstein:

There are always going to be market participants who try to gain an advantage by discerning the actions of others. This is nothing new; all trading methodologies are potentially vulnerable, and there is no reason to expect that algorithmic trading would be the exception. That being said, the risk can be minimized through proper design. Much of the concern regarding reverse engineering is due to the obvious footprint left by poorly designed versions of first generation algorithms such as those of the time slice or target volume variety. Our algorithms, on the other hand, are based on rigorous quantitative models with a large number of inputs. This makes their actions much harder to detect than those of heuristic, rules-based strategies. We also employ probabilistic elements in our models to introduce a random aspect into the order generation pattern, further reducing the chances of leaving a detectable trail.

Richard Johnson, Miletus Trading:

At Miletus, we did a study of the 500 largest listed stocks and the 500 largest OTC stocks, looking at normalized intra-day price volatility. We were amazed by the volatility spikes we observed at discrete time intervals: large spikes at 15- and 30-minute intervals and smaller spikes at 5-minute intervals. This is clearly an indication of very basic time-slicing and VWAP algorithms at work. These algorithms divide the day into bins and have a target quantity to trade per bin determined by historical U-curves; they start trading passively with limit orders and become more aggressive as the bin progresses, and at the end of the bin whatever remains of the target quantity is sent out using market orders. This type of behavior is very noticeable and can easily be reverse engineered.

To prevent this, it is important to devise a strategy that is not bin-driven, that randomizes time and order size, and that uses many different order types and liquidity sources.
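The detectable pattern is easy to reproduce. The toy slicer below works passively inside each 15-minute bin, then sweeps the residual with a market order exactly at the bin boundary; those synchronized sweeps are what show up as volatility spikes at 5-, 15- and 30-minute marks, and what a randomized, non-bin-driven approach avoids (fill rates here are hypothetical):

```python
# Naive bin-driven slicer, shown to illustrate the detectable footprint
# described above. Volumes and passive fill rates are hypothetical.
import random

bin_minutes, target_per_bin = 15, 6000
for b in range(3):
    filled = 0
    for minute in range(bin_minutes - 1):
        filled += random.randint(0, 500)     # passive limit-order fills
        filled = min(filled, target_per_bin)
    residual = target_per_bin - filled
    print(f"bin {b}: passive {filled}, MARKET ORDER for {residual} at boundary")
# Every such algorithm sweeping at the same boundaries produces the
# synchronized volatility spikes observed in the study.
```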

Tony Huck, ITG:

Reverse engineering, and more importantly, the detection and gaming of algorithms have always been a concern. Firms that do not continually change their underlying methodology open themselves up to reverse engineering and gaming. ITG employs numerous methods such as randomization to thwart efforts to reverse engineer and game the ITG Algorithms. Ultimately, the best defense is a good offense; ITG Algorithms constantly evolve to stay at least one step ahead of this issue.

Brian Fagen, Morgan Stanley:

The dramatic growth of algorithmic trading as a percentage of overall volume has likely attracted other market participants who attempt to spot stocks being traded in a systematic manner. We are continuously refining and updating our trading process to prevent our orders from being spotted in the market. This requires an intensive market microstructure research effort as well as continual detailed analysis of our execution data.

We have a world-class team of researchers who are focused on this issue and others that relate to the overall quality of our execution.

John Coulter, Vhayu:

Since approximately 80 percent of algorithmic trading is done using VWAP, it seems unlikely that many developers are devoting valuable time to reverse engineering other strategies. The most ubiquitous question buy-side traders ask brokers is "what differentiates your algorithms from your competitors'?" I don't think the industry has gotten a tangible response. Comparison is usually done by kicking tires, because there really is no easy way to understand different strategies without looking at the underlying code. That's the major reason why real-time transaction cost analysis will continue to become a necessity on the trading desk: it provides instantaneous feedback. Our Velocity™ product enables brokers and hedge funds to analyze and store each tick simultaneously to devise their own strategies, back-test against historical data, validate against real-time data, publish automated trades to an OMS and calculate real-time TCA to measure the results, all with zero latency.
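In its simplest form, the real-time TCA mentioned here is just the slippage of your own fills against the running market VWAP. A minimal sketch with made-up trades:

```python
# Simplest real-time TCA: slippage of our fills vs. the market VWAP over
# the same period. All trades below are hypothetical.
def vwap(trades):
    """Volume-weighted average price of a list of (price, size) trades."""
    return sum(p * q for p, q in trades) / sum(q for p, q in trades)

market = [(25.00, 2000), (25.02, 1500), (24.98, 3000), (25.01, 1000)]
mine = [(25.01, 500), (25.00, 700)]   # our buy fills

slippage_bps = (vwap(mine) - vwap(market)) / vwap(market) * 1e4
print(f"market VWAP {vwap(market):.4f}, our VWAP {vwap(mine):.4f}, "
      f"slippage {slippage_bps:+.1f} bps")   # positive = paid above market VWAP
```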