Progress always has its downside.
Algorithms, when used properly, indisputably reduce transaction costs. These mathematical marvels can lead to better executions.
However, flawed algorithms – in a nightmare example of math made too simple – can also be read by trading sharpies. They can read the patterns of some algorithms and front run. "There's a lot of fear that reverse-engineering and pennying problems are causing people using algorithms to incur greater costs than they normally would," says Ben Sylvester, senior trader at money manager David L. Babson & Co. "You have to be cautious about how you use algorithms."
However, the algorithms of recent times are not an accident. They are a response to an environment changed by decimalization and other market structure reforms. These changes fueled algorithmic trading as well as the popularity of ECNs and crossing networks. They also promoted the development of direct market access and smart order routing. The reduction in average order size and the decrease in quoted liquidity made algorithmic trading a vital weapon.
A study last year found that 60 percent of buyside firms use algorithmic trading tools to execute orders. Usage is relatively small based on share volume, but it is growing. Traders say they use these low-cost tools primarily in liquid stocks, and to manage workflow so they can focus on more difficult orders.
But what exactly is algorithmic trading? It covers automated trading in which large orders are broken up and sent into the marketplace according to predetermined quantitative rules. These rules could be based on a number of historical volume patterns, the current day's price and volume activity, as well as other trading signals.
Algorithms are designed to match execution benchmarks. Most traders are familiar with VWAP and TWAP engines, which try to achieve a stock's volume-weighted average price or time-weighted average price. Other algorithms aim to match the previous night's close or an implementation shortfall-type benchmark, such as the portfolio manager's decision price or the arrival price – the price of the stock when the trader receives the order.
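For readers unfamiliar with the math, the two most common benchmarks reduce to simple averages. A minimal sketch, using invented fill data rather than any vendor's actual engine:

```python
# Toy illustration of the VWAP and TWAP benchmarks. Each trade is a
# (price, shares) pair; the fill data below is invented for the example.
def vwap(trades):
    """Volume-weighted average price: sum(price * shares) / sum(shares)."""
    total_value = sum(price * shares for price, shares in trades)
    total_shares = sum(shares for _, shares in trades)
    return total_value / total_shares

def twap(trades):
    """Time-weighted average price: each interval's price counts equally,
    regardless of how much volume traded in it."""
    return sum(price for price, _ in trades) / len(trades)

trades = [(20.00, 5000), (20.10, 1000), (20.05, 4000)]
print(round(vwap(trades), 2))  # 20.03
print(round(twap(trades), 2))  # 20.05
```

Note how the heavy 20.00 fill pulls VWAP below TWAP: volume weighting rewards executing more shares at better prices, not just at evenly spaced times.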
Some of the largest providers of algorithmic tools are Credit Suisse First Boston, Goldman Sachs, Bank of America, ITG Inc. and Instinet. Business is good. Algorithmic trading on the buyside may see its share of total order flow rise from five percent in 2004 to 13 percent in 2006, according to the Tabb Group. In comparison, ECN usage over the same period is likely to increase at a slower pace, from 16 percent to 20 percent.
And those numbers may be conservative. A TowerGroup report expects buyside algorithmic trading, based on share volume, to rise from seven percent in 2004 to 21 percent in 2006. Total buyside and sellside algorithmic trading is forecast to increase from 17 percent to 27 percent.
But there is a potential downside to these mathematical wonders. Quantitative Services Group (QSG), a transaction cost consulting firm, released a study that identified "predators" taking advantage of naive algorithmic parceling strategies. The study – which looked at some 7,500 large money manager orders totaling $1.5 billion in the first half of 2004 – found that trades executed according to a random order placement strategy cost two basis points. But those executed in a non-random fashion cost 26 basis points. Both strategies were benchmarked to VWAP.
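The study's cost figures are shortfall against VWAP expressed in basis points. A hypothetical illustration of that arithmetic, with invented prices rather than the QSG data:

```python
# Sketch of cost-versus-VWAP in basis points (1 bp = 0.01 percent).
# For a buy, paying above VWAP is a cost; for a sell, receiving below it is.
def cost_vs_vwap_bps(avg_fill_price, vwap, side="buy"):
    sign = 1 if side == "buy" else -1
    return sign * (avg_fill_price - vwap) / vwap * 10_000

# Invented example: a buy averaging a half-cent over a $25 VWAP costs
# 2 bps; averaging 6.5 cents over costs 26 bps, as in the study's gap.
print(round(cost_vs_vwap_bps(25.005, 25.00), 1))  # 2.0
print(round(cost_vs_vwap_bps(25.065, 25.00), 1))  # 26.0
```

On a $1.5 billion sample, that 24-basis-point gap between random and predictable parceling is worth millions of dollars – which is what makes the leakage finding so striking.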
Information Leakage
John Wightkin, managing partner at QSG, notes that the startling 24-basis point difference is a sign of information leakage. It could also indicate front running, he warned. The study found that the probability of information leakage increased for all types of parceling strategies as the number of executions rose.
Many types of firms engage in front-running, including hedge funds and other proprietary trading operations, Wightkin says. They can pursue fast-turnaround strategies by applying high-frequency statistics to data streams, extracting information about likely short-term price movements.
"The broad distribution of any algorithm should be approached with skepticism. The opportunity to reverse-engineer a technique is provided with every sales pitch that includes information about how the specific trading algorithm is designed to succeed," says Wightkin.
But there are other potential algorithm dangers. Some buyside traders also worry about specialists reading algorithmic order flow so they can identify patterns. Buyside traders say this foreknowledge could drive up a trader's market impact costs. Others say this is a minor problem. They point to the increased regulatory scrutiny specialists now face.
Nevertheless, market observers say many sophisticated professionals will attempt to identify the mysteries of these trading formulas. Wayne Wagner, chairman of transaction cost analysis firm Plexus Group, notes that "signal reading" is bread and butter to thousands of people in the marketplace. These pros include those at proprietary trading desks, hedge funds and day trading firms.
More Readable
At issue is whether algorithmic systems are more "readable" than traditional block trading. "My opinion is that they're much less readable because they chop trades into pieces that look retail," Wagner says. Of course, it's easy to set up a naive algorithm that will get noticed – for example, one that executes 433 shares every three minutes at 17 seconds after the minute. "But that's worse than naive, it's perverse," says Wagner. And it is ripe for exploitation, another trading official warns.
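Wagner's 433-shares-every-three-minutes example shows how little it takes to be spotted. A toy sketch of the kind of test a pattern-watcher might run, using invented fill data:

```python
from statistics import pstdev

# Toy detector: flag a fill stream as "scheduled" when every fill is the
# same size and the gaps between fills are nearly constant. Real signal
# readers use far richer statistics; this only illustrates the principle.
def looks_scheduled(fills, interval_jitter=1.0):
    """fills is a list of (timestamp_seconds, shares) pairs."""
    sizes = {size for _, size in fills}
    gaps = [b - a for (a, _), (b, _) in zip(fills, fills[1:])]
    return len(sizes) == 1 and pstdev(gaps) <= interval_jitter

# Wagner's naive algorithm: 433 shares every 180 seconds, at :17 past.
naive = [(17, 433), (197, 433), (377, 433), (557, 433)]
random_like = [(12, 300), (95, 700), (260, 200), (301, 500)]
print(looks_scheduled(naive))        # True
print(looks_scheduled(random_like))  # False
```

Once a watcher concludes the stream is scheduled, the next child order's time and size are predictable – exactly the exposure the trading official warns about.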
"If a behavior is predictable, it's logical that it can and will be gamed by some party," says Jana Hale, global head of the algorithmic trading group at Goldman Sachs. "Especially in an era of decimalization, traders must be very careful about order placement logic." She notes that Goldman randomizes the way it places orders into the market.
For example, take a customer who wants to buy 12,000 shares of a stock over 10 minutes. The individual orders that constitute those 12,000 shares, Hale says, would look different from each other when they're put into the marketplace, depending on the prevailing market conditions. Orders are randomized by size, time and spread levels, based on the depth of the market at the time and on how the stock is trading compared with historical volume patterns.
The more sophisticated algorithmic providers work to make algorithms game-proof. The idea is to ensure a trader has a low market profile, which in theory reduces impact costs. Unsophisticated algorithms could leave a trader exposed by failing to disguise the user's intentions. "An algorithm must randomize how it sends orders into the market, how it prices orders, and the way it cancels and replaces those orders," says Hitesh Mittal, product manager for algorithmic trading strategies at ITG. "Otherwise, someone could see that order coming and prepare to make money on it." And it is not difficult for veteran traders to do that.
The market's efficiency ensures that "discernible patterns are pretty quickly flushed out," says Bill Yancey, president of ATD Brokerage Services, a subsidiary of trading and technology company Automated Trading Desk. ATD builds algorithmic trading models used by its own traders and institutional clients.
Yancey adds that successful algorithmic trading models don't have to be complex. "We change our models often but because of the behavior of the market rather than a fear that the pattern will be discernible," he says. He notes that some algorithms have a long shelf life. Others have to be tweaked as markets change.
Despite advances in modeling, data-mining remains a huge concern, says Steve Brain, global head of systematic trading and head of the algorithmic trading division at Instinet. However, he's not worried about hedge funds or specialists reverse-engineering complex algorithms. "If that were happening, we'd see a degradation of trading performance from our algorithms," he says. "That's not the case."
Trust Factor
Brain mentions a different problem associated with traditional sellside brokers providing algorithms. "If a hedge fund were to decide to offer a VWAP algorithm for the buyside to trade with, would anyone trust them?" he asks. "Orders would be coming into that hedge fund and they'd see the order flow." Even if the fund set up an independent group that was segregated from its program and block traders, the hedge fund's performance would depend on its proprietary trading, Brain adds. "All it takes is one guy writing that algorithm to have lunch with a guy on the other side and you instantly know how the algorithm is working," he says.
Who are the big proprietary trading firms – the guy on the other side – he's referring to? "A number of them are the investment banks offering these very services to the buyside," Brain says. "It's harder to reverse-engineer something if you don't already know how it works. But if someone will tell you, it's pretty easy."
And if that's happening, sellside firms could essentially be data-mining their own algorithmic trading group, he says. "You don't need to know what stocks your algorithmic trading group is trading and you don't need to know sizes," Brain says. "If you know the footprints and how that algorithm participates in the market, it's probably significantly easier to exploit that pattern without having anything the regulators would consider insider information."
ITG's Tony Huck, managing director of electronic trading, says clients spend a lot of time worrying about being as "stealthy" as possible. "But they are not always aware of who's touching that order flow or who's seeing it," he says.
Brain adds that agency-only brokers, such as Instinet and ITG, don't have the built-in conflicts of interest, such as proprietary trading, that are associated with traditional sellside brokers.
For their part, many sellside brokerages contend that they put up Chinese walls between algorithmic and proprietary trading groups, as well as between block trading and program trading desks. Most algorithmic units are separate departments within the brokerages' equities divisions, according to sellside trading executives. In theory, then, traders on other desks can't pull up algorithmic order flow or peek into the black boxes that spell out an algorithm's logic.
Sign of Confidence
Consider the example of J.P. Morgan. Last summer, it hired eight quantitative modelers to build new algorithms for buyside clients. Robert Kissell, vice president in the algorithmic trading group at J.P. Morgan Securities, stresses that the algorithms were developed by his group and not by the program trading desk. However, they are used internally as well as by external clients. In contrast, Goldman's Hale says it's a sign of confidence that algorithms, whose accuracy and reliability have been demonstrated on her firm's program desk, have been extended to buyside clients.
For buyside traders, a big challenge is making sense of the plethora of algorithmic tools. Traders are often hard pressed to tell the difference between various algorithms that chase the same benchmark. If, for example, half a dozen firms offer VWAP algorithms, which one is best for the trader? They're also unsure how similar algorithms react under different market conditions.
Some of this unfamiliarity will possibly fade as traders become more experienced using and customizing algorithmic tools. Transaction cost analysis will also help, trading executives say. Post-trade analysis of algorithmic executions should enable traders to identify the market impact costs of these tools.
Performance Statistics
Plexus's Wagner notes that clients are beginning to ask for performance statistics on specific broker-provided algorithms. To this end, his firm is now gathering data on algorithmic trading. In his view, traders should weigh the merits of each algorithm.
Meanwhile, sellside firms are aware of perceptions about information leakage. But whether a trade goes through a block desk or an algorithmic tool, the best safeguard may be buyside traders themselves. "A trader knows when an order doesn't feel right – the way the fills are coming back, how long it's taking, and if there's too much slippage," says Tim Christiansen, equity trader at Sawgrass Asset Management. "You develop a feel for which brokers know how to handle the tougher orders."
Indeed, prudence is critical. "You may want to launch smaller pieces into an algorithm and, as that gets done, maybe pipe in some more or route into another algorithm you're comfortable with," Christiansen says. Sawgrass has had mixed results with some sales traders placing orders into their own algorithms. "Algorithms are often only as good as the inputs or constraints," says Christiansen. "If someone puts in a percentage-of-volume target that's off the mark for a thin name, you'll quickly feel the pain."
One trading executive warns that the human element in trading should not be overlooked. "Algorithms will chase the business, but human relationships will always do a better job," says Packy Jones, chairman and CEO of JonesTrading Institutional Services, a firm that stresses its own network of human traders. Still, he concedes that for difficult or less liquid stocks, "there's a cost associated with not knowing what's going on."
One firm helping cautious buyside desks ease into algorithmic trading is Electronic Specialist, an anonymous, execution-only network that launched an algorithmic aggregation portal last November. The portal offers centralized clearing and allows clients to benchmark multiple broker algorithms in real time.
"Customers worry about sending orders into a black box and not knowing who's seeing those orders," says senior managing director Scott Kurland. Buyside traders want to be anonymous and minimize market impact, but uncertainty about modeling makes some anxious, he adds.
If a trader has a large block of stock to work over a few days, "the fear is that someone can pattern that and effectively adapt trading to that particular performance," Kurland says. To counter that risk, some clients spread large orders across a few algorithmic providers so they're represented differently on exchanges and ECNs.
More Popular
As algorithms become more popular, information leakage and gaming have also become part of the debate about execution benchmarks. VWAP has been criticized for its pursuit of the "average" price. In fact, it's now coming under more fire for potentially driving up trading costs.
Babson's Sylvester points out that algorithms based on relative-value benchmarks, such as VWAP, are often "bucket-oriented." That means they want to get a certain volume done within a certain time frame. "Even if brokers randomize that, you have a constant presence in the marketplace," he says. "There's a push to that since you never let the stock relax. Patterns emerge."
Goldman's Hale agrees that VWAP-benchmarked strategies could leave a trader exposed to those searching for patterns. "If a model's logic is very determined and it must get a certain quantity done because it's falling behind its benchmark, it will be more rigid in its order placement," she says. "The more able the algorithm is to absorb real-time inputs, to react and modify itself, the more random the order placement will be and the better the execution is likely to be."
Over the last year the sellside has done more to educate the buyside about algorithms. The crucial part is helping traders figure out "what's the most appropriate algorithm to use for each trade, based on the investment goals, the benchmark and the trader's risk constraints," says J.P. Morgan's Kissell.
Pre-trade Analytics
Meanwhile, sellside firms are expanding pre-trade analytics. These give traders a more detailed idea of what they can expect with their order, based on historical and expected volume patterns. Customizable parameters and toggles are being added.
Algorithmic providers are also developing feedback and diagnostic tools that would eventually enable traders to take greater control of algorithmic orders. And algorithms themselves are becoming more responsive to real-time changes taking place in the markets. But there is an irony in all of this, according to some trading executives. As trading becomes more automated, there will be a push to have algorithms perform like human traders.
(c) 2005 Traders Magazine and SourceMedia, Inc. All Rights Reserved.