May 26, 2005

Could More Volatility Sabotage Algorithms?

By Peter Chapman

The surge in algorithmic trading will likely come to an end if the market becomes more volatile. That's what some top brokerage execs said at a trading and technology conference in New York City. Deutsche Bank's Greg Sharenow and Credit Suisse First Boston's Dan Mathisson expect capital commitment to replace algorithmic trades if volatility levels rise.

"When volatility picks up," Sharenow told the TradeTech USA conference, "algorithms as they are today will be less effective." Sharenow said he would prefer more volatility. Volatility, as measured by the Chicago Board Options Exchange's Volatility Index, or VIX, is at its lowest level in eight years. It hit 15.38 recently, a level not seen since 1996. Low levels of volatility mean a more predictable trading environment. That permits computer-generated trading strategies to perform more effectively. With a choppier trading environment, buyside traders are more likely to pay up for brokers' capital rather than risk bad trades.

Mathisson agreed that higher volatility would make his firm's algorithms less desirable. He suggested, however, that market volatility may have become permanently dampened. "Part of the reason we are seeing low volatility," Mathisson said, "is because of the algorithms." An algorithmic trade sends only a few hundred shares into the market at a time, Mathisson noted, so the strategy is less likely to push a stock's price dramatically up or down. CSFB says it processes about half of its 200 million daily shares via algorithms.
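Mathisson's point about small child orders can be made concrete with a minimal order-slicing sketch. Everything here is an assumption for illustration, including the 300-share slice size, the five-minute spacing, and the slice_order helper; it is not a description of CSFB's actual algorithms.

from dataclasses import dataclass

@dataclass
class ChildOrder:
    symbol: str
    shares: int
    minutes_from_open: int

def slice_order(symbol, total_shares, slice_size=300, interval_min=5):
    """Split a large parent order into small, evenly spaced child orders."""
    children = []
    remaining = total_shares
    minute = 0
    while remaining > 0:
        qty = min(slice_size, remaining)
        children.append(ChildOrder(symbol, qty, minute))
        remaining -= qty
        minute += interval_min
    return children

# A 100,000-share parent order becomes 334 child orders of at most 300 shares each,
# drip-fed over the day instead of hitting the market all at once.
print(len(slice_order("XYZ", 100_000)))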