Father of Algorithmic Trading Seeks Speed Controls

PROFILE: Thomas Peterffy of Interactive Brokers

Interactive Brokers founder Thomas Peterffy started the practice of sending coded instructions from a broker’s computer to an exchange’s terminal, using a typing cyborg. Now, he believes several “layers” of software are needed in the securities industry’s infrastructure to control high-speed trading.

Interactive Brokers founder Thomas Peterffy is an inadvertent provocateur. He started the computerized trading revolution in the 1980s without intending to.

Peterffy, by trial and error, figured out how to hack into market data feeds with computing technology that was crude by today’s standards. He even used, in effect, a “trading cyborg” with keyboard-pounding fingers to type orders more rapidly than any human on the street.

In the process, he kicked off the use of algorithms, which now account for roughly two-thirds of all trading in the nation’s stock markets. Now, he says, some three to five “layers” of controls need to be placed on the computerized trading he is credited with launching a quarter century ago.

Peterffy is, in effect and in reality, the father of algorithmic trading. Here’s his story, and his recommendations for how to manage markets where all trades, eventually, are handled by coded instructions.

Peterffy took “the brains of the smartest traders and found a way to express those smarts in a series of algorithms. His programming included all the elements that a crack human trader weighed in making a decision,” writes Christopher Steiner in the book Automate This: How Algorithms Came to Rule the World.

The book starts off chronicling Peterffy’s attempts to hook desktop computers into the digitally driven operations of electronic markets. “But the computer took far less time to do the math, check the prices and pull the trigger,” Steiner writes.

The Hungarian immigrant began as a black-box programmer on Wall Street in the 1960s. About a decade later, after starting his own business, Peterffy ran afoul of Nasdaq when he tied an IBM computer to the terminal that brought in quotes from that market and into which new orders were fed.

In effect, he had hacked into the terminal to get its data on trades as they took place and used hard-wired connections back to the terminal to pump in orders generated by mathematical rules.

This was the first fully automated algorithmic trading system in the world, Steiner notes.

But lashing a computer to its terminal violated Nasdaq rules that required orders to be typed into the terminal one by one. The self-regulatory organization told him to unhook the wires from the Nasdaq terminal.

So Peterffy decided to take a different tack. He disconnected the IBM computer.

Then, he and his team of engineers affixed a large lens to the face of the Nasdaq terminal. That enlarged the text it displayed.

A foot away, a camera took in the data and fed it to an attached computer.

Then the data was decoded and plugged into the group’s algorithms.

How to get orders back into the Nasdaq terminal, one by one, using its keyboard only?

Metal rods, pistons and levers. An automated typing machine, as Steiner puts it. At that point, he could abide by Nasdaq’s rules, send out dozens of orders every 30 seconds and leave human traders, he felt, in his dust.

But starting a revolution? Not on his mind at all.


“To tell you frankly, I wasn’t even thinking about that,” Peterffy told Traders Magazine.

The business has changed dramatically since the days when Peterffy skirted Nasdaq rules. What are the next steps for algorithms?

Peterffy says all trading eventually will be done electronically, but he adds a dividing line: Trading based on technical analysis will be figured out and executed exclusively by algorithms. But financial analysis of stock portfolios still will be done by humans.

In either case, the human role will be to oversee the programs that literally carry out orders, he says.

“They will feed the order into the algorithms and the algorithms will execute the order,” he says. “At Interactive, we have algorithms for our customers that our customers can use for any kind of trading.”

Human intervention, he says, will still be needed to manage “position risk in the sense that we need human eyes because we are never sure if any of our algorithms can never go off the deep end.”

Going off “the deep end” is an apt description of what happened, for instance, to Knight Capital on August 1. Its out-of-control algos nearly caused its destruction.

The flood of erroneous orders its computers sent out resulted in $456.7 million of losses for Knight in under 45 minutes.

This week, Knight is expected to field offers for a potential takeover of its operations. In August, it gave away 70% of its shareholders’ equity to a group of investors led by Jefferies & Co.

Indeed, Peterffy concedes that market events, such as what happened to Knight, could repeat because computer programs can never be trusted “100 percent.”

Here’s his prescription for what to do. In particular, he wants rules that would slow high-frequency trades.

“I would like the SEC to slow them down by, say, a second or half a second, or something like that,” Peterffy said.

The exchanges, he contends, have already defined what high-frequency trades are, so that should not be an issue.

“And I wouldn’t slow down all trades; I would only slow down the liquidity-removing trade. There’s a big difference. I wouldn’t change the liquidity-providing trades,” Peterffy adds.
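A “speed bump” along those lines could, in principle, be expressed as a small piece of gateway logic. The sketch below is purely illustrative and assumes simplified order fields, quote data and a half-second delay; it is not a description of Interactive Brokers’ systems or of any actual rule.

```python
import time
from dataclasses import dataclass

@dataclass
class Order:
    symbol: str
    side: str        # "buy" or "sell"
    price: float     # limit price
    quantity: int

def removes_liquidity(order: Order, best_bid: float, best_ask: float) -> bool:
    """A marketable order, priced at or through the opposite quote, takes liquidity."""
    if order.side == "buy":
        return order.price >= best_ask
    return order.price <= best_bid

def submit(order: Order, best_bid: float, best_ask: float, delay_seconds: float = 0.5) -> Order:
    """Hold liquidity-removing orders for a fixed interval; pass passive orders through at once."""
    if removes_liquidity(order, best_bid, best_ask):
        time.sleep(delay_seconds)  # the half-second slowdown applied only to liquidity takers
    return order                   # a real gateway would now route the order to the market

# Example: a buy priced through the offer is delayed; a buy resting below the bid is not.
submit(Order("XYZ", "buy", 10.02, 100), best_bid=10.00, best_ask=10.01)
submit(Order("XYZ", "buy", 9.98, 100), best_bid=10.00, best_ask=10.01)
```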

But there are various ways of slowing down trading. These alternatives include a transaction tax, as has been implemented in France; a minimum quote life, as recommended by the European Union; and letting all buy and sell orders in a given second be matched up and prioritized against each other, eliminating the push to shave more microseconds off getting to the front of the queue.

Peterffy opposes all of these proposed solutions.

Instead, he calls for new layers of protection against erroneous trades. Peterffy notes that today there are various layers of protection for markets.

One set is the single-stock and market-wide circuit breakers put in place by the Securities and Exchange Commission after the flash crash of 2010.

A second layer would enforce the market access rule the SEC passed in 2011. Its goal was to manage risks in electronic trading when brokers allow their customers to send orders directly to exchanges, Peterffy notes.

“And the rule basically says that there must be pre-established risk limits of each broker-dealer and no order should be transmitted by the broker-dealer or any of his customers when risk limits are exceeded,” he adds.

But these layers of protection weren’t enough to prevent these recent market events, he concedes.

The market needs more layers of protection to ensure that bad orders don’t get through the system.

“To these two I would add another layer,” he explains.

“It would evaluate each order in terms of the resulting position as part of the pre-existing position, and if such position could not be supported within the broker’s or its customer’s capital, then the order would have to be rejected,” he told Traders Magazine.

Peterffy says the software layers he is proposing to protect the markets “must be entirely independent of each other. They must be separately activated and may not contain conditions based on the status of another layer.”

All of them would operate at the broker’s side of a transaction, before orders are sent to market.
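What such broker-side layers might look like can be sketched in code, with heavy caveats: the order fields, margin rate and limits below are hypothetical stand-ins, and the functions illustrate the idea of independent pre-trade checks rather than any firm’s actual risk software.

```python
from dataclasses import dataclass

@dataclass
class Order:
    symbol: str
    side: str        # "buy" or "sell"
    price: float
    quantity: int

# One layer: pre-established risk limits per broker-dealer, in the spirit of the market access rule.
def within_order_limits(order: Order, max_order_value: float) -> bool:
    return order.price * order.quantity <= max_order_value

# The added layer Peterffy describes: evaluate the position that would result from the order
# and reject it if that position could not be supported within the available capital.
def resulting_position_supported(order: Order, current_position: int,
                                 capital: float, margin_rate: float = 0.25) -> bool:
    signed_qty = order.quantity if order.side == "buy" else -order.quantity
    resulting_position = current_position + signed_qty
    required_capital = abs(resulting_position) * order.price * margin_rate
    return required_capital <= capital

def accept_order(order: Order, current_position: int, capital: float,
                 max_order_value: float) -> bool:
    # Each layer is evaluated independently of the others; an order must pass every one.
    checks = (
        within_order_limits(order, max_order_value),
        resulting_position_supported(order, current_position, capital),
    )
    return all(checks)

# Example: a buy passes the per-order limit but is rejected because the resulting
# 10,000-share position could not be supported within the assumed capital.
accept_order(Order("XYZ", "buy", 50.0, 10_000), current_position=0,
             capital=100_000.0, max_order_value=1_000_000.0)
```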

“In addition,” Peterffy says, “I would also like to see a safety layer implemented on the exchanges’ side. Exchanges have a more or less well-defined description of what constitutes an erroneous trade. These definitions should be clarified and exchanges should program their systems to reject orders the execution of which would result in an erroneous trade.”

In September, NYSE Euronext, the Nasdaq Stock Market, BATS Global Markets and Direct Edge told the Securities and Exchange Commission they are prepared to set limits on the amount of trading their members conduct in a given session and to shut members down if they exceed pre-set peaks. The exchange-run controls would act as “kill switches” on unusually high order or trade volume.
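A minimal sketch of such an exchange-run kill switch, assuming a pre-set per-member peak and hypothetical member IDs and thresholds, might look like the following; it illustrates the concept, not any exchange’s actual implementation.

```python
from collections import defaultdict

class KillSwitch:
    def __init__(self, session_peaks: dict):
        self.session_peaks = session_peaks      # pre-set peak order volume per member
        self.session_volume = defaultdict(int)  # volume accepted so far this session
        self.disabled = set()                   # members that have been shut down

    def accept(self, member_id: str, order_quantity: int) -> bool:
        if member_id in self.disabled:
            return False                        # member already shut off for the session
        self.session_volume[member_id] += order_quantity
        if self.session_volume[member_id] > self.session_peaks.get(member_id, 0):
            self.disabled.add(member_id)        # trip the kill switch
            return False
        return True

# Example: the second order pushes the member past its pre-set peak and trips the switch.
switch = KillSwitch({"MEMBER-1": 1_000})
switch.accept("MEMBER-1", 600)   # True
switch.accept("MEMBER-1", 600)   # False; MEMBER-1 is now disabled for the session
```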

The regulators, he predicts, will have to adopt these extra protective layers of software to avoid future technical disruptions such as the Knight incident.

“Otherwise,” Peterffy says, “these kinds of things, these market events, will keep on happening.”