The Beginning of Turmoil

Starting around 2007, two developments began that would slowly erode transparency, market stability and confidence in the market. It was also the beginning of an explosion in stock quotes (but not stock trades) and an era of micro-crashes, mini-crashes, flash crashes, IPO opening-day disasters and algos running unchecked that could destroy a major Wall Street firm in 30 minutes.

Those two developments were:

  1. The passage of Reg NMS, which fragmented liquidity by spreading it from a few exchanges to over a dozen, setting up latency arbitrage opportunities.
  2. Computers reached speeds where the largest source of trading latency was the speed of light, leading exchanges to build new trading facilities that offered co-location, a prominent characteristic of HFT.


To be sure, there is considerable debate about when HFT began, and this arises because HFT means different things to different people. Most HFT proponents latch on to this confusion and try to claim the benefits of transparent electronic trading as benefits of HFT. But one thing most people agree on is that HFT is a subset of electronic trading.

The term High Frequency Trading first shows up on Google Trends near the beginning of 2008, with widespread use starting in July 2009. One common feature of HFT is co-location: the practice of running your trading computer in the same room as the exchange's computers. Co-location became more widespread after the adoption of Reg NMS, which provided fair and equal access to exchange market data for any market participant. Reg NMS was adopted in early 2007.

Up until about 2006, the primary source of latency, or delay in processing market trading data, was the speed of computers and networks. Wall Street has always bought the fastest computers available, being among the first industries to adopt the latest technology. Faster computers meant faster processing, which meant less delay between receiving market data and trading on it.

After about 2006, computers and networks reached speeds where the distance between the trading computer and the exchange computer started to matter. This is because light, and thus information, has a distinct maximum speed of approximately 186 miles (300 km) per millisecond (ms). If a computer takes 50 ms to process and act on market data, and the trading computer is 186 miles away, the actual time from information to trade is 52 ms (1 ms to receive + 50 ms to process + 1 ms to transmit). Some would add another millisecond for trade confirmation. Thus a trading computer located 186 miles from the exchange has a disadvantage of 2 ms, or 4%, due to the speed of light, which is probably not enough to matter. But if the processing time drops to 1 ms, the speed-of-light disadvantage balloons to two-thirds of the total, and now location is paramount.
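The arithmetic above is easy to verify. This small sketch (illustrative only; the 186-miles-per-millisecond figure and the processing times come from the text) computes what fraction of total latency is lost to light travel:

```python
# Light, and thus information, covers roughly 186 miles (300 km) per millisecond.
MILES_PER_MS = 186.0

def light_share(distance_miles, processing_ms):
    """Return (total latency in ms, fraction of it due to light travel).

    Two legs: one to receive the market data, one to transmit the order back.
    """
    travel_ms = 2 * (distance_miles / MILES_PER_MS)
    total_ms = processing_ms + travel_ms
    return total_ms, travel_ms / total_ms

# 186 miles away with 50 ms of processing: total 52.0 ms, ~4% from light.
total, share = light_share(186, 50)
print(total, round(share * 100, 1))

# Same distance with 1 ms of processing: total 3.0 ms, ~67% from light.
total, share = light_share(186, 1)
print(total, round(share * 100, 1))
```

Once processing time shrinks, the only remaining lever is distance, which is exactly why co-location became so valuable.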

Once computer technology dropped processing time below about 10 ms, co-location offered a significant advantage to those within the exchange room versus those outside this “inner circle”. Left essentially unchecked² by any market regulation or oversight, programmers exploited this advantage in every way possible, and we can see the results every trading day.

At first, the simple speed advantage was enough to earn fat profits. But as the HFT field became crowded with new participants, the easy profits were gone, and the race to create and exploit new speed advantages and opportunities began. There are many ways to accomplish this. One is to slow down competitors' computers or networks by sending a burst of orders at a crucial time, a practice known as Quote Stuffing. Another: signal fake interest in a stock by rapidly changing the quoted price that investors see, to determine whether there is any real interest – you have plenty of time to cancel the balance of an order before anyone on the outside can execute any real size against it. The list goes on and on.
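A crude way to see Quote Stuffing in data is simply to count quote updates per symbol over a short window and flag outliers. This is a toy sketch – the symbols, counts and threshold are made up for illustration, not drawn from any real feed:

```python
from collections import Counter

def flag_bursts(quotes, threshold):
    """quotes: a list of symbols, one entry per quote update seen in the window.
    Returns the set of symbols whose update count exceeds `threshold`."""
    counts = Counter(quotes)
    return {sym for sym, n in counts.items() if n > threshold}

# Hypothetical one-second window: "XYZ" receives 5,000 updates, "ABC" only 12.
window = ["XYZ"] * 5000 + ["ABC"] * 12
print(flag_bursts(window, threshold=1000))  # {'XYZ'}
```

Real surveillance would need per-exchange timestamps and cancel-to-trade ratios, but the principle – bursts of quotes wildly out of proportion to trades – is the same.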

How High Frequency Trading Harms Even Long Term Investors

  • Investors are warned against using market orders and stops because HFTs can and will suddenly withdraw their quotes. This alone should tell you something is rotten at the core.
  • Some universities, such as Georgetown, can no longer afford to buy TAQ market data for their professors or students to analyze. This will lead to less academic oversight, guidance and involvement, as well as students who are less prepared for careers on Wall Street. Data has become prohibitively expensive because of all the excessive quotes generated by HFT.¹
  • Quote spreads are much wider and less stable during market open, which causes many micro flash crashes in individual stocks.
  • Misleading price quotes interfere with price discovery, one of the core functions of a stock exchange.
  • Mis-allocation of resources, both human and technological.
  • If left unchecked by regulators, traders who want to process quotes will soon need super-computers, 10-gigabit connections and their own engineering staff just to have the same basic level of trading information they had in 2006.
  • HFT generated so much Quote Spam during the flash crash that it took the SEC 5 months to assemble the data.
  • During the flash crash, excessive quotes from HFTs overloaded quote data feeds, causing severe delays: stock quotes from some exchanges fell more than 30 seconds behind during the height of the crash.

Milliseconds, Microseconds and Nanoseconds

Capacity planning is based on peak rates. For the scenarios listed below, we will assume 1,000 symbols and 1 quote per time interval.
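The scenarios that follow fall out mechanically from these assumptions. The sketch below computes, for each time scale, the peak quote rate (1,000 symbols × 1 quote per interval), how far light travels in one interval, and a rough feed bandwidth. The 100-bytes-per-quote message size is an assumption for illustration, not a figure from the text:

```python
SYMBOLS = 1_000                 # stated assumption: 1,000 symbols
LIGHT_KM_PER_S = 300_000        # speed of light, ~186,000 miles per second
BYTES_PER_QUOTE = 100           # assumed quote message size (not from the text)

def scale(interval_s):
    """Peak rates for one quote per symbol per interval of `interval_s` seconds."""
    quotes_per_s = SYMBOLS / interval_s
    km_per_interval = LIGHT_KM_PER_S * interval_s   # how far a quote can travel
    bits_per_s = quotes_per_s * BYTES_PER_QUOTE * 8  # raw feed bandwidth
    return quotes_per_s, km_per_interval, bits_per_s

for name, dt in [("second", 1.0), ("millisecond", 1e-3),
                 ("microsecond", 1e-6), ("nanosecond", 1e-9)]:
    q, km, bps = scale(dt)
    print(f"{name}: {q:,.0f} quotes/s, light moves {km:g} km, "
          f"{bps / 1e9:g} Gbit/s")
```

Under these assumptions the millisecond level already saturates most of a gigabit link, the microsecond level needs hundreds of gigabits, and the nanosecond level approaches a petabit – matching the network classes described below.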

Trading at the Seconds Level

Rational human thought requires at least 1 second of time. In one second, a quote can reach everyone on Earth. A 1 Terabyte hard disk will store about 1.5 years of market data. Practically any Ethernet network is capable of receiving this information.

Trading at the Millisecond Level

Below one second, the realm of human perception teeters on the edge of rationality. A single blink of an eye takes 300–400 ms. In one millisecond, a quote can travel 186 miles (300 km). A 1 Terabyte hard disk will store about 2 days of market data. You will need a gigabit network to receive this information without delay. It takes the SEC a few months to investigate one day's worth of data.

Trading at the Microsecond Level

In one millionth of a second, nothing a human does consciously can be measured. A quote can travel about 1,000 feet (300 m). A 1 Terabyte hard disk will store about 10 seconds' worth of market data, which means you need over 2,000 disks to store one trading day. To receive market pricing information, you would need a terabit network, which is currently a bleeding-edge technology. You will be dead before the SEC finishes investigating an hour's worth of data.

Trading at the Nanosecond Level

In one billionth of a second, light, and therefore market data, travels a mere 13 inches (30 cm). That would require a new word similar to co-location, but meaning locating your machine inside another, as in a blade server. Just 1 second of market data would fill 100 Terabytes. Petabit networks don’t yet exist outside of highly advanced labs. Humans will be living on another planet before the SEC finishes investigating an hour's worth of data.

Trading at the Picosecond Level

You will be forever branded a sensationalist by the mere mention of picosecond (or nanosecond) trading, so try to resist.


Summary of Trading Speed Characteristics

Time Period  | Quotes per second (per symbol) | Farthest a quote can travel | Farthest computer location (round trip) | Network needed | 1 Terabyte hard disk holds
Second       | 1          | 186,000 miles (300,000 km) | Earth             | Ethernet | 1.5 years
Millisecond  | 1 thousand | 186 miles (300 km)         | Philadelphia      | Gigabit  | 2 days
Microsecond  | 1 million  | 1,000 feet (300 m)         | Data center       | Terabit  | 10 seconds
Nanosecond   | 1 billion  | 13 inches (30 cm)          | Exchange computer | Petabit  | 0.01 seconds
Picosecond   | 1 trillion | 0.01 inches (0.3 mm)       | Exchange CPU core | N/A      | 0.00001 seconds

¹ Not factored into this cost is the infrastructure needed to deliver the massive explosion of quote data that came with the arrival of HFT. It is this additional cost that raises the barrier to entry and thus narrows the diversity of market participants.

² Judging by the language in Reg NMS, regulators, and nearly all those who submitted public comments, were either unaware that the speed of light is finite, or assumed the limit was so high that, practically speaking, light (and thus information) travels instantly from one location to another.