Algorithmic trading is the use of software to place trading orders according to predetermined criteria, which may take into account time, price, and trading volume. It allows trades to be executed without human intervention.
Algorithmic trading is especially widely used by investment banks, pension funds, mutual funds, and other institutional traders. It allows large trades to be broken into much smaller ones, reducing market impact and the attendant risk. Market makers and some hedge funds provide market liquidity by generating and executing orders automatically.
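The order-slicing idea above can be sketched as a minimal time-weighted (TWAP-style) split of a parent order into equal child orders. The function name and interface here are hypothetical illustrations, not any particular platform's API:

```python
def twap_slices(total_qty: int, num_slices: int) -> list[int]:
    """Split a large parent order into near-equal child orders (TWAP-style).

    Any remainder is spread over the first slices so the quantities
    always sum back to the original order size.
    """
    base, rem = divmod(total_qty, num_slices)
    return [base + (1 if i < rem else 0) for i in range(num_slices)]

# Example: split a 100,000-share order into 8 child orders,
# to be submitted at regular intervals over the trading day.
print(twap_slices(100_000, 8))  # -> [12500, 12500, ..., 12500]
```

In practice such schedules are often volume-weighted (VWAP) rather than purely time-weighted, but the splitting principle is the same.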
High-frequency trading
A special class of algorithmic trading is known as high-frequency trading, and many types of algorithmic trading qualify as high-frequency. High-frequency strategies use computers to make complex decisions based on electronic information, processing it far faster than any human could.
Algorithmic trading has permanently changed the microstructure of the market, particularly with respect to the liquidity on offer. Algorithmic trading can be used in any investment strategy, including market timing, inter-exchange spreads, arbitrage, or pure speculation (including trend-following strategies). Investment decisions and their execution can be supported by algorithms at individual stages, or handed over entirely to automated trading.
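As a toy illustration of the trend-following rules mentioned above, here is a minimal moving-average crossover sketch. The window sizes and function names are illustrative assumptions, not a strategy anyone should trade:

```python
def sma(prices: list[float], window: int) -> float:
    """Simple moving average over the trailing `window` prices."""
    return sum(prices[-window:]) / window

def trend_signal(prices: list[float], fast: int = 3, slow: int = 5) -> str:
    """Return 'buy' when the fast SMA is above the slow SMA, else 'sell'.

    With fewer than `slow` observations there is no signal yet.
    """
    if len(prices) < slow:
        return "hold"
    return "buy" if sma(prices, fast) > sma(prices, slow) else "sell"

# A rising price series: the short-term average pulls above
# the long-term one, which a trend follower reads as a buy.
print(trend_signal([100, 101, 103, 104, 106, 108]))  # -> buy
```

Real trend-following systems add position sizing, transaction-cost models, and risk limits on top of a signal like this; the crossover itself is only the decision core.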
In 2006, a third of all stock trades in the EU and the US were executed by trading algorithms. In 2009, about 65% of trading volume in US markets came from high-frequency trading algorithms, although by 2012 their share had fallen to 50%.
In 2006, over 40% of all orders on the London Stock Exchange were placed by trading algorithms. American and European markets generally have a higher share of algorithmic trading than other markets. Algorithmic trading is also used in the foreign exchange market, where roughly 20% of currency options volume is generated by algorithms. Bond markets are moving in the same direction.
One of the main problems with high-frequency trading is that its profitability is hard to determine. A report released by the TABB Group in August 2009 states that the 300 firms and hedge funds specializing in this type of trading earned $21 billion in 2008, a figure the report's authors described as "small" and "surprisingly modest" relative to the market's total trading volume.
Algorithmic trading and high-frequency trading are the subject of intense public debate. For example, in the United States, the Securities and Exchange Commission and the Commodity Futures Trading Commission stated that an algorithmic trade by a mutual fund triggered a wave of selling and the short-lived market plunge of 2010. It has also been noted that high-frequency trading can contribute to increased market volatility.
Computerization of order flow in financial markets began in the early 1970s, when the New York Stock Exchange introduced the Designated Order Turnaround system (DOT, later SuperDOT), which routed orders electronically to the proper trading post, where they were executed manually. In the 1980s, program trading became widely used in stock and futures markets.
Financial markets with fully electronic execution and electronic communication networks (ECNs) developed in the late 1980s and 1990s. In the United States, decimalization, which changed the minimum tick size from 1/16 of a dollar ($0.0625) to $0.01, also favored algorithmic trading by changing the market microstructure: smaller differences between bid and ask prices became possible. As a result, the trading advantage of market makers shrank and market liquidity grew. This increased liquidity led institutional traders to split their orders using trading algorithms so as to execute them at the best average price.
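The effect of decimalization on quoting can be illustrated with a small sketch that rounds a desired price to a venue's minimum tick. The helper below is a hypothetical illustration, not any exchange's API:

```python
def round_to_tick(price: float, tick: float) -> float:
    """Round a price to the nearest valid multiple of the minimum tick."""
    return round(round(price / tick) * tick, 10)

# Under fractional pricing the minimum tick was 1/16 of a dollar;
# after decimalization it became $0.01, allowing much tighter quotes.
print(round_to_tick(25.018, 1 / 16))  # -> 25.0  (nearest sixteenth)
print(round_to_tick(25.018, 0.01))    # -> 25.02 (nearest cent)
```

The finer grid means a market maker quoting one tick inside the spread concedes at most a cent instead of 6.25 cents, which is why decimalization compressed spreads and eroded market makers' edge.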
Algorithmic trading was developed further in 2001, when an IBM team published a report presenting the results of a laboratory experiment with two trading algorithms (IBM's MGD and Hewlett-Packard's ZIP). It showed that trading algorithms could significantly outperform human traders (the report put the potential difference at billions of dollars).
Development of algorithms
As markets became electronic, new trading algorithms were introduced. They responded faster to temporary price mismatches and could evaluate prices across several markets simultaneously. Work once performed by people was thus transferred to computers, and the speed of a computer connection, measured in milliseconds (0.001 seconds) and even microseconds (0.000001 seconds), became critically important.
The most automated venues in the United States, such as NASDAQ (the National Association of Securities Dealers Automated Quotations), Direct Edge, and BATS, took significant market share from less automated ones such as the NYSE (New York Stock Exchange). Electronic trading at scale drove down fees and commissions and pushed exchanges toward international mergers and the consolidation of financial markets.
Exchanges now compete constantly on order-processing speed. Back in June 2007, for example, the London Stock Exchange launched a new system called TradElect, capable of executing an order in 10 milliseconds with a throughput of 3,000 orders per second. Some exchanges have since cut execution latency to 3 milliseconds. This matters most to high-frequency traders, who must be as fast as possible to stay one step ahead of their competitors. Spending on computers and software in the financial industry already exceeded $26 billion in 2005, and the figures are far higher now.