Algorithmic trading



Algorithmic trading, also called algo trading and black-box trading, encompasses trading systems that rely heavily on complex mathematical formulas and high-speed computer programs to determine trading strategies.[1][2] These strategies use electronic platforms to enter trading orders with an algorithm that executes pre-programmed trading instructions accounting for a variety of variables such as timing, price, and volume. Algorithmic trading is widely used by investment banks, pension funds, mutual funds, and other buy-side (investor-driven) institutional traders to divide large trades into several smaller trades and thereby manage market impact and risk.
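To make the order-splitting idea concrete, here is a minimal Python sketch; the ChildOrder structure, the "XYZ" symbol, and the schedule are hypothetical, invented purely for illustration and not drawn from any real trading system.

```python
# Hypothetical sketch: divide a large parent order into evenly spaced child
# orders so that no single order dominates the visible market.
from dataclasses import dataclass

@dataclass
class ChildOrder:
    symbol: str
    side: str
    quantity: int
    release_second: int  # seconds after the start of the schedule

def slice_order(symbol: str, side: str, total_qty: int,
                num_slices: int, interval_s: int) -> list[ChildOrder]:
    """Split total_qty into num_slices child orders, one every interval_s seconds."""
    base, remainder = divmod(total_qty, num_slices)
    orders = []
    for i in range(num_slices):
        qty = base + (1 if i < remainder else 0)  # spread any remainder evenly
        orders.append(ChildOrder(symbol, side, qty, i * interval_s))
    return orders

# Example: split a 100,000-share buy into 20 child orders, one every 30 seconds.
for child in slice_order("XYZ", "BUY", 100_000, 20, 30)[:3]:
    print(child)
```

In practice, execution algorithms typically randomize child-order sizes and timing to avoid detection by other participants; the even split above shows only the basic mechanics.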

Algorithmic trading may be used in any investment strategy or trading strategy, including market making, inter-market spreading, arbitrage, or pure speculation (including trend following). The investment decision and implementation may be augmented at any stage with algorithmic support or may operate completely automatically.
Many types of algorithmic or automated trading activity can be described as high-frequency trading (HFT), a specialized form of algorithmic trading characterized by high turnover and high order-to-trade ratios.[6] As a result, in February 2012 the Commodity Futures Trading Commission (CFTC) formed a special working group that included academics and industry experts to advise the CFTC on how best to define HFT.[7][8] HFT strategies utilize computers that make elaborate decisions to initiate orders based on information received electronically, before human traders can process the information they observe. Algorithmic trading and HFT have dramatically changed the market microstructure, particularly in the way liquidity is provided.
Profitability projections by the TABB Group, a financial services industry research firm, put the US equities HFT industry at US$1.3 billion before expenses for 2014,[10] significantly down from the maximum of US$21 billion that the 300 securities firms and hedge funds then specializing in this type of trading took in profits in 2008,[11] a figure the authors had called "relatively small" and "surprisingly modest" compared with the market's overall trading volume. In March 2014, Virtu Financial, a high-frequency trading firm, reported that over five years the firm as a whole was profitable on 1,277 out of 1,278 trading days,[12] losing money on just one day, empirically demonstrating the law of large numbers benefit of trading thousands to millions of tiny, low-risk and low-edge trades every trading day.
A third of all European Union and United States stock trades in 2006 were driven by automatic programs, or algorithms.[15] As of 2009, studies suggested HFT firms accounted for 60–73% of all US equity trading volume, with that number falling to approximately 50% in 2012.[16][17] In 2006, at the London Stock Exchange, over 40% of all orders were entered by algorithmic traders, with 60% predicted for 2007. American and European markets generally have a higher proportion of algorithmic trades than other markets, and estimates for 2008 range as high as an 80% proportion in some markets. Foreign exchange markets also have active algorithmic trading (about 25% of orders in 2006).[18] Futures markets are considered fairly easy to integrate into algorithmic trading,[19] with about 20% of options volume expected to be computer-generated by 2010.[20] Bond markets are moving toward more access to algorithmic traders.[21]
Algorithmic trading and HFT have been the subject of much public debate since the U.S. Securities and Exchange Commission and the Commodity Futures Trading Commission said in reports that an algorithmic trade entered by a mutual fund company triggered a wave of selling that led to the 2010 Flash Crash.[22][23][24][25][26][27][28][29] The same reports found that HFT strategies may have contributed to subsequent volatility by rapidly pulling liquidity from the market. As a result of these events, the Dow Jones Industrial Average suffered its second largest intraday point swing ever to that date, though prices quickly recovered. (See List of largest daily changes in the Dow Jones Industrial Average.) A July 2011 report by the International Organization of Securities Commissions (IOSCO), an international body of securities regulators, concluded that while "algorithms and HFT technology have been used by market participants to manage their trading and risk, their usage was also clearly a contributing factor in the flash crash event of May 6, 2010."[30][31] However, other researchers have reached a different conclusion. One 2010 study found that HFT did not significantly alter trading inventory during the Flash Crash.[32] Some algorithmic trading ahead of index fund rebalancing transfers profits from fund investors to the algorithmic traders.
Computerization of the order flow in financial markets began in the early 1970s, with landmarks including the introduction of the New York Stock Exchange's "designated order turnaround" system (DOT, and later SuperDOT), which routed orders electronically to the proper trading post, where they were executed manually. The "opening automated reporting system" (OARS) aided the specialist in determining the market-clearing opening price.
Program trading is defined by the New York Stock Exchange as an order to buy or sell 15 or more stocks valued at over US$1 million total. In practice this means that all program trades are entered with the aid of a computer. In the 1980s, program trading became widely used in trading between the S&P 500 equity and futures markets.
In stock index arbitrage a trader buys (or sells) a stock index futures contract such as the S&P 500 futures and sells (or buys) a portfolio of up to 500 stocks (often a much smaller representative subset) at the NYSE matched against the futures trade. The program trade at the NYSE would be pre-programmed into a computer to enter the order automatically into the NYSE's electronic order routing system at a time when the futures price and the stock index were far enough apart to make a profit.
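The "far enough apart" condition can be sketched with a standard cost-of-carry fair-value calculation. Everything in the following sketch (the prices, rates, and the 10 basis-point cost threshold) is hypothetical and deliberately simplified, ignoring dividend timing and execution risk:

```python
import math

def futures_fair_value(spot_index: float, r: float, div_yield: float, t_years: float) -> float:
    """Cost-of-carry fair value of an index futures contract."""
    return spot_index * math.exp((r - div_yield) * t_years)

def arb_signal(futures_price: float, spot_index: float, r: float,
               div_yield: float, t_years: float, cost_bps: float) -> str:
    """Return the arbitrage direction if the basis exceeds an assumed round-trip cost."""
    fair = futures_fair_value(spot_index, r, div_yield, t_years)
    edge_bps = (futures_price - fair) / fair * 10_000
    if edge_bps > cost_bps:
        return "SELL futures / BUY stock basket"   # futures rich vs. cash
    if edge_bps < -cost_bps:
        return "BUY futures / SELL stock basket"   # futures cheap vs. cash
    return "no trade"

# Hypothetical example: index at 2000, futures at 2012, three months to expiry.
print(arb_signal(2012.0, 2000.0, r=0.02, div_yield=0.019, t_years=0.25, cost_bps=10))
```

With these made-up numbers the futures trade roughly 57 basis points rich to fair value, so the sketch signals selling the futures against buying the basket.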
At about the same time, portfolio insurance was designed to create a synthetic put option on a stock portfolio by dynamically trading stock index futures according to a computer model based on the Black–Scholes option pricing model.
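A minimal sketch of the replication logic, assuming the standard Black–Scholes put delta and purely illustrative parameters: the synthetic put is maintained by holding short futures in proportion to the delta, and as the index falls the delta grows more negative, so the model sells more futures.

```python
import math

def norm_cdf(x: float) -> float:
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def put_delta(s: float, k: float, r: float, sigma: float, t: float) -> float:
    """Black–Scholes delta of a European put (a negative number)."""
    d1 = (math.log(s / k) + (r + 0.5 * sigma**2) * t) / (sigma * math.sqrt(t))
    return norm_cdf(d1) - 1.0

# Illustrative parameters: strike 100, 2% rate, 20% volatility, six months.
# The hedge sells more futures as the index falls, which is the feedback
# mechanism the Brady report associated with the 1987 crash.
for spot in (110, 100, 90):
    d = put_delta(spot, k=100, r=0.02, sigma=0.20, t=0.5)
    print(f"index={spot:>3}  futures hedge ratio = {d:+.3f}")
```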
Both strategies, often simply lumped together as "program trading", were blamed by many (for example, by the Brady report) for exacerbating or even starting the 1987 stock market crash. Yet the impact of computer-driven trading on stock market crashes is unclear and widely discussed in the academic community.[36]
Financial markets with fully electronic execution and similar electronic communication networks developed in the late 1980s and 1990s. In the U.S., decimalization, which changed the minimum tick size from 1/16 of a dollar (US$0.0625) to US$0.01 per share, may have encouraged algorithmic trading as it changed the market microstructure by permitting smaller differences between the bid and offer prices, decreasing the market-makers' trading advantage, thus increasing market liquidity.
This increased market liquidity led to institutional traders splitting up orders according to computer algorithms so they could execute orders at a better average price. These average-price benchmarks are measured and calculated by computers, typically using the time-weighted average price (TWAP) or, more commonly, the volume-weighted average price (VWAP).
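Both benchmarks are simple to compute. The sketch below, using made-up fills, shows that TWAP is a plain average over evenly spaced price observations, while VWAP weights each price by its traded volume:

```python
def twap(prices: list[float]) -> float:
    """Time-weighted average price: a plain mean over evenly spaced observations."""
    return sum(prices) / len(prices)

def vwap(prices: list[float], volumes: list[float]) -> float:
    """Volume-weighted average price: each trade weighted by its size."""
    total_volume = sum(volumes)
    return sum(p * v for p, v in zip(prices, volumes)) / total_volume

# Hypothetical fills during the day: (price, volume) pairs.
prices  = [10.00, 10.05, 10.02, 9.98]
volumes = [500,   2000,  1500,  1000]
print(f"TWAP = {twap(prices):.4f}")           # 10.0125
print(f"VWAP = {vwap(prices, volumes):.4f}")  # 10.0220
```

Because VWAP leans toward the prices at which most volume actually traded, executing near VWAP is commonly taken as evidence that a sliced order did not move the market.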
A further encouragement for the adoption of algorithmic trading in the financial markets came in 2001, when a team of IBM researchers published a paper[38] at the International Joint Conference on Artificial Intelligence showing that, in experimental laboratory versions of the electronic auctions used in the financial markets, two algorithmic strategies (IBM's own MGD, and Hewlett-Packard's ZIP) could consistently outperform human traders. MGD was a modified version of the "GD" algorithm invented by Steven Gjerstad and John Dickhaut in 1996–97;[39] the ZIP algorithm had been invented at HP by Dave Cliff in 1996.[40] In their paper, the IBM team wrote that the financial impact of their results, showing MGD and ZIP outperforming human traders, "...might be measured in billions of dollars annually"; the IBM paper generated international media coverage.
As more electronic markets opened, other algorithmic trading strategies were introduced. These strategies are more easily implemented by computers, because machines can react more rapidly to temporary mispricing and examine prices from several markets simultaneously. Examples include Chameleon (developed by BNP Paribas), Stealth (developed by Deutsche Bank), and Sniper and Guerilla (developed by Credit Suisse[41]), as well as strategies such as arbitrage, statistical arbitrage, trend following, and mean reversion.
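As a toy illustration of the last of these, a mean-reversion rule can be expressed as a z-score of the latest price against its rolling mean; the window length, entry threshold, and price series below are arbitrary choices made for the example:

```python
import statistics

def zscore_signal(prices: list[float], window: int, entry_z: float) -> str:
    """Flag a mean-reversion entry when the latest price sits more than
    entry_z standard deviations away from its rolling mean."""
    recent = prices[-window:]
    mean = statistics.fmean(recent)
    std = statistics.stdev(recent)
    z = (prices[-1] - mean) / std
    if z > entry_z:
        return f"z={z:+.2f}: price rich vs. mean -> SELL"
    if z < -entry_z:
        return f"z={z:+.2f}: price cheap vs. mean -> BUY"
    return f"z={z:+.2f}: no trade"

# Hypothetical price series with a spike in the final observation.
series = [100.0, 100.4, 99.8, 100.1, 100.3, 99.9, 100.2, 101.8]
print(zscore_signal(series, window=8, entry_z=2.0))  # final price ~2.3 sigma rich
```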
This type of trading is what drives the demand for low-latency proximity hosting and global exchange connectivity, and it is imperative to understand latency when putting together a strategy for electronic trading. Latency refers to the delay between the transmission of information from a source and its reception at a destination. Latency has a lower bound determined by the speed of light: light in a vacuum covers 1,000 kilometres in about 3.3 milliseconds, and in optical fibre, where it travels at roughly two-thirds of that speed, the same distance takes closer to 5 milliseconds. Any signal-regenerating or routing equipment introduces latency on top of this propagation baseline.
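The propagation arithmetic is easy to verify. The snippet below assumes silica fibre with a refractive index of about 1.47 (so light travels at roughly c/1.47) and an approximate New York–London great-circle distance; both figures are assumptions for illustration only:

```python
# Back-of-the-envelope propagation delay: light travels at c in a vacuum and
# at roughly c / 1.47 in silica optical fibre (refractive index ~1.47).
C_VACUUM_KM_PER_MS = 299_792.458 / 1000  # ~299.8 km per millisecond

def one_way_delay_ms(distance_km: float, refractive_index: float = 1.47) -> float:
    """Lower bound on one-way propagation delay over a fibre path."""
    return distance_km * refractive_index / C_VACUUM_KM_PER_MS

for route_km in (1_000, 5_570):  # 5,570 km is roughly New York to London
    print(f"{route_km:>5} km: vacuum {route_km / C_VACUUM_KM_PER_MS:5.2f} ms, "
          f"fibre {one_way_delay_ms(route_km):5.2f} ms")
```

Real routes are longer than the great-circle distance and add switching and regeneration delays, so measured latencies exceed these lower bounds.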