
In the early 2010s, the UK government commissioned a report on the future of computer trading in financial markets, bringing together insights from researchers, professionals and policy-makers in finance. Ten years on, the Systemic Risk Centre hosted a reunion of contributors to that report to explore its impact, failures and successes; how technologies and market structures have developed over the past decade; and the economic and financial consequences for liquidity, resilience, efficiency, international competition, fairness, the costs of transacting and the cost of capital.

The Foresight project, which reported in 2012, was commissioned by the UK’s Government Office for Science to explore how computer-based trading (CBT) in financial markets across the world would evolve over the following ten years, identifying potential risks and opportunities, notably for financial stability but also for other market outcomes, such as volatility, liquidity, price efficiency and price discovery. The report drew on research evidence, scientific expertise and insights from stakeholders to provide advice to policy-makers, regulators and legislators on options for addressing present and future risks while realising potential benefits.

A key message of the report was that: ‘despite commonly held negative perceptions, the available evidence indicates that high frequency trading (HFT) and algorithmic trading (AT) may have several beneficial effects on markets. However, HFT/AT may cause instabilities in financial markets in specific circumstances. This Project has shown that carefully chosen regulatory measures can help to address concerns in the shorter term. However, further work is needed to inform policies in the longer term, particularly in view of likely uncertainties and lack of data. This will be vital to support evidence-based regulation in this controversial and rapidly evolving field.’ (Foresight, 2012).

A little over ten years on, Sir John Beddington (who was the government’s chief scientific adviser at the time of the report), together with Kevin Houstoun of Rapid Addition and Jean-Pierre Zigrand of the London School of Economics (both members of the project’s lead expert group), organised a conference to review what has happened since, how far developments have matched the report’s expectations, and what the future may hold. The event, which was hosted by the Systemic Risk Centre at LSE in the spring of 2023, brought together many of the original contributors to the Foresight project.

The impact of technological developments

Kevin Houstoun outlined key elements of the prospective technological environment when the Foresight report was written. First, it noted that there would be increasing availability of substantially cheaper computing power, particularly through cloud computing. Those who embraced this technology would benefit from faster and more intelligent trading systems.

Second, special-purpose silicon chips would gain ground on conventional computers. The increased speed would provide an important competitive edge through better and faster simulation and analysis, and within transaction systems.

Third, computer-designed and computer-optimised robot traders would become more prevalent. In time, they could replace algorithms designed and refined by people, posing new challenges for understanding their effects on financial markets and for their regulation.

Fourth, opportunities would continue to open up for small and medium-sized firms offering ‘middleware’ technology components, driving further changes in market structure. Such components can be purchased and plugged together to form trading systems that were previously the preserve of much larger institutions. Houstoun noted: ‘We’re a company that actually followed up on one of the recommendations in the report, which was that the opportunity exists for middleware providers to provide software to banks and broker dealers, and some of the world’s biggest banks now use our software.’

Houstoun commented that the report’s expectations about the growing role of computers, clouds and artificial intelligence in financial markets have largely proved right, but that two particular areas were missed: the emergence of digital currencies; and the increasing provision of software, computing time and other resources ‘as a service’. He then went on to describe three key challenges arising from future technological developments.

First, the extent to which different markets embrace new technology will critically affect their competitiveness and therefore their global position. The new technologies mean that major trading systems can exist almost anywhere. Emerging economies may come to challenge the long-established dominance of major European and US cities as global hubs for financial markets if the former capitalise faster on the technologies and the opportunities they present.

Second, the new technologies will continue to have profound implications for the workforce required to service markets, both in terms of numbers employed in specific jobs and the skills required. Machines can increasingly undertake a range of jobs for less cost, with fewer errors and at much greater speed. As a result, for example, the number of traders engaged in on-the-spot execution of orders has fallen sharply in recent years, and it is likely to continue to fall further in the future. But the mix of human and robot traders is likely to continue for some time, although this will be affected by other important factors, such as future regulation.

Third, markets are already ‘socio-technical’ systems, combining human and robot participants. Understanding and managing these systems to prevent undesirable behaviour in both humans and robots will be key to ensuring effective regulation. While the Foresight report demonstrated that there had been some progress in developing a better understanding of markets as socio-technical systems, greater effort is needed in the longer term. This would involve an integrated approach combining social sciences, economics, finance and computer science.

The impact of HFT and market structure

Speakers during the rest of the conference focused on the impact of CBT on a range of market outcomes including liquidity, speed, transactions costs, retail order flow and financial stability. Oliver Linton of the University of Cambridge pointed out that despite a great deal of negative commentary about HFT, the best evidence suggests that CBT in general and HFT in particular have several beneficial effects on average market quality (Linton and Mahmoodzadeh, 2018).

First, they have contributed to improvements in the liquidity of markets as measured by spreads and other metrics. Second, they have contributed to improvements in transaction costs for both retail investors and institutional investors, mostly due to changes in market structure that are related to the developments of HFT. Third, they have contributed to improvements in market efficiency by getting new information into prices faster and by linking fragmented market places together, thereby enabling competition.
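These measures lend themselves to simple calculation from trade and quote data. As a rough illustration only (not code from Linton and Mahmoodzadeh, 2018), the sketch below shows how quoted and effective spreads are typically computed; the column names and sample values are assumptions for the example.

```python
# Illustrative sketch: two standard liquidity metrics computed from quote and trade data.
# Column names ("bid", "ask", "price", "mid") are assumed for this example, not a standard schema.
import pandas as pd

def quoted_spread_bps(quotes: pd.DataFrame) -> pd.Series:
    """Quoted spread in basis points: (ask - bid) relative to the quote midpoint."""
    mid = (quotes["ask"] + quotes["bid"]) / 2
    return 1e4 * (quotes["ask"] - quotes["bid"]) / mid

def effective_spread_bps(trades: pd.DataFrame) -> pd.Series:
    """Effective spread in basis points: 2 * |trade price - prevailing midpoint| / midpoint."""
    return 1e4 * 2 * (trades["price"] - trades["mid"]).abs() / trades["mid"]

quotes = pd.DataFrame({"bid": [9.99, 10.00], "ask": [10.01, 10.02]})
print(quoted_spread_bps(quotes).mean())  # average quoted spread across the sample, in basis points
```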

But, Linton added, while liquidity has improved overall, there appears to be the potential for increased periodic illiquidity or liquidity crises, such as the US flash crash of 6 May 2010 and various other events since then. Such issues can arise through feedback loops generated within the computerised trading process itself, which, once started, can amplify over time even under well-intentioned management and control processes.

Spyros Skouras of the Athens University of Economics and Business focused on market speed, drawing on his work on financial ecology with Doyne Farmer, which they developed as part of the Foresight project. Their ecological perspective involves thinking of algorithms as species: a diverse range of algorithms, each with a specialised function, interact strongly with one another to produce systemic outcomes that would not be obvious, and are less well understood, at the level of any individual algorithm (Farmer and Skouras, 2013).

Skouras noted that algorithms in the last few decades have become increasingly fast: time is now measured in nanoseconds, which would have been unimaginable 20 or 30 years ago. He and his co-author examined the extent to which this is a useful development. They distinguished between two types of speed: relative speed, which is being faster than other people; and absolute speed, which is the speed at which the market as a whole functions.

Their analysis differed from that of others who had considered the value of speed. They looked at the value of being first in the queue of a limit order book: whether a trader’s resting order would be executed first or second, at the price at which it had been placed, when an incoming order arrived. Others had studied the value of speed when trying to pick off a limit order or when arbitraging markets that trade similar products.

It turns out that the value of speed implied by the traditional calculation is far smaller than what Farmer and Skouras find: because the value of being first in the queue is so big, the value of being relatively faster than other traders is extremely large. This means that relative speed has an even greater private value than is generally appreciated, while increases in absolute speed no longer improve markets.
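To see why queue position is so valuable under price-time priority, the toy Monte Carlo below compares how often the first and the second resting order at a price are filled. It is a hedged sketch with made-up parameters, not the Farmer and Skouras calculation.

```python
# Toy Monte Carlo: under price-time priority, the first resting order at a price fills far
# more often than the second. All parameters are illustrative assumptions.
import random

def fill_probabilities(n_sims: int = 100_000, resting_size: int = 100,
                       mean_incoming: float = 120.0, cancel_prob: float = 0.5):
    first_fills = second_fills = 0
    for _ in range(n_sims):
        if random.random() < cancel_prob:
            continue  # the quote moves away and both resting orders are cancelled unfilled
        incoming = random.expovariate(1.0 / mean_incoming)  # marketable flow arriving at this price
        if incoming >= resting_size:
            first_fills += 1                  # first in the queue is filled completely
            if incoming >= 2 * resting_size:
                second_fills += 1             # second is filled only if enough flow remains
    return first_fills / n_sims, second_fills / n_sims

print(fill_probabilities())  # e.g. roughly (0.22, 0.09): being first roughly doubles the fill rate
```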

In the Foresight report, these researchers suggested giving up the continuous limit order book mechanism and replacing it with a system of ‘smart call auctions’ held very frequently, so that the market as a whole still functions quickly. In that way, absolute speed is retained, but the value of being relatively faster is taken away, allowing people to compete much more directly with HFT firms.

At the conference, Skouras explained: ‘If it is designed in a clever way, then you can ensure that your access to markets is similar to that of the most sophisticated players, with only a limited budget of a few tens of thousands of pounds per year. That was our idea to design the markets in a way where they would be more democratic and accessible, and there would be less of a dynamic towards an oligopolistic structure. People took up that idea after us, and now it’s an idea that is recognised as a potentially important way to organise markets.’
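A minimal sketch of how such a frequent call auction could clear a batch of orders at a single price, so that order of arrival within the batch confers no advantage, is given below. The clearing rule, prices and sizes are illustrative assumptions, not the specific ‘smart’ design proposed in the report.

```python
# Illustrative uniform-price call auction: orders collected over a short interval are crossed
# at the single price that maximises executed volume, so queue position within the batch is irrelevant.
from dataclasses import dataclass

@dataclass
class Order:
    side: str      # "buy" or "sell"
    price: float   # limit price
    qty: int

def clearing_price(orders):
    """Return the candidate price that maximises executed volume, and that volume."""
    best_price, best_vol = None, 0
    for p in sorted({o.price for o in orders}):
        demand = sum(o.qty for o in orders if o.side == "buy" and o.price >= p)
        supply = sum(o.qty for o in orders if o.side == "sell" and o.price <= p)
        vol = min(demand, supply)
        if vol > best_vol:
            best_price, best_vol = p, vol
    return best_price, best_vol

batch = [Order("buy", 10.02, 300), Order("buy", 10.00, 200),
         Order("sell", 9.99, 250), Order("sell", 10.01, 250)]
print(clearing_price(batch))  # (10.01, 300): one price for the whole batch, regardless of who arrived first
```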

The impact of dark trading

Khaladdin Rzayev of the University of Edinburgh explored the impact of a phenomenon that has proliferated over the period since the Foresight report was published: off-exchange, ‘dark’ trading venues for financial assets. He said that dark trading has gone from 15% of total shares traded in the United States in 2013 to over 40% in 2022.

Dark trades are facilitated by ‘dark pools’ (in contrast with traditional ‘lit’ exchanges), a growing class of platforms that do not offer pre-trade transparency. In other words, market participants other than the submitter and the pool operator are unaware of the existence of submitted orders prior to their execution. Traders do not have to make public either the price or the number of shares of a dark order. But once an order is executed (that is, it becomes a trade), the price and quantity must be made public in a timely fashion.
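As a stylised sketch of these mechanics (not any venue’s actual matching logic), the code below crosses hidden orders at the midpoint of the lit market’s best bid and offer and reports only executed trades.

```python
# Stylised midpoint dark pool: resting interest is invisible pre-trade; matches execute at the
# midpoint of the lit best bid/offer; only executions are reported. A simplification for illustration.
from collections import deque

class MidpointDarkPool:
    def __init__(self):
        self.buys = deque()    # hidden resting buy quantities
        self.sells = deque()   # hidden resting sell quantities

    def submit(self, side, qty, lit_bid, lit_ask):
        """Match against hidden contra interest at the lit midpoint; return post-trade reports."""
        mid = (lit_bid + lit_ask) / 2
        book, contra = (self.buys, self.sells) if side == "buy" else (self.sells, self.buys)
        fills = []
        while qty > 0 and contra:
            resting = contra.popleft()
            traded = min(qty, resting)
            fills.append((mid, traded))          # price and size are published only after execution
            qty -= traded
            if resting > traded:
                contra.appendleft(resting - traded)
        if qty > 0:
            book.append(qty)                     # unexecuted remainder rests, undisclosed to the market
        return fills

pool = MidpointDarkPool()
pool.submit("sell", 500, lit_bid=9.99, lit_ask=10.01)        # rests hidden; no quote is displayed
print(pool.submit("buy", 300, lit_bid=9.99, lit_ask=10.01))  # ≈ [(10.0, 300)]: price and size reported post-trade
```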

Research has explored the impact of dark trading on market quality, concluding that the results are mixed (Ibikunle and Rzayev, 2022). But what Rzayev focused on at the conference was how dark trading affects corporate policies through market quality. His work shows that firms hold less cash when their shares are traded more in dark venues; and the reduction in corporate cash concentrates in more financially constrained and poorly governed firms.

Dark trading is also associated with greater capital raising, stronger CEO turnover-performance sensitivity, less overinvestment when excess cash is high, and a higher value of cash. Overall, the findings suggest that dark trading enhances market quality and thus reduces cash demand by alleviating financing constraints and improving governance.

Rzayev concluded: ‘There’s a huge discussion around whether we should have restrictions on dark trading. Our study says that we should not look only at market quality. We need to think about what happens on the corporate side to implement or not to implement restrictions.’

Computer-based trading and financial stability

Gbenga Ibikunle of the University of Edinburgh spoke about what’s happened with the speed of financial trading since the Foresight report, and the impact it’s had on various indicators of market quality. He began: ‘2012 seems now as some sort of entry point. Since 2012, we’ve seen an increased use of microwave networks. Prior to that, we’ve seen use of fibre optic cables to connect the distance between different markets and different platforms, because the reality now is that a stock or any instrument is not trading on just one platform, it’s trading on multiple platforms at one time.’

‘So this fragmentation of the computerised trading market creates opportunities for arbitrage. Basically, you can see the price at one location and try to take advantage of that price change in another location. So because of that, we’re seeing increased use of speed, and fibre optics is no longer enough for connecting the markets within Europe because every single person that is fast right now wants to be faster.’

‘They want to open up some sort of speed differential. And those that are faster right now don’t want to close that gap. They want to ensure that they retain the speed differential as well. So we have moved into this situation where we’re leaving fibre optics behind. We’re now using microwave technology predominantly for transmitting messages across markets.’
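The economics behind this race can be illustrated with a toy example: when a price move on one venue has not yet reached another, whoever detects it and reaches the second venue first can trade against the stale quote. The sketch below is an assumption-laden illustration, not the strategy or data studied in the research discussed here.

```python
# Toy cross-venue stale-quote check: if venue A's price has moved but venue B's quote has not
# yet updated, the fastest trader can buy B's stale ask or sell into B's stale bid.
# The threshold and example prices are illustrative assumptions.
def stale_quote_signal(price_a, bid_b, ask_b, min_edge=0.01):
    """Flag a trade against venue B's quote if it lags venue A's price by at least min_edge."""
    if price_a - ask_b >= min_edge:
        return "buy at B"    # A has moved up; B's ask has not yet followed
    if bid_b - price_a >= min_edge:
        return "sell at B"   # A has moved down; B's bid has not yet followed
    return None

# Whoever detects the move on A and reaches B first captures the edge, which is why
# shaving even fractions of a millisecond off the Frankfurt-London route is valuable.
print(stale_quote_signal(price_a=100.10, bid_b=100.00, ask_b=100.02))  # -> 'buy at B'
```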

To illustrate this market development physically, Ibikunle described a village in Belgium that’s on a straight line between Frankfurt and London, and where there is a disused army installation with a mast that is perfect for use with microwave networks. Several HFT firms were keen to buy the facility at an auction to increase their trading speed.

Research by Ibikunle and colleagues analyses information transmission latency between exchanges in Frankfurt and London, and speed-inducing technological upgrades. Their results show that when cross-market latency arbitrage opportunities are linked to the arrival of information, HFT activity impairs liquidity and enhances price discovery by facilitating the incorporation of public information into prices.

Conversely, when cross-market latency arbitrage opportunities are driven by liquidity shocks, HFT improves liquidity and reduces trading costs, thus providing incentives for information acquisition and trading with private information. These findings underscore the complex nature of the association between trading speed and market quality, and they reconcile the mixed evidence in previous research (Rzayev et al, 2023).

What are the lessons for regulators seeking to promote market quality? At the conference, Ibikunle concluded: ‘It’s complex. Trying to disentangle the effects of HFT on market quality characteristics: it depends. There is no definitive takeaway other than the fact that it all depends on the conditions in the market. And what is true for one market may not be true for another one. What we do hope, though, is that if regulation tries to come in and do anything, they should try to put in policies that will encourage faster market-makers.’

HFT and retail order flow

Anna Pavlova of London Business School presented research with colleagues on the use of CBT by small private investors, specifically the recent boom in retail investor trading in options, driven by young and tech-savvy, yet inexperienced, US investors (Bryzgalova et al, 2022). The study notes that retail investors typically enter the options market for speculative reasons, often associated with trading in ‘meme’ stocks such as GameStop. These investors, whose share of overall stock trading in the United States has doubled since 2019, account for nearly half of trading volume in options.

Such investors prefer options with very short maturities, primarily calls – bets that an asset’s price will rise. These contracts have high relative bid-ask spreads, making the options business a very lucrative one for wholesalers that execute retail order flow. This is further supported by the ballooning ‘payment for order flow’ (PFOF) for options received by retail brokerages.

Frequent trading produces large order flow and revenue from PFOF for retail investing platforms. Trading assets that are less liquid, such as options, enhances these profits further. This may create an incentive for retail brokerages to encourage more trading in less liquid asset classes or securities. Indeed, the researchers find not only that retail investors in such options have lost $2.1 billion in aggregate, but also that, since trading costs in options are orders of magnitude higher than in stock investing, there have been further losses of over $7 billion.
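The arithmetic behind those trading costs is straightforward: a fixed absolute bid-ask spread is a far larger fraction of a cheap, short-dated option’s premium than of the underlying stock’s price. The sketch below uses made-up quotes to illustrate the point; the figures are not taken from Bryzgalova et al (2022).

```python
# Illustrative comparison of relative spread costs: the same 10-cent spread is ~10% of a cheap
# option's premium per side, but only ~0.05% of a $100 stock price. Quotes are assumptions.
def relative_spread_cost(bid, ask, quantity, multiplier=1):
    """Return (half-spread as a fraction of the midpoint, round-trip dollar cost of crossing the spread)."""
    mid = (bid + ask) / 2
    rel_half_spread = (ask - bid) / 2 / mid
    round_trip_cost = (ask - bid) * quantity * multiplier
    return rel_half_spread, round_trip_cost

# Ten short-dated call contracts quoted 0.45 / 0.55 (contract multiplier 100):
print(relative_spread_cost(bid=0.45, ask=0.55, quantity=10, multiplier=100))   # ≈ (0.10, 100.0)
# 1,000 shares of a $100 stock quoted 99.95 / 100.05:
print(relative_spread_cost(bid=99.95, ask=100.05, quantity=1000))              # ≈ (0.0005, 100.0)
```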

The roles of trust and standardisation

In the final session of the conference, Alistair Milne of Loughborough Business School returned to a central issue that he and Kevin Houstoun had explored in the Foresight report on CBT: the role of standard-setting in capital markets. He explained: ‘One of our recommendations was the need for greater standardisation. That’s a foundation for automated computer trading. Other industries like nuclear, aerospace and electrical engineering do this better. They’re more organised in the way they do standards. They have more employees committed to developing them.’

‘We should be putting more resources into developing standards. And it’s about having grown-up conversations where, strategically, firms across the industry put their individual self-interest aside and say, okay, we’re going to collaborate to make sure we’ve got agreement on data, how to exchange it and what it means, so that we can save costs and serve our customers efficiently across the board. It’s happening a bit in financial services, but there’s a long way to go.’

Asked finally about how things have changed in CBT over the past ten years and what his big takeaway from the conference was, Milne said: ‘I think what really stood out for me, which seems to be a tendency across more than one market, is that the technology, far from making markets more competitive, is actually giving greater market power to a few major players because the high cost of setting up the technology in order to do computerised trading is actually shrinking rather than expanding the market. Again, I think that’s possibly something which better standard-setting could address.’

References

Bryzgalova, Svetlana, Anna Pavlova and Taisiya Sikorskaya (2022) ‘Retail Trading in Options and the Rise of the Big Three Wholesalers’, Journal of Finance.

Farmer, Doyne, and Spyros Skouras (2013) ‘An Ecological Perspective on the Future of Computer Trading’, Quantitative Finance.

Foresight (2012) ‘The Future of Computer Trading in Financial Markets: An International Perspective’, Government Office for Science.

Ibikunle, Gbenga, and Khaladdin Rzayev (2022) ‘Volatility and Dark Trading: Evidence from the Covid-19 Pandemic’, British Accounting Review.

Linton, Oliver, and Soheil Mahmoodzadeh (2018) ‘Implications of High-Frequency Trading for Security Markets’, Annual Review of Economics.

Rzayev, Khaladdin, Gbenga Ibikunle and Tom Steffen (2023) ‘The Market Quality Implications of Speed in Cross-platform Trading: Evidence from Frankfurt-London Microwave Networks’, Journal of Financial Markets.