
GPT Will Not Offer You a Trading or Investing Edge

Photo by Sebastian Arie Voortman

There is excitement in the financial community that GPT will offer a trading or investing edge. Nothing could be further from the truth.

Advanced information-processing tools increase market efficiency and decrease the significance of any edges in investing and trading. In my book Profitability and Systematic Trading, published by Wiley in 2008 and available as a free download here, I wrote in the preface:

One could also assert that the numerous technological advancements,
the abundance of information, and improved means of processing have
increased the complexity and the difficulty of trading for a profit rather
than making the life of traders easier. This assertion is partly due to the
fact that markets have become more efficient while opportunities are
becoming scarce and increasingly difficult to identify.

Technical analysis emerged in the early twentieth century. Early adopters were able to capitalize on it and profit. However, as soon as computing power became cheap and technical analysis became popular through books and, later, the web, it lost any edge it offered, and its widespread use even caused a significant regime change in the market, as explained in this blog article. I have argued that the compression of classical technical analysis patterns turned them into a wealth-redistribution scheme from uninformed market participants to professionals. How else could anyone justify the significant profits of CTAs and other professionals in the 1980s and 1990s, given that trading, especially in futures and forex, is a zero-sum game?

In the late 1990s, neural networks and genetic programming, a subset of the tools we would today group under machine learning, attracted the attention of traders and investors, and hopes for the Holy Grail reemerged. The failure of those simplistic approaches to trading and investing was spectacular because of data-mining bias and multiple-comparison problems, as explained in this blog article. Specific applications may be able to identify edges, but those remain proprietary; publicly available free tools and libraries naturally cannot offer any edge unless that edge lies in the application, and such edges remain elusive.

As far back as 2008, I warned traders and investors not to rely on public information-processing tools to find edges, simply because they are available to everyone, and when a tool gains widespread use, it loses its significance. Some people on social media may think that their unique application of those tools will differentiate them. This is probably a cognitive bias: in the era of social media and free blogging, and given the broad availability of talent, those edges, even if they exist, will be short-lived.

Tools such as GPT represent a significant paradigm shift in information processing, which means that any edges people identify with their use will quickly be arbitraged away, or their performance will degrade significantly due to overcrowding. In the beginning, some spectacular successes may be reported, as was the case with classical technical analysis, but over time these tools will increase information efficiency and make markets harder to trade, as noted 15 years ago in the book mentioned above. I anticipated these developments because they are not hard to anticipate, and I expect further, even more significant, advances in the field of information processing. I do not expect any singularities in the foreseeable future: the signal-to-noise ratio of artificial intelligence may experience a singularity, but the world will not anytime soon.

Note that I refer to GPT as a tool for information processing rather than as artificial intelligence, because that is what it is. Information processing is a function of artificial intelligence, albeit one of the most important ones. However, artificial intelligence encompasses many more functions than information processing, some of which do not even have a sound theoretical foundation at this stage. For example, we do not know what “sentience” is or how it emerges, despite numerous speculations that these tools may become sentient in the future. For every argument that sentient machines will emerge, there is a counterargument that they will not. Therefore, any reference to sentience at this early stage of the information-processing revolution is hype, adds to the noise, and may distort the expectations of people who are unfamiliar with the foundational issues of artificial intelligence.

In my opinion, traders and investors should focus on what they do best: the former on risk management and discipline in identifying and executing edges, and the latter on finding a competent adviser who can navigate the noise and offer superior risk-adjusted returns. Ignoring subjects with a near-zero signal-to-noise ratio that are part of click-bait campaigns, or, in the case of academia, a tenure-credit rush, is of paramount importance to success. Focus on what you do best while you are ahead, and ignore the noise. This is my opinion.


Premium Content

Online Books
Premium Articles
Systematic Market Signals
Trading Strategies

By subscribing, you gain immediate access to hundreds of articles. Premium Articles subscribers have immediate access to more than two hundred articles, and All in One subscribers have access to all premium articles, books, premium insights, and market signals content.


Free Book

Subscribe for free notifications of new posts and updates from the Price Action Lab Blog and receive a PDF of the book “Profitability and Systematic Trading” (Wiley, 2008) free of charge.


Disclaimer: No part of the analysis in this blog constitutes a trade recommendation. The past performance of any trading system or methodology is not necessarily indicative of future results. Read the full disclaimer here.