

Shelly Palmer - The AI-Driven Stock Market: A Ticking Time Bomb?

SASKTODAY's newest columnist, Shelly Palmer, has been named LinkedIn’s “Top Voice in Technology” and writes a popular daily business blog.
Illustration created by Midjourney with the prompt “a highly stylized photo of SEC Chair Gary Gensler preaching in a very high-tech hall of AI supplicants. --ar 16:9 --v 5.2”

As we stand on the precipice of an overwhelmingly AI-driven financial world, we must ask ourselves: Are we prepared for the potential fallout? Gary Gensler, the SEC chair, has sounded the alarm bells, suggesting that AI could be at the heart of future financial crises. His insights, drawn from a paper he penned in 2020, shed light on the risks of AI in financial markets and the limitations of regulatory bodies in addressing these challenges.

The AI “Black Box” Dilemma

One of the most glaring risks is the rise of AI-powered “black box” trading algorithms. These algorithms, driven by deep learning models, make decisions that are often opaque to human understanding. The most obvious danger? An improperly tuned algorithm could cause properly tuned algorithms to start panic trading, leading to a market crash.
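
To make that cascade concrete, here’s a toy simulation (entirely made-up numbers, not any real trading system): one mistuned algorithm dumps shares for a few steps, and a crowd of “properly tuned” momentum algorithms reacts to the falling price by selling too.

```python
import random

random.seed(42)

# One mistuned algorithm sells indiscriminately for a few steps; fifty
# "properly tuned" momentum algorithms then react to the falling price
# by selling too, deepening the decline.
price = 100.0
history = [price]

def momentum_signal(history, lookback=3, threshold=-0.02):
    """A well-behaved rule: sell only after a sustained decline."""
    if len(history) <= lookback:
        return 0
    change = (history[-1] - history[-1 - lookback]) / history[-1 - lookback]
    return -1 if change < threshold else 0

for step in range(30):
    sell_pressure = 0
    if 5 <= step < 8:                                 # the mistuned algorithm's dump
        sell_pressure += 10
    sell_pressure += 50 * -momentum_signal(history)   # the well-tuned herd reacts
    # Crude price-impact rule: selling pressure pushes the price down.
    price *= 1 - 0.001 * sell_pressure + random.gauss(0, 0.002)
    history.append(price)

print(f"Price after the mistuned dump alone: {history[8]:.2f}")
print(f"Price after the herd piles on:       {history[-1]:.2f}")
```

The point of the toy isn’t the numbers; it’s that the “properly tuned” rules behave exactly as designed and still turn one bad actor’s mistake into a market-wide slide.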

Gensler also points out the “apprentice effect,” the phenomenon where individuals with similar training backgrounds tend to develop models with similar characteristics. This homogeneity in AI models, coupled with potential regulatory constraints, could lead to a synchronized market response, amplifying the risk of a crash.
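
Here’s a rough sketch of the apprentice effect (synthetic data, with ridge regression standing in for whatever models the firms actually use): three “different” firms train on the same features and the same history, and their trading signals come out almost perfectly correlated.

```python
import numpy as np

rng = np.random.default_rng(0)

# Shared market features and shared training history for everyone.
n_days, n_features = 500, 10
X = rng.normal(size=(n_days, n_features))
true_w = rng.normal(size=n_features)
y = X @ true_w + rng.normal(scale=0.5, size=n_days)   # next-day returns

def fit_ridge(X, y, lam):
    """Closed-form ridge regression: w = (X'X + lam*I)^-1 X'y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Three firms, trained by people with the same background, differing only
# in a tuning knob. Their signals move in lockstep.
signals = [X @ fit_ridge(X, y, lam) for lam in (0.1, 1.0, 10.0)]

corr = np.corrcoef(signals)
print("Pairwise correlation of the three firms' trading signals:")
print(np.round(corr, 3))
```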

The Regulatory Challenge

Regulating these AI models is going to be exceptionally hard. AI algorithmic processes are not easily explainable, making it challenging for regulators to prevent market crashes. As Gensler aptly puts it, “If deep learning predictions were explainable, they wouldn’t be used in the first place.”

The risks of AI extend beyond trading. Many AI systems are now tasked with assessing creditworthiness. The opaque nature of those systems makes it difficult to determine if they’re making discriminatory judgments. Moreover, the evolving nature of AI means that a model that was unbiased yesterday might become biased today.
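
A minimal drift check might look something like this (the groups, numbers, and the 0.2 threshold are all illustrative assumptions): compare approval rates across groups month over month and flag a widening gap.

```python
# Hypothetical monitoring sketch: 1 = approved, 0 = denied.
def approval_rate(decisions):
    return sum(decisions) / len(decisions)

def parity_gap(approvals_by_group):
    rates = {g: approval_rate(d) for g, d in approvals_by_group.items()}
    return max(rates.values()) - min(rates.values()), rates

last_month = {"group_a": [1, 1, 0, 1, 1, 0, 1, 1],
              "group_b": [1, 0, 1, 1, 1, 0, 1, 1]}
this_month = {"group_a": [1, 1, 1, 1, 0, 1, 1, 1],
              "group_b": [0, 0, 1, 0, 1, 0, 0, 1]}

for label, month in (("last month", last_month), ("this month", this_month)):
    gap, rates = parity_gap(month)
    flag = "  <-- investigate" if gap > 0.2 else ""
    print(f"{label}: approval rates {rates}, parity gap {gap:.2f}{flag}")
```

The same model, fed slightly different applicants, can pass this check one month and fail it the next, which is exactly why “it was fine when we deployed it” isn’t good enough.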

Gensler suggests that the growing adoption of deep learning in finance is likely to escalate systemic risks. Gensler thinks that part of a solution might be to increase capital requirements for AI-dependent financial institutions. I’m not sure how those would be defined because everyone will be AI-dependent in a year or so. His other suggestion, to subject AI-generated results to a “sniff test” from more transparent linear models, is problematic for the same reasons. What’s clear is that Gensler believes regulatory measures may not be enough.
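
Gensler doesn’t spell out the mechanics, but one way such a “sniff test” could be sketched (hypothetical models and thresholds, not his specification) is to score the same inputs with the opaque model and with a transparent linear baseline, then flag the predictions where the two diverge sharply.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 5))

def opaque_model(X):
    # Stand-in for a black-box model: nonlinear and hard to explain.
    return np.tanh(X @ np.array([0.8, -0.5, 0.3, 0.1, 0.9])) + 0.5 * X[:, 0] * X[:, 1]

def linear_baseline(X, y):
    # Ordinary least squares: fully transparent coefficients.
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return X @ w, w

y_opaque = opaque_model(X)
y_linear, coefs = linear_baseline(X, y_opaque)

# Flag predictions that wander too far from what the explainable model expects.
residual = np.abs(y_opaque - y_linear)
threshold = residual.mean() + 3 * residual.std()
flagged = np.where(residual > threshold)[0]

print(f"Linear baseline coefficients (explainable): {np.round(coefs, 2)}")
print(f"{len(flagged)} of {len(X)} predictions diverge enough to warrant review")
```

The catch, as noted above, is the same one Gensler faces: if the linear model could reproduce the deep model’s behavior, nobody would bother with the deep model.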

The Data Conundrum

AI’s voracious appetite for data presents another challenge. As AI models begin using the same massive training sets, they risk inheriting and amplifying any inherent weaknesses in the data. Gensler warns of the dangers of models generating correlated predictions, leading to market crowding and herding. He also cautions that monopolistic data governance can create too many single points of failure, and that even the most extensive datasets have their limitations; many lack the historical depth to cover a complete financial cycle.
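
A small illustration of that incomplete-cycle problem (synthetic returns, not real market data): a model fit only on bull-market history keeps forecasting gains right into a regime it has never seen, and every model trained on the same truncated dataset shares the blind spot.

```python
import numpy as np

rng = np.random.default_rng(2)

# ~10 years of bull-market history vs. a bear regime the data never covered.
bull_returns = rng.normal(loc=0.0008, scale=0.01, size=2500)
bear_returns = rng.normal(loc=-0.003, scale=0.03, size=250)

# "Model": the naive forecast that tomorrow's return equals the historical mean.
forecast = bull_returns.mean()
realized_bear = bear_returns.mean()

print(f"Forecast learned from bull-only data:        {forecast:+.4%} per day")
print(f"Realized average in the unseen bear regime:  {realized_bear:+.4%} per day")
```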

The Road Ahead

The integration of AI into our financial systems is inevitable. The potential of AI to optimize trading strategies, assess creditworthiness, and predict market trends is undeniable. Gensler’s warnings underscore the intricate dance between innovation and regulation. As AI algorithms, with their inscrutable logic, become the backbone of our financial systems, the challenges those algorithms pose are multifaceted, from the unpredictability of “black box” trading to potential biases in credit assessments. The convergence on shared datasets further complicates the picture, introducing vulnerabilities that could ripple through markets. While Gensler offers potential mitigations, they serve as a stark reminder that navigating the AI-infused future requires not just technological acumen but also regulatory foresight and a deep understanding of systemic interdependencies.

If you want to more deeply understand how AI models are likely to impact our financial system, please sign up for our free online course.

Author’s note: This is not a sponsored post. I am the author of this article and it expresses my own opinions. I am not, nor is my company, receiving compensation for it. I am not a financial adviser. Nothing contained herein should be considered financial advice. If you are considering any type of investment, you should conduct your own research and, if necessary, seek the advice of a licensed financial adviser.


ABOUT SHELLY PALMER

Shelly Palmer is the Professor of Advanced Media in Residence at Syracuse University’s S.I. Newhouse School of Public Communications and CEO of The Palmer Group, a consulting practice that helps Fortune 500 companies with technology, media and marketing. Named LinkedIn’s “Top Voice in Technology,” he covers tech and business, is a regular commentator on CNN, writes a popular daily business blog, and is the creator of a popular, free online course.
