
SEC warns AI could spark market panic

12:01 AM July 19, 2023

SEC chief Gary Gensler warned artificial intelligence could spark a widespread market panic. During a National Press Club speech, he said the technology could lead investors to make similar decisions because they follow the same signal. As a result, it “could exacerbate the inherent network interconnectedness of the global financial system.”

Our economies are deeply intertwined, connected by the Internet, and people will inevitably apply technologies like artificial intelligence to that system. However, we must understand the technology’s potential impact to mitigate its risks and help prevent disasters like the 2008 financial crisis.

This article can help you adapt to artificial intelligence’s growing global influence. We will discuss Gary Gensler’s main AI concern and then cover the SEC chief’s other points about artificial intelligence risks.


The SEC’s top 5 AI issues

  1. Financial stability
  2. Privacy and intellectual property
  3. Rent extractions
  4. Deception
  5. Explainability and bias

1. Financial stability


Gary Gensler discussed AI threats during his speech at the National Press Club. He acknowledged that ChatGPT sparked a tech revolution and said we must prepare for it.

He said the technology’s potential rivals the impact of mathematical geniuses like Isaac Newton and René Descartes. Hence, the SEC website calls his speech “Isaac Newton to AI.”

He mentioned several concerns, but his biggest one is financial stability. Gensler cited a research paper he wrote with Lily Bailey called “Deep Learning and Financial Stability.”

The paper argued that a few AI platforms could come to dominate as people settle on a handful of them for daily tasks. For example, investors may rely on the same AI models to choose portfolio assets by predicting their growth potential.

If those shared platforms malfunction, they could steer many investors into the same poor decisions at once. As a result, AI algorithms could contribute to a financial disaster like the one in 2008.

The 2008 crisis started in the United States, where many banks lent subprime mortgages excessively, causing a real estate collapse that echoed throughout the world.
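To make the shared-signal concern concrete, here is a minimal, hypothetical sketch (not taken from Gensler and Bailey’s paper): it compares investors acting on independent signals against investors all following one dominant model’s signal. Every number in it is invented for illustration.

```python
# Toy illustration: when many investors act on the same model's signal,
# their trades become correlated and aggregate swings grow.
import random

random.seed(0)

N_INVESTORS = 1000
N_DAYS = 250

def simulate(shared_signal: bool) -> float:
    """Return the largest one-day aggregate net trade, as a fraction of investors."""
    worst_swing = 0.0
    for _ in range(N_DAYS):
        common = random.gauss(0, 1)  # the signal a single dominant model would emit
        net = 0
        for _ in range(N_INVESTORS):
            signal = common if shared_signal else random.gauss(0, 1)
            net += 1 if signal > 0 else -1  # buy (+1) or sell (-1)
        worst_swing = max(worst_swing, abs(net) / N_INVESTORS)
    return worst_swing

print("Diverse models   -> worst daily swing:", round(simulate(False), 2))
print("One shared model -> worst daily swing:", round(simulate(True), 2))
```

With independent signals the buys and sells largely cancel out, but when everyone follows the same model the whole crowd trades in the same direction on the same day, which is the kind of correlated behavior the speech warns about.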


2. Privacy and intellectual property


Gary Gensler also discussed how artificial intelligence could threaten intellectual property and privacy. The SEC head noted that we help train AI models because they collect our data, including sensitive information.

These systems can also train on data from other applications, which raises the question: who owns that data? Gensler noted this debate rages on among Hollywood studios, social media companies, and software developers.


He said, “This raises questions about economic rents at the macro level as well. The platforms may garner greater economic rents if they become dominant.”

Economic “rents” refer to the extra profits a dominant firm can extract beyond what a competitive market would allow. Some AI models already study market data to charge consumers the maximum amount they are willing to pay.

Hence, Gensler said the SEC must “promote competitive, efficient markets” and must assess how this technology affects that objective.
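To illustrate that pricing dynamic, here is a hypothetical sketch with made-up willingness-to-pay figures: a model that can estimate what each customer will tolerate can extract more revenue than any single market price.

```python
# Hypothetical sketch of the pricing dynamic described above. All figures are invented.

willingness_to_pay = [5.0, 8.0, 12.0, 20.0, 35.0]  # estimated per customer

def revenue_at_uniform_price(price: float) -> float:
    """Everyone pays the same price; customers whose limit is below it walk away."""
    return sum(price for wtp in willingness_to_pay if wtp >= price)

# Best single price the seller could pick, searching over the observed limits.
best_uniform = max(revenue_at_uniform_price(p) for p in willingness_to_pay)

# Personalized pricing: charge each customer exactly what they will tolerate.
personalized = sum(willingness_to_pay)

print("Best uniform price revenue:", best_uniform)  # 40.0 (at a price of 20)
print("Personalized price revenue:", personalized)  # 80.0
```

The gap between the two totals is the kind of extracted “rent” Gensler’s comment points to.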

3. Rent extractions


Gensler elaborated on how AI-powered rent extractions could impact global markets. The financial expert said artificial intelligence could “shift consumer welfare to producers.”

AI models that weigh both a business’s and its customers’ interests can create conflicts of interest. For example, financial advisers or brokers could optimize investment systems to put their own interests ahead of their investors’.


An investor may ask these experts to invest only in safe assets, yet they could use an AI system that steers money into high-risk, high-yield ones to meet return targets quickly, endangering the client’s finances.
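Here is a hypothetical sketch of that conflict, with invented products and numbers: the more weight a recommendation engine places on the adviser’s fees, the further its picks drift from the client’s stated risk tolerance.

```python
# Hypothetical recommendation engine scoring products on a blend of the client's
# crude risk-adjusted return and the adviser's fee income. All figures are invented.

products = [
    # (name, expected_return, risk, adviser_fee)
    ("government bonds", 0.03, 0.01, 0.001),
    ("index fund",       0.07, 0.15, 0.003),
    ("leveraged crypto", 0.25, 0.80, 0.050),
]

def recommend(adviser_weight: float) -> str:
    """Pick the product with the highest blended score.

    adviser_weight = 0.0 means the engine optimizes purely for the client;
    higher values tilt the recommendation toward fee-heavy products.
    """
    def score(p):
        _, ret, risk, fee = p
        client_score = ret - risk  # crude risk-adjusted return
        return (1 - adviser_weight) * client_score + adviser_weight * fee * 10  # fee scaled to compete

    return max(products, key=score)[0]

print(recommend(0.0))  # government bonds  (client-first)
print(recommend(0.9))  # leveraged crypto  (fee-first)
```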

That scenario is similar to the financial disaster caused by the algorithmic stablecoin UST. The digital asset relied on an algorithm tied to the LUNA cryptocurrency to maintain its one-dollar peg.

The algorithm failed, causing many to lose their money. Gary Gensler is trying to avoid such a scenario by asking SEC staff to recommend ways to address AI issues.
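For readers unfamiliar with how that collapse unfolded, here is a heavily simplified, illustrative sketch of the UST/LUNA feedback loop; the redemption volumes and the price-impact rule are invented assumptions, not market data.

```python
# Simplified sketch of the UST/LUNA "death spiral": UST could be redeemed for $1
# worth of newly minted LUNA, so a run on UST floods the market with LUNA and
# crashes its price. All numbers below are illustrative assumptions.

luna_price = 80.0           # dollars per LUNA
luna_supply = 350e6         # circulating LUNA
ust_redeemed_per_day = 5e9  # dollars of UST redeemed each day of the run (assumed)
price_impact = 0.5          # ad-hoc assumption linking dilution to price decline

for day in range(1, 8):
    new_luna = ust_redeemed_per_day / luna_price    # LUNA minted to absorb redemptions
    dilution = new_luna / luna_supply
    luna_supply += new_luna
    luna_price *= 1 / (1 + dilution / price_impact)  # crude price-impact model
    print(f"day {day}: LUNA ${luna_price:,.2f}, supply {luna_supply:,.0f}")
```

Because each redeemed dollar mints more LUNA as the price falls, dilution accelerates, which is why the failure is often described as a death spiral.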

4. Deception


The SEC chief noted how artificial intelligence facilitates online misinformation and fraud. Surprisingly, he cited a personal experience regarding this issue.

A communications director contacted Gensler and asked, “Lots of rumors on the Internet today that you resigned. You’d tell me if you did, right?” Later, he realized the rumor came from AI-generated text from a website.

He warned malicious people could use AI to “influence elections, the capital markets, or spook the public.” Nonetheless, Gensler said fraud is fraud under the securities laws.

“The SEC is focused on identifying and prosecuting any form of fraud that might threaten investors, capital formation, or the markets more broadly,” he said.

5. Explainability and bias


The SEC leader knows AI systems are built on complex mathematics. Language models, for instance, map words to numerical vectors called embeddings and use the relationships between those vectors to form connections between concepts.

Algorithms link user queries to these models to produce human-like responses. Yet, that complexity prevents most people from understanding how AI systems analyze information.
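Here is a minimal sketch of the embedding idea: the three-dimensional vectors below are toy values chosen by hand, whereas real models learn hundreds or thousands of dimensions from data.

```python
# Minimal sketch: words become vectors, and the model reasons about how close they are.
import math

embeddings = {
    "stock":  [0.9, 0.1, 0.0],
    "bond":   [0.8, 0.2, 0.1],
    "banana": [0.0, 0.1, 0.9],
}

def cosine_similarity(a, b):
    """Similarity of two vectors: near 1.0 means related, near 0.0 means unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

print(cosine_similarity(embeddings["stock"], embeddings["bond"]))    # high: ~0.98
print(cosine_similarity(embeddings["stock"], embeddings["banana"]))  # low:  ~0.01
```

Related words end up pointing in similar directions, and that geometry, multiplied across billions of learned parameters, is part of why these systems are so hard to explain.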


This is the “black box problem,” and it is why AI companies themselves struggle to explain how their programs work. For example, ChatGPT creator OpenAI has an AI detection program, but it is unreliable partly because of this issue.

Gensler added that the ability to predict does not mean AI models are always accurate. They could carry hidden biases that affect their predictions and, eventually, our decisions.


Conclusion

The SEC warned artificial intelligence could impact global markets as more people rely on it. AI systems could mislead investors and lead to financial disaster.

However, Gary Gensler says the Securities and Exchange Commission is ready to meet the challenges of artificial intelligence. He and his teams will keep working to protect consumer interests and the global economy.

Everyone must do their part by learning more about AI and its effects. Start by checking out more digital tips and trends at Inquirer Tech.

Frequently asked questions about the SEC and AI

What is the SEC?

Investopedia says, “The U.S. Securities and Exchange Commission (SEC) is an independent federal government regulatory agency responsible for protecting investors, maintaining fair and orderly functioning of the securities markets, and facilitating capital formation. It was created by Congress in 1934 as the first federal regulator of the securities markets.”

What are AI’s risks?

SEC chief Gary Gensler warned artificial intelligence might harm the global financial system. It could mislead investors as more people rely on these tools for daily tasks. Also, some educators worry it could erode critical thinking skills as students depend on artificial intelligence programs for everything.

How can we prevent AI’s risks?

The United States SEC is taking proactive steps to mitigate potential AI risks by studying the technology further. Governments must also enact laws to regulate AI development and penalize harm to people and society. You can do your part by learning more about the technology.
