Elon Musk, Apple Co-Founder, Tech Experts Issue Warning on ‘Giant AI Experiments’: ‘Dangerous Race’

There is so much the general public does not understand about AI or its risks to society and civilization.

Imagine it in the wrong hands.

Elon Musk, Apple co-founder, other tech experts call for pause on ‘giant AI experiments’: ‘Dangerous race’

Musk, Wozniak, other tech innovators sign open letter urging temporary pause in the development of AI systems more powerful than OpenAI’s GPT-4, citing risks to society and civilization

By Chris Pandolfo | Fox News

Elon Musk, Steve Wozniak, and a host of other tech leaders and artificial intelligence experts are urging AI labs to pause development of powerful new AI systems in an open letter citing potential risks to society.

The letter asks AI developers to “immediately pause for at least 6 months the training of AI systems more powerful than GPT-4.” It was issued by the Future of Life Institute and signed by more than 1,000 people, including Musk, who argued that safety protocols need to be developed by independent overseers to guide the future of AI systems. GPT-4 is the latest deep learning model from OpenAI, which “exhibits human-level performance on various professional and academic benchmarks,” according to the lab.

“Powerful AI systems should be developed only once we are confident that their effects will be positive and their risks will be manageable,” the letter said.

The letter warns that at this stage, no one “can understand, predict, or reliably control” the powerful new tools developed in AI labs. The undersigned tech experts cite the risks of propaganda and lies spread through AI-generated articles that look real, and even the possibility that AI programs can outperform workers and make jobs obsolete.

“AI labs and independent experts should use this pause to jointly develop and implement a set of shared safety protocols for advanced AI design and development that are rigorously audited and overseen by independent outside experts,” the letter states.

“In parallel, AI developers must work with policymakers to dramatically accelerate development of robust AI governance systems.”


Tesla CEO Elon Musk and more than 1,000 tech leaders and artificial intelligence experts are calling for a temporary pause on the development of AI systems more powerful than OpenAI’s GPT-4, warning of risks to society and civilization.

The signatories, who include Stability AI CEO Emad Mostaque, researchers at Alphabet-owned DeepMind, and AI heavyweights Yoshua Bengio and Stuart Russell, emphasize that AI development in general should not be paused, writing that their letter calls for “merely a stepping back from the dangerous race to ever-larger unpredictable black-box models with emergent capabilities.”

According to the European Union’s transparency register, the Future of Life Institute is primarily funded by the Musk Foundation, as well as London-based effective altruism group Founders Pledge, and Silicon Valley Community Foundation.

Musk, whose electric car company Tesla uses AI for its autopilot system, has previously raised concerns about the rapid development of AI.

Since its release last year, Microsoft-backed OpenAI’s ChatGPT has prompted rivals to accelerate the development of similar large language models and companies to integrate generative AI models into their products.

Notably absent from the letter’s signatories was Sam Altman, CEO of OpenAI.

RELATED ARTICLES:

Bing’s AI bot tells reporter it wants to ‘be alive’, ‘steal nuclear codes’ and create ‘deadly virus’ | Fox News

Microsoft AI says it wants to steal nuke codes, make deadly virus

EDITOR’S NOTE: This Geller Report is republished with permission. ©All rights reserved.
