News

Prince Harry, Richard Branson, Steve Bannon, and ‘AI godfathers’ call on AI labs to halt their pursuit of ‘superintelligence’—warning the technology could surpass human control

By Press Room · 23 October 2025 · 3 Mins Read

A new open letter, signed by a range of AI scientists, celebrities, policymakers, and faith leaders, calls for a ban on the development of “superintelligence”—a hypothetical AI technology that could exceed the intelligence of all of humanity—until the technology is reliably safe and controllable.

The letter’s most notable signatories include AI pioneer and Nobel laureate Geoffrey Hinton and fellow AI luminaries Yoshua Bengio and Stuart Russell, as well as business leaders such as Virgin cofounder Richard Branson and Apple cofounder Steve Wozniak. It was also signed by celebrities, including actor Joseph Gordon-Levitt, who recently expressed concerns over Meta’s AI products; will.i.am; and Prince Harry and Meghan, the Duke and Duchess of Sussex. Policy and national security figures as diverse as Trump ally and strategist Steve Bannon and Mike Mullen, chairman of the Joint Chiefs of Staff under Presidents George W. Bush and Barack Obama, also appear on the signatory list, which runs to more than 1,000 names.

New polling conducted alongside the open letter, which was written and circulated by the nonprofit Future of Life Institute, found that the public generally agreed with the call for a moratorium on the development of superpowerful AI technology.

In the U.S., the polling found that only 5% of adults support the status quo of unregulated advanced-AI development, while 64% agreed that superintelligence shouldn’t be developed until it is provably safe and controllable, and 73% want robust regulation of advanced AI.

“95% of Americans don’t want a race to superintelligence, and experts want to ban it,” Future of Life Institute president Max Tegmark said in the statement.

Superintelligence is broadly defined as artificial intelligence capable of outperforming the whole of humanity at most cognitive tasks. There is currently no consensus on when, or whether, superintelligence will be achieved, and the timelines suggested by experts are speculative. Some of the more aggressive estimates hold that it could arrive by the late 2020s, while more conservative views push the timeline much further out or question whether current technology can reach it at all.

Several leading AI labs, including Meta, Google DeepMind, and OpenAI, are actively pursuing this level of advanced AI. The letter calls on them to halt that pursuit until there is a “broad scientific consensus that it will be done safely and controllably, and strong public buy-in.”

“Frontier AI systems could surpass most individuals across most cognitive tasks within just a few years,” said Yoshua Bengio, the Turing Award–winning computer scientist who, along with Hinton, is considered one of the ‘godfathers’ of AI. “To safely advance toward superintelligence, we must scientifically determine how to design AI systems that are fundamentally incapable of harming people, whether through misalignment or malicious use. We also need to make sure the public has a much stronger say in decisions that will shape our collective future,” he said.

The signatories argue that the pursuit of superintelligence raises serious risks of economic displacement and disempowerment and poses a threat to national security as well as civil liberties. The letter accuses tech companies of pursuing this potentially dangerous technology without guardrails, oversight, or broad public consent.

“To get the most from what AI has to offer mankind, there is simply no need to reach for the unknowable and highly risky goal of superintelligence, which is by far a frontier too far. By definition, this would result in a power that we could neither understand nor control,” actor Stephen Fry said in the statement.
