Scientists reveal the exact date when technology will surpass human intelligence - and there's not long to wait

Since Homo sapiens first emerged, humanity has enjoyed an unbeaten 300,000-year run as the most intelligent creatures on the planet.

However, thanks to rapid advances in artificial intelligence (AI), that might not be the case for much longer.

Many scientists believe that the singularity – the moment when AI first surpasses humanity – is now not a matter of ‘if’ but ‘when’.

And according to some AI pioneers, we might not have much longer to wait.

A new report from the research group AIMultiple combined predictions made by 8,590 scientists and entrepreneurs to see when the experts think the singularity might come.

The findings revealed that AI experts’ predictions for the singularity keep getting closer and closer with every unexpected leap in AI’s abilities.

In the mid-2010s, scientists generally thought that AI could not surpass human intelligence before 2060 at the earliest.

Now, some industry leaders think the singularity might arrive in as little as three months’ time.

The singularity is the moment that AI’s intelligence surpasses that of humanity, just like Skynet in the Terminator films. This might seem like science fiction, but experts say it might not be far away

What is the singularity?

In physics, a singularity refers to a point, such as the centre of a black hole, where matter becomes so dense that the known laws of physics break down.

However, after being adopted by science fiction writer Vernor Vinge and futurist Ray Kurzweil, the term has taken on a radically different meaning.

Today, the singularity usually refers to the point at which technological advancements begin to accelerate well beyond humanity’s means to control them.

Often, this is taken to refer to the moment that an AI becomes more intelligent than all of humanity combined.

Cem Dilmegani, principal analyst at AIMultiple, told Daily Mail: ‘Singularity is a hypothetical event which is expected to result in a rapid increase in machine intelligence.

‘For singularity, we need a system that combines human-level thinking with superhuman speed and rapidly accessible, near-perfect memory.

‘Singularity should also result in machine consciousness, but since consciousness is not well-defined, we can’t be precise about it.’

Scientists’ predictions about when the singularity will occur have been tracked over the years, with a trend towards closer and closer predictions as AI has continued to surpass expectations

When will the singularity arrive?

Earliest prediction: 2026

Investor prediction: 2030

Consensus prediction: 2040–2050

Pre-ChatGPT predictions: 2060 at the earliest

What is the earliest the singularity might arrive?

While the vast majority of AI experts now believe the singularity is inevitable, they differ wildly in when they think it might come.

The most radical prediction comes from the chief executive and founder of leading AI firm Anthropic, Dario Amodei.

In an essay titled ‘Machines of Loving Grace’, Mr Amodei predicts that the singularity will arrive as early as 2026.

He says that this AI will be ‘smarter than a Nobel Prize winner across most relevant fields’ and will ‘absorb information and generate actions at roughly 10x–100x human speed’.

And he is not alone with his bold predictions. 

Elon Musk, CEO of Tesla and Grok creator xAI, also recently predicted that superintelligence would arrive next year.

Speaking during a wide-ranging interview on X in 2024, Mr Musk said: ‘If you define AGI (artificial general intelligence) as smarter than the smartest human, I think it’s probably next year, within two years.’

CEO and founder of AI firm Anthropic, Dario Amodei (pictured), predicted in an essay that AI could become superintelligent as early as 2026

Likewise, Sam Altman, CEO of ChatGPT creator OpenAI, claimed in a 2024 essay: ‘It is possible that we will have superintelligence in a few thousand days.’

Since a thousand days is a little under three years, that would place the arrival of the singularity any time from about 2027 onwards.

How likely is it that these predictions are right?

Although these predictions are extreme, these tech leaders’ optimism is not entirely unfounded.

Mr Dilmegani says: ‘GenAI’s capabilities exceeded most experts’ expectations and pushed singularity expectations earlier.’

The key to this re-evaluation of AI’s potential is that the power of leading AI models has grown exponentially, roughly doubling once every seven months.
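As a rough illustration of what that rate implies (a back-of-the-envelope sketch based only on the seven-month doubling figure quoted above, not on data from the AIMultiple report), a capability that doubles every seven months grows roughly threefold in a year and more than a hundredfold in four years:

```python
# Back-of-the-envelope sketch: growth implied by a seven-month doubling time.
# The seven-month figure is the one quoted in the article; the rest is
# simple arithmetic, not data from the AIMultiple report.

DOUBLING_TIME_MONTHS = 7

def growth_factor(months: float) -> float:
    """How many times larger a capability becomes after `months`,
    assuming it doubles every DOUBLING_TIME_MONTHS."""
    return 2 ** (months / DOUBLING_TIME_MONTHS)

for years in (1, 2, 4):
    print(f"After {years} year(s): ~{growth_factor(years * 12):.0f}x")

# Prints roughly: 3x after one year, 11x after two, 116x after four.
```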

If this exponential growth starts to accelerate, it could kick off a chain reaction that would lead to a sudden intelligence explosion.

For this reason, some AI leaders think that the singularity could arrive incredibly quickly once the right conditions are met.

Sam Altman (pictured), CEO of ChatGPT creator OpenAI, has argued that AI will surpass humanity by 2027–2028 at the earliest

The confidence of tech leaders is based on the rapid increase in the power of AI models. These graphs show how various types of ‘Large Language Models’ have rapidly increased their computing power over the last decade

What is Artificial General Intelligence (AGI)?

Artificial General Intelligence (AGI) is currently seen as one of the necessary preconditions for the singularity.

AGI describes a type of computing system that is as good as a human at a wide range of different tasks.

At the moment, some AIs are better than humans at certain tasks but not at every task. Experts call this Narrow Artificial Intelligence.

Once AGI is achieved, experts think it will be between two and 30 years until the AI surpasses the collective intelligence of humanity.  

However, the expert consensus is that the singularity will not arrive for many more years.

Mr Dilmegani says that if the singularity does arrive next year, he will ‘happily print our article about the topic and eat it.’

Over-optimism about AI is nothing new, and countless exaggerated predictions have been proven wrong in the past.

For example, Geoffrey Hinton, the so-called ‘godfather of AI’, predicted that hospitals would not need radiologists by 2021 because they would be replaced by AI.

Even as far back as 1965, AI pioneer Herbert Simon boldly claimed: ‘Machines will be capable, within twenty years, of doing any work a man can do.’

As Mr Dilmegani points out, AI’s capabilities are currently nowhere near what the human mind is capable of.

However, business leaders like Mr Altman and Mr Amodei have good reasons to overstate how soon the singularity will arrive.

Mr Dilmegani says: ‘An earlier singularity timeline places current AI leaders as the ultimate leaders of industry.

Based on the rapid growth of computational power available, Elon Musk has predicted that AI will surpass humanity by the end of this year

The support for this theory is that the abilities of AI appear to have grown exponentially. This graph shows how the speed of AI agents has doubled once every seven months

AI expert Cem Dilmegani, of AIMultiple, told Daily Mail that he would print out and eat his research if Elon Musk’s predictions (illustrated) came true

‘The company that reaches singularity could always remain as the world’s most valuable company.

‘This optimism fuels investment, and both of these CEOs run loss-making companies that depend on investor confidence.’

When is the singularity likely to arrive?

To get a picture of when the singularity is really likely to occur, Mr Dilmegani and his colleagues combined surveys covering 8,590 AI experts.

This showed that, although predictions have moved a lot closer since the release of ChatGPT, most experts think the singularity is probably about 20 years away.

For the singularity to occur, experts say that AI will first need to reach a state known as Artificial General Intelligence (AGI), in which it has human-like abilities across a wide range of tasks.

Although the results of some studies vary, the consensus is that this will happen by around 2040.

Some groups, like investors, were more confident and placed this moment a little earlier, usually centring on some point around 2030.

Most experts believe that the singularity will arrive sometime around 2040–2060, while investors are more bullish, putting this point around 2030. The overall consensus is that the AI singularity is coming, but not next year

Once AGI has been achieved, the experts think that the singularity will follow very quickly as the AI rapidly evolves to reach ‘superintelligence’. 

In one poll, scientists assigned a 10 per cent probability to the singularity arriving two years after AGI and a 75 per cent chance of this happening within the next 30 years.

However, although the experts’ predictions put the singularity much later than leaders like Mr Altman and Mr Musk suggest, most agree that it is coming.

That may mean humanity does not have much longer to enjoy our position as the smartest creatures on Earth.

Elon Musk’s hatred of AI explained: Billionaire believes it will spell the end of humans – a fear Stephen Hawking shared

Elon Musk pictured in 2022

Elon Musk wants to push technology to its absolute limit, from space travel to self-driving cars — but he draws the line at artificial intelligence. 

The billionaire first shared his distaste for AI in 2014, calling it humanity’s ‘biggest existential threat’ and comparing it to ‘summoning the demon’.

At the time, Musk also revealed he was investing in AI companies not to make money but to keep an eye on the technology in case it gets out of hand. 

His main fear is that, in the wrong hands, advanced AI could overtake humans and spell the end of mankind – a turning point known as The Singularity.

That concern is shared among many brilliant minds, including the late Stephen Hawking, who told the BBC in 2014: ‘The development of full artificial intelligence could spell the end of the human race.

‘It would take off on its own and redesign itself at an ever-increasing rate.’ 

Despite his fear of AI, Musk has invested in the San Francisco-based AI group Vicarious, in DeepMind – which has since been acquired by Google – and in OpenAI, the creator of the popular ChatGPT program that has taken the world by storm in recent months.

During a 2016 interview, Musk said that he had helped create OpenAI to ‘have democratisation of AI technology to make it widely available’.

Musk founded OpenAI with Sam Altman, the company’s CEO, but in 2018 the billionaire attempted to take control of the start-up.

His request was rejected, forcing him to quit OpenAI and move on with his other projects.

In November 2022, OpenAI launched ChatGPT, which became an instant success worldwide.

The chatbot uses ‘large language model’ software to train itself by scouring a massive amount of text data so it can learn to generate eerily human-like text in response to a given prompt. 

ChatGPT is used to write research papers, books, news articles, emails and more.

But while Altman is basking in its glory, Musk is attacking ChatGPT.

He says the AI is ‘woke’ and deviates from OpenAI’s original non-profit mission.

‘OpenAI was created as an open source (which is why I named it ‘Open’ AI), non-profit company to serve as a counterweight to Google, but now it has become a closed source, maximum-profit company effectively controlled by Microsoft,’ Musk tweeted in February 2023.

The Singularity is making waves worldwide as artificial intelligence advances in ways only seen in science fiction – but what does it actually mean?

In simple terms, it describes a hypothetical future where technology surpasses human intelligence and changes the path of our evolution.

Experts have said that once AI reaches this point, it will be able to innovate much faster than humans. 

There are two ways the advancement could play out, with the first leading to humans and machines working together to create a world better suited for humanity.

For example, humans could scan their consciousness and store it in a computer in which they could live forever.

The second scenario is that AI becomes more powerful than humans, taking control and making humans its slaves – but if that ever happens, it is far off in the distant future.

Researchers are now looking for signs of AI reaching The Singularity, such as the technology’s ability to translate speech with the accuracy of a human and perform tasks faster.

Former Google engineer Ray Kurzweil predicts it will be reached by 2045.

He has made 147 predictions about technology advancements since the early 1990s – and 86 per cent have been correct. 
