
Globally, social media enthusiasts might have observed an increase in AI-crafted videos populating their timelines recently. Experts caution that this trend might deter some users from engaging with these platforms.

On September 30, OpenAI unveiled its newest video generation tool, Sora, touting it as more “physically accurate, realistic, and controllable” than any previous version.

“Sora 2 can achieve feats that were extraordinarily challenging, if not downright impossible, for older video generation models,” stated OpenAI.

Social media users around the world may have noticed a surge of AI-generated videos on their feeds in the past few weeks. (Instagram)

Sora is currently available on an invite-only basis, with the company planning to expand access to all users.

Numerous deepfakes have already surfaced, with clips of newborns dashing from hospitals, animals soaring into tornadoes, and elderly women being flipped over among the most popular.

These creations also feature prominent figures such as the late Queen Elizabeth II, Michael Jackson, and U.S. President Donald Trump.

Marc Cheong, deputy director of the Melbourne University Centre for AI and Digital Ethics, said these videos were a progression of the "AI slop", the low-quality content that clogs up people's feeds, that was being created with OpenAI's DALL-E program last year.

“Like Jesus Christ with a shrimp on the beach, or something along those lines; an impossible image,” he said.

An AI-generated image of Jesus Christ with a shrimp. (X)

Sora videos can be a little tricky to tell apart from real footage at first glance, particularly if they depict more realistic scenarios.

But an app-identifying watermark is included in all the videos.

Cheong added that users can count the number of fingers on a hand or check if timestamps actually change to decipher which videos have been generated by AI.

“Basically, if something is too good to be true, it’s important to just treat it with a healthy dose of scepticism and just look into it a bit more,” he said.

Sam Altman, co-founder and chief executive of OpenAI. (AP)

Cheong said he was not surprised by how realistic the videos were, but rather by the fact that the technology was available for the general public to use.

And some of the Sora deepfakes have already landed OpenAI in trouble. 

Sora users have targeted civil rights activist Martin Luther King Jr, with some videos showing him fighting fellow civil rights leader Malcolm X, delivering his famous “I have a dream” speech with obvious alterations, and appearing in reportedly racist depictions.

The videos prompted concern from his estate, which asked OpenAI to address the use of his likeness.

OpenAI acknowledged there had been “disrespectful” depictions and paused images of King late last week as it “strengthens guardrails for historical figures”.

“While there are strong free speech interests in depicting historical figures, OpenAI believes public figures and their families should ultimately have control over how their likeness is used,” the company said.

“Authorised representatives or estate owners can request that their likeness not be used in Sora cameos.”

Zelda Williams, the daughter of Robin Williams, was also forced to make a public plea for users to stop sending her deepfakes of the late comedian and actor.

Zelda Williams and Robin Williams in 2005. (Patrick McMullan via Getty Image)

“To watch the legacies of real people be condensed down to ‘this vaguely looks and sounds like them so that’s enough’, just so other people can churn out horrible TikTok slop puppeteering them is maddening,” she wrote in an Instagram story.

“You’re not making art, you’re making disgusting, over-processed hotdogs out of the lives of human beings, out of the history of art and music, and then shoving them down someone else’s throat hoping they’ll give you a little thumbs up and like it. Gross.”

King’s daughter Bernice followed with her own public plea: “I concur concerning my father. Please stop.”

With many of the deepfakes amounting to troll content, particularly those depicting well-known figures, Cheong said the videos raised issues around a deceased person's right of reply, consent and autonomy.

“In the event that these AI creations are used for monetary gain, then obviously it is an even more ethically tricky issue, because you’re profiting from the deceased,” he said.

Melbourne University Centre for AI and Digital Ethics Deputy Director Marc Cheong. (Supplied)

Cheong said the influx of AI slop may end up turning some people away from social media.

“But at the same time, social media is optimised for engagement,” he said.

“There will be users who continue to engage with this content because, inherently, social media gets its profits from your attention.”
