Bryan Cranston and SAG-AFTRA say OpenAI is taking their deepfake concerns seriously

Since the unveiling of Sora 2’s AI-generated video technology last month, a wave of concern has swept across actors, studios, agents, and the actors’ union, SAG-AFTRA. A prominent point of contention has been the unauthorized use of celebrity likenesses, as exemplified by Bryan Cranston’s appearance in a video alongside a digital rendering of Michael Jackson. In response to these concerns, OpenAI, along with Cranston and the union, has issued a joint statement announcing that the company will tighten its opt-in policy to safeguard likeness and voice usage.

In the joint statement, OpenAI expressed regret for the unintentional creation of such videos. The message was also endorsed by major talent agencies, including United Talent Agency, the Association of Talent Agents, and the Creative Artists Agency, which had previously criticized the company for inadequate artist protections. OpenAI has not yet disclosed specific changes to the app, and the company did not respond to The Verge‘s request for comment before publication.

OpenAI has pledged to bolster protections for those who do not wish to participate, affirming that “all artists, performers, and individuals will have the right to determine how and whether they can be simulated.” The company also promised to swiftly address any complaints regarding policy violations.

While Bryan Cranston expressed gratitude towards OpenAI for its policy updates and enhanced safeguards, SAG-AFTRA president Sean Astin emphasized the need for legal protection against “massive misappropriation by replication technology.” He highlighted the proposed Nurture Originals, Foster Art, and Keep Entertainment Safe Act, better known as the NO FAKES Act, as a legislative step forward.

Initially, OpenAI had introduced Sora 2 with an opt-out policy for copyright holders. However, following public backlash and the emergence of controversial videos, such as a depiction of Nazi SpongeBob, the company reversed its stance. OpenAI has now committed to giving rightsholders more granular control over character generation, similar to the opt-in model for likeness, along with additional safeguards.
