Bryan Cranston and SAG-AFTRA say OpenAI is taking their deepfake concerns seriously

Since the unveiling of Sora 2’s AI-generated video technology last month, a wave of concern has swept across actors, studios, agents, and the actors’ union, SAG-AFTRA. A prominent point of contention has been the unauthorized use of celebrity likenesses, as exemplified by Bryan Cranston’s appearance in a video alongside a digital rendering of Michael Jackson. In response to these concerns, OpenAI, along with Cranston and the union, has issued a joint statement announcing that the company will tighten its opt-in policy to safeguard likeness and voice usage.

In the joint statement, OpenAI expressed regret for the unintentional generation of such videos. The statement was also endorsed by major talent agencies, including United Talent Agency, the Association of Talent Agents, and Creative Artists Agency, all of which had previously criticized the company for inadequate artist protections. OpenAI has yet to disclose specific changes to the app, and the company did not respond to The Verge's request for comment before publication.

OpenAI has pledged to bolster protections for those who do not wish to participate, affirming that “all artists, performers, and individuals will have the right to determine how and whether they can be simulated.” The company also promised to swiftly address any complaints regarding policy violations.

While Bryan Cranston expressed gratitude towards OpenAI for its policy updates and enhanced safeguards, SAG-AFTRA president Sean Astin emphasized the need for legal protection against “massive misappropriation by replication technology.” He highlighted the proposed Nurture Originals, Foster Art, and Keep Entertainment Safe Act, better known as the NO FAKES Act, as a legislative step forward.

OpenAI initially launched Sora 2 with an opt-out policy for copyright holders. Following public backlash and the emergence of controversial videos, such as a depiction of Nazi SpongeBob, the company reversed course. OpenAI has now committed to giving rightsholders more granular control over how their characters are generated, similar to the opt-in model for likeness but with additional controls.
