Facebook is starting to feed its AI with private, unpublished photos

For years, Meta has trained its AI programs on the billions of public images users have uploaded to Facebook and Instagram. Now, the company wants access to images users have not uploaded to those platforms. While Meta tells The Verge that it is not currently using these unuploaded photos for AI training, it declined to comment on future plans or on the rights it might assert over images in users’ camera rolls.

On Friday, TechCrunch reported that some Facebook users attempting to post through the Story feature have encountered pop-up prompts asking if they’d like to enable “cloud processing.” This option would allow Facebook to “select media from your camera roll and upload it to our cloud periodically,” in order to generate “creative concepts like collages, recap videos, AI restyling, or specific themes such as birthdays or graduations.”

Agreeing to this feature, the message outlines, means users consent to Meta AI terms, allowing the AI to examine “media and facial features” within those unpublished photos, as well as note when the photos were taken and identify any other individuals or objects present. Additionally, users grant Meta the right to “retain and utilize” this personal information.

Meta recently acknowledged that it scraped data from all the content published on Facebook and Instagram since 2007 to train its generative AI models. Though the company stated that it only used public posts from users over the age of 18, it has long been vague about exactly what “public” entails, as well as who counted as an “adult user” back in 2007.

Meta tells The Verge that, for now, it’s not training on your unpublished photos with this new feature. “[The Verge’s headline] implies we are currently training our AI models with these photos, which we aren’t. This test doesn’t use people’s photos to improve or train our AI models,” Meta public affairs manager Ryan Daniels tells The Verge.

Meta’s public stance is that the feature is “very early,” innocuous and entirely opt-in: “We’re exploring ways to make content sharing easier for people on Facebook by testing suggestions of ready-to-share and curated content from a person’s camera roll. These suggestions are opt-in only and only shown to you – unless you decide to share them – and can be turned off at any time. Camera roll media may be used to improve these suggestions, but are not used to improve AI models in this test,” reads a statement from Meta comms manager Maria Cubeta.

On its face, that might sound not altogether different from Google Photos, which similarly might suggest AI tweaks to your images after you opt into Google Gemini. But unlike Google, which explicitly states that it does not train generative AI models with personal data gleaned from Google Photos, Meta’s current AI usage terms, which have been in place since June 23, 2024, do not provide any clarity as to whether unpublished photos accessed through “cloud processing” are exempt from being used as training data — and Meta would not clear that up for us going forward.

And while Daniels and Cubeta tell The Verge that opting in only gives Meta permission to retrieve 30 days’ worth of your unpublished camera roll at a time, it appears that Meta is retaining some data longer than that. “Camera roll suggestions based on themes, such as pets, weddings and graduations, may include media that is older than 30 days,” Meta writes.

Thankfully, Facebook users can turn off camera roll cloud processing in their settings; doing so will also begin removing unpublished photos from the cloud after 30 days.

The feature marks a new incursion into our previously private data, one that bypasses the point of friction of deliberately deciding to post a photo for public consumption. And according to Reddit posts found by TechCrunch, Meta is already offering AI restyling suggestions on previously uploaded photos, even to users who weren’t aware of the feature: one user reported that Facebook had Studio Ghiblified her wedding photos without her knowledge.

Correction, June 27th: An earlier version of this story implied Meta was already training AI on these photos, but Meta now states that the current test does not yet do so. Also added statement and additional details from Meta.
