Meta is testing a new feature that gives it access to the photos in your phone’s camera roll, including ones you have never shared. The company says it is not yet training AI models on these unpublished images, but it remains vague about its future plans and the rights it retains over that data.
The issue surfaced when Facebook users trying to post Stories saw prompts asking if they wanted to opt into “cloud processing,” which would regularly upload selected photos from their camera roll to Meta’s cloud to enable features like AI-made collages, recaps, and themed suggestions (birthdays, graduations, and so on).
Users who accept give Meta permission to analyze “media and facial features,” dates, and objects in these private photos. They also grant Meta the right to “retain and use” this info.
Meta has confirmed it previously scraped all public posts made by adult users on Facebook and Instagram since 2007 to train its generative AI models, though its definitions of “public” and “adult user” remain murky. In this case, Meta insists it is not yet training AI on your unpublished photos.
Ryan Daniels, a Meta public affairs manager, told The Verge:
“[The Verge’s headline] implies we are currently training our AI models with these photos, which we aren’t. This test doesn’t use people’s photos to improve or train our AI models.”
Meta communications manager Maria Cubeta added:
“We’re exploring ways to make content sharing easier for people on Facebook by testing suggestions of ready-to-share and curated content from a person’s camera roll. These suggestions are opt-in only and only shown to you – unless you decide to share them – and can be turned off at any time. Camera roll media may be used to improve these suggestions, but are not used to improve AI models in this test.”
Google Photos explicitly states that it doesn’t use users’ personal photos for AI training. Meta’s AI terms, by contrast, don’t say whether unpublished camera roll photos accessed via cloud processing could later be used as training data, and Meta declined to clarify the point.
Users who opt in reportedly allow Meta to retrieve only their last 30 days of photos, though Meta admits cloud suggestions may draw on older media for themes like weddings or pets. Facebook users can turn off camera roll cloud processing and have their photos removed from the cloud after 30 days.
The feature is drawing backlash for reaching into photos users never chose to post. Reddit posts show that some users found AI restyling suggestions applied to older photos without their prior knowledge, including one case in which a user’s wedding photos were “Studio Ghiblified.”
The test raises fresh privacy concerns about Meta’s expanding AI data pipeline.
Correction: An earlier version implied Meta was already training AI on these photos. Meta now says this test does not use them for AI training yet. Additional Meta statements have been included.