Meta is cracking down on the AI “nudify” app Crush AI. The company just sued Joy Timeline HK, Crush AI’s maker, for running thousands of ads across Meta’s platforms while dodging ad reviews.
The lawsuit, filed in Hong Kong, says the app pushed “AI undresser” ads on Facebook and Instagram despite repeated removals. Meta called out Joy Timeline HK for ignoring ad policy and cycling through dozens of advertiser accounts with names like “Eraser Anyone’s Clothes” plus numbers. At one point, Crush AI even had a Facebook page promoting its service.
The app uses generative AI to make fake explicit images of real people without their consent. According to the Faked Up newsletter’s Alexios Mantzarlis, Crush AI ran over 8,000 ads in just the first two weeks of 2025, and around 90% of its web traffic came from Facebook or Instagram.
The problem grew as links to AI undressing apps spread across social platforms, including X and Reddit. YouTube reportedly served millions of ads for similar services. Meta and TikTok banned keyword searches tied to these apps but struggled to remove them entirely.
In a blog post, Meta said it has developed new technology to identify ads for AI nudify or undressing services “even when the ads themselves don’t include nudity.”
The company said it is now using matching technology to help find and remove copycat ads more quickly, and has expanded the list of terms, phrases and emoji that are flagged by its systems.
Meta said it is also applying the tactics it has traditionally used to disrupt networks of bad actors to these new networks of accounts running ads for AI nudify services. Since the start of 2025, Meta said, it has disrupted four separate networks promoting these services.
Outside of its apps, the company said it will begin sharing information about AI nudify apps through the Tech Coalition’s Lantern program, a collective effort among Google, Meta, Snap and other companies to prevent child sexual exploitation online. Meta says it has provided more than 3,800 unique URLs to the network since March.
On the legislative front, Meta said it would “continue to support legislation that empowers parents to oversee and approve their teens’ app downloads.” The company previously backed the US Take It Down Act and said it is now working with lawmakers to implement it.