What are “nudify” apps and why are they dangerous?
“Nudify” apps use AI to strip clothing from images or swap a real person’s face onto a nude body. The result looks real. The subject never consents.
The Tech Transparency Project found 55 such apps on Google Play and 47 on Apple’s App Store. Many appeared under search terms like “nudify” and “undress.” The group tested them using clothed images of women.
These are not novelty filters. According to TTP, the apps are designed for non-consensual sexualization. They enable abuse in seconds.
What did Apple and Google do after the report?
After being contacted, Apple removed 28 apps. Some returned after developers submitted revised versions. TTP later found that only 24 were actually gone.
Google said it suspended several apps for policy violations. It did not share a number. The company said its review is ongoing.
Both stores already ban this kind of app. Google forbids apps that “undress people.” Apple bans overtly sexual content. Yet dozens passed review.
Why this problem is growing fast
New AI models lower the barrier. Anyone can now generate realistic nude images with a simple app. CNBC previously tracked a case in the U.S. in which more than 80 women were targeted using similar tools. No criminal charges were filed.
TTP says these apps have over 700 million downloads worldwide. They generated $117 million in revenue. Apple and Google take a cut.
Fourteen of the apps were traced back to China. That raises data risks: Chinese law allows government access to company data, meaning intimate deepfakes could travel far beyond the user’s control.
Policy pressure is building
Lawmakers have urged Apple and Google to remove such services. The European Commission has opened an investigation into X over similar AI-generated imagery. Even Grok has acknowledged “lapses in safeguards.”
Yet the core issue remains. App stores still surface tools that can weaponize images.
Nudify apps show how fast AI can outpace platform safety. Until enforcement becomes real-time, ordinary photos will stay vulnerable.