A new investigation has placed Apple under pressure after revealing the widespread presence of AI-powered “nudify” apps on the App Store. According to a report by the Tech Transparency Project, dozens of apps still allow users to create non-consensual sexual images using artificial intelligence. While other tech firms have faced backlash over similar tools, Apple now finds itself in the spotlight for failing to act quickly enough.
The apps, often surfaced through simple searches such as “nudify” or “undress,” are marketed as harmless image-editing tools. Researchers found, however, that many could digitally remove clothing or generate explicit content despite their stated restrictions.

Millions of Downloads and Serious Risks
The scale of the issue is striking. Investigators identified 47 such apps on the App Store, with millions of downloads worldwide and substantial revenue. Through their standard commissions on app sales and in-app purchases, Apple and Google reportedly earned millions of dollars from these apps.
More troubling, several of the apps appeared accessible to teenagers, and some carried age ratings deeming them suitable for users under 18. This raised fears of harm to minors, as well as the targeting of women through deepfake-style imagery.
Gaps in Enforcement and Review
Apple’s App Store Review Guidelines explicitly prohibit offensive, sexual, or exploitative content, and they require strong safeguards around user-generated material. Yet the report suggests that many of these apps passed Apple’s review process without meaningful barriers in place.
Although Apple removed 28 of the apps after being alerted, critics argue the response came too late. Apple also declined to comment publicly on the investigation, adding to concerns about transparency.
Trust, Ethics, and Apple’s Next Move
Apple has long marketed itself as a company that prioritizes user safety and privacy, but critics now say it risks damaging that reputation. The report notes that Apple directly profited from these apps, which undermines its moral authority on harmful uses of AI.
As AI tools grow more powerful, pressure continues to mount on Apple to strengthen enforcement, improve app review, and address abuses head-on. How Apple responds may ultimately shape public trust in its App Store and in its broader approach to artificial intelligence.