Apple has removed a series of applications from its App Store following a new investigation by the Tech Transparency Project (TTP), which revealed that the platform’s own search and advertising systems were actively directing users toward "nu*ify" tools. These apps use artificial intelligence to generate non-consensual nude images, often bypassing the safety protocols both Apple and Google claim to enforce. While many of the flagged apps have been taken down, the report highlights a systemic failure in how app store algorithms and sponsored results handle explicit queries.
The investigation, released on Wednesday, suggests that both the Apple App Store and Google Play Store have struggled to curb the proliferation of deepfake technology. According to TTP researchers, searching for terms such as “nu*ify,” “undress,” or “deepfake” frequently surfaced sponsored listings for apps capable of rendering women nude or in a state of partial undress.
In some instances, the App Store’s autocomplete function reportedly assisted the process. For example, typing "AI NS" prompted suggestions like "image to video ai nsfw," leading users directly to explicit content. The report found that approximately 40% of the top 10 apps returned for these specific search terms across both platforms had the capability to generate realistic, sexualized deepfakes.
Apple Removes Apps Following Safety Concerns
After the findings were published, Apple removed most of the apps identified by TTP. Although the company declined to comment officially on the report, the removals reflect an ongoing effort to address "overtly sexual or pornographic" material, which is strictly prohibited under Apple’s developer guidelines.
Testing conducted by TTP demonstrated the ease with which these apps operated. In one case, a sponsored app for "face swapping" allowed researchers to place the face of a clothed woman onto a topless body without any content moderation triggers. Some developers claimed they were unaware their AI models—some of which reportedly used xAI’s Grok—could produce such outputs, though they have since promised to tighten internal controls.
Apple and Google Profit From Explicit App Revenue
The financial scale of these apps remains significant. According to data from app analytics firm AppMagic, the nudity-related apps identified in the investigation have been downloaded 483 million times and have generated roughly USD 122 million in lifetime revenue. Since Apple and Google typically claim a commission of up to 30% on app earnings, the tech giants are facing criticism for indirectly profiting from software that violates their own safety policies.
Watchdog groups have pointed out that despite periodic "purges" of these apps, they often reappear under different names or with misleading descriptions to evade detection. This cycle has raised questions about the effectiveness of current automated review processes in the face of rapidly advancing generative AI.
Apple Faces Pressure Over Minor Safety and Age Ratings
A particularly concerning aspect of the report involves the age ratings assigned to these tools. TTP identified 31 "nu*ify" apps that were rated as suitable for minors, despite their primary function being the creation of sexually explicit imagery. This finding coincides with an increase in reported sexual deepfake incidents within educational institutions globally.
Regulators in the United States and Europe have already begun ratcheting up pressure on mobile marketplaces to implement more robust safeguards. As of Thursday, both Apple and Google are under increased pressure to transition from reactive removals to more proactive, AI-driven moderation that can identify the intent of an app before it reaches hundreds of millions of users.
(The above story first appeared on LatestLY on Apr 16, 2026 01:40 PM IST. For more news and updates on politics, world, sports, entertainment and lifestyle, log on to our website latestly.com).