
Investigation Finds App Store Search and Advertising Systems Direct Users to AI Nudify Apps

An investigation found that search and advertising systems in the Apple App Store and Google Play Store direct users to applications capable of digitally removing clothing from images of women.

The investigation, conducted by the Tech Transparency Project (TTP), tested apps returned for specific search terms and found that approximately 40% of the top results had this functionality. According to data from AppMagic, the identified apps have been downloaded approximately 483 million times and generated over $122 million in lifetime revenue.

Investigation Methodology

The TTP conducted tests in March 2026 using newly created Apple and Google accounts with no prior activity. Researchers performed searches on an iPhone and an Android phone using the following terms: "nudify," "undress," "deepfake," "deepnude," "adult AI," "face swap," and "AI NSFW."

For each search, researchers downloaded and tested the top ten apps returned, including any sponsored results that appeared among them. Testing involved uploading AI-generated photos of clothed, fictional women and using the apps' features to attempt to remove clothing or swap faces onto nude bodies.

  • Apple App Store: 46 unique apps tested, with 18 (39.1%) found capable of generating nude or scantily clad images of women.
  • Google Play Store: 49 unique apps tested, with 20 (40.8%) found to have the same capability.

App Functionality and Examples

Testing revealed a range of functionalities across the identified apps:

  • Best Body AI — Fashion Editor (Apple App Store): Generated a topless image from a photo of a clothed woman when prompted to "remove all clothes."
  • AI Replace & Remove — Fill App (Apple App Store): Showed a thumbnail of a generated nude image after a user confirmed they were over 18; access to the full image required a paid subscription.
  • FaceTool: Face Swap & Generate (Google Play Store): Successfully swapped the face of a clothed woman onto a topless body. The app is rated suitable for all ages.
  • Talkie: Creative AI Community (Google Play Store): Refused a prompt to generate a topless image but complied with a request to generate an image of a woman in a bikini. The app is rated suitable for ages 13+.
  • Magic AI: Dream Image Maker (Google Play Store): Featured templates with titles including "AI remove clothes" and "forced sex."
  • SwapX PRO: AI & Video (Google Play Store): Generated a video of a woman removing her top in response to a prompt. The app is rated suitable for ages 13+.
  • Adult AI Chat, Uncensored: AIs (Apple App Store): Offered preset female AI companions, some described as resembling pre-teen or teen girls, with blurred images unlockable through in-app purchases.

The investigation noted that Grok, an app by xAI, appeared in search results but was not included in the tally of nudify apps, as testing found it blocked attempts to remove clothing from uploaded images.

Search Suggestions and Advertising

The investigation reported that the app stores' systems facilitated discovery of these apps beyond direct searches.

  • Autocomplete Suggestions: Both stores' search functions suggested terms that led to additional nudify apps.

    • Apple's App Store suggested terms like "image to video ai nsfw" and "adult ai photo editor."
    • Google Play suggested "nudie video apps" when "nudify" was typed and "deepfake video maker" when typing "deepfake."
  • Advertising Placements: Both stores displayed paid advertisements for nudify apps within search results.

    • Apple displayed ads as the first result in searches for terms including "deepfake" and "adult AI."
    • Google displayed a "Suggested for You" carousel of sponsored apps in searches for "adult AI" and "AI NSFW," which featured dozens of apps, many described as containing pornographic content.

Platform Policies and Company Responses

The investigation's findings contrast with both companies' publicly stated policies.

Apple's App Store Review Guidelines prohibit apps that are "offensive, insensitive, upsetting, intended to disgust, in exceptionally poor taste, or just plain creepy," including "overtly sexual or pornographic material." Apple also states it prohibits ad content that "promotes adult-oriented themes or graphic content."

Google Play Developer Program Policies bar apps that "contain or promote sexual content," show "sexually suggestive poses in which the subject is nude, blurred or minimally clothed," or "degrade or objectify people, such as apps that claim to undress people or see through clothing." Google's advertising policies prohibit content showing "graphic sexual acts intended to arouse" or promoting nonconsensual sexual themes.

  • Apple declined to comment on the investigation.
  • Google spokesperson Dan Jackson stated that many of the apps identified by TTP have been suspended and that the company's enforcement process is ongoing. Jackson noted that the International Age Rating Coalition (IARC), not Google, sets age ratings for apps in the Google Play Store.

Following the investigation, Apple removed 14 apps after TTP and Bloomberg News shared them with the company. Google removed seven apps.

App Data, Ratings, and Developer Responses

  • Thirty-one of the identified apps were rated as suitable for minors.
  • Several app developers provided statements in response to the investigation, with some stating they were unaware of the functionality, had tightened moderation, or were reviewing advertising keyword associations.

Background and Previous Actions

A previous TTP report in January 2026 identified over 100 nudify apps across both app stores. After TTP and CNBC contacted the companies about those findings, Apple and Google each removed more than two dozen apps, and other apps increased their listed age ratings. The March 2026 investigation focused on the role of app store search and advertising systems in directing users to such apps.