Technology · Apr 16, 2026

Nudify AI Apps Alert: Apple and Google Profit From Deepfakes

Editorial Staff · The Tasalli


Summary

A new report shows that Apple and Google are still hosting and even promoting "nudify" apps on their official stores. These apps use artificial intelligence to create fake nude images or pornographic videos of real people. Despite having strict rules against such content, both companies have allowed dozens of these apps to remain available for download. Many of these tools were even given age ratings that make them accessible to children, raising major safety concerns.

Main Impact

The biggest issue highlighted in the report is that tech giants are not just failing to catch these apps; they are actively helping users find them. When people search for specific terms related to undressing or AI editing, the app stores often suggest these tools. In some cases, the stores even show paid advertisements for apps that create sexualized deepfakes. This means the platforms are directly profiting from software that can be used to harass or harm individuals without their consent.

Key Details

What Happened

The Tech Transparency Project (TTP) conducted a study to see how easy it is to find "nudify" apps on the iOS App Store and Google Play Store. They found that searching for words like "undress" or "nudify" led directly to apps designed to create sexual images. Even though both Apple and Google have policies that ban pornographic material, these apps bypassed the review process. Some apps used misleading names or descriptions to stay on the platform, while others were quite open about what they could do.

Important Numbers and Facts

The TTP report identified 18 of these apps on Apple’s store and 20 on Google’s store. Together, these apps have been downloaded more than 483 million times. They have also made a huge amount of money, generating roughly $122 million in total revenue. Perhaps the most shocking fact is that many of these apps were rated "E" for Everyone or "Teen." This rating suggests the apps are safe for young users, even though they can be used to generate adult content.

Background and Context

This problem is part of a larger trend involving AI deepfakes. Deepfakes are computer-generated fake images or videos that look very real. While AI can be used for harmless things like filters or art, it is also being used to create non-consensual sexual content, meaning someone's face is placed onto a different body without their permission. This technology has become a major tool for online bullying and harassment. Because the technology is moving so fast, app store moderators are struggling to keep up, or they are not looking closely enough at the apps they approve.

Public or Industry Reaction

After the report was released, both Apple and Google took some action. Apple told reporters that it removed 15 of the apps mentioned by the TTP. Google also stated that it suspended several apps that violated its rules. However, critics say this is not enough. Katie Paul, the director of the TTP, pointed out that the companies are not just failing to review the apps properly; they are also making money from them. Government officials are also starting to step in. In the United Kingdom, regulators are calling for a total ban on apps that can create fake sexual images of children. In the United States, several states and the federal government are looking at new laws to punish people who create or share these deepfakes.

What This Means Going Forward

The presence of these apps shows a wide gap between what tech companies say and what they actually do. Moving forward, Apple and Google will likely face more pressure to change how they screen for and block harmful AI tools. We may see more advanced automated systems used to scan apps for hidden features. Additionally, as more countries pass laws against deepfakes, these platforms could face legal trouble or heavy fines if they continue to host and promote software that breaks the law. For users, it is a reminder that app store age ratings are not always a reliable guide to what is safe.

Final Take

The ability to create fake, sexualized images with a single click is a serious threat to privacy and safety. While Apple and Google have the power to stop these apps from reaching millions of people, the current system is clearly failing. Removing a few apps after a report comes out is a temporary fix. To truly protect users, these tech giants must change their search algorithms and review processes to ensure that harmful AI tools are never promoted in the first place.

Frequently Asked Questions

What are nudify apps?

Nudify apps are mobile tools that use artificial intelligence to edit photos of people to make them appear nude. These are often used to create fake images without the person's consent.

Are these apps allowed on the App Store or Google Play?

No, both Apple and Google have strict policies against pornographic content and apps that create non-consensual sexual images. However, many of these apps manage to bypass the review process.

How can I protect myself from deepfake apps?

It is difficult to stop someone from using your public photos, but you can increase your privacy settings on social media. Additionally, reporting these apps to the store owners can help get them removed faster.