
ClothOff

ClothOff is a highly controversial AI-powered “nudify” platform that uses deep learning models, including GANs and diffusion algorithms, to digitally remove clothing from uploaded photos and create the illusion of a naked body. Accessible primarily via clothoff.net, it offers dedicated apps for Android, iOS, and macOS, along with tools for generating hyper-realistic nude images (DeepNude AI), custom undress videos with realistic motion and expressive effects, face swaps (standard, video, and porn-specific variants), multi-uploads, adjustable body parameters (such as breast and butt size), sex poses and sets, queue skipping, and an API for automated adult content creation.

The service markets itself as “Your TOP-1 Pocket Porn Studio” and “The New Porn Generator.” It provides free trials for basic photo and video undressing, with premium features unlocked through one-time purchases of VIP Coins (no subscriptions required) for higher-quality output, faster processing, and additional options. It promotes purported benefits like “health and freedom,” citing claims about masturbation and sexual activity, and features endorsements from adult industry figures.

ClothOff claims comprehensive privacy protections: “We do not save any data,” no distribution without consent, and technical safeguards that supposedly make processing images of minors impossible (with automatic account bans for attempts). It strictly prohibits non-consensual use, illegal activities, and content involving anyone under 18, stating that it condemns prohibited content and does not incite unlawful actions. The platform also partners with Asulabel to donate funds supporting victims of AI abuse.

Despite these assertions, ClothOff has faced widespread ethical condemnation and legal challenges for enabling non-consensual deepfake pornography and child sexual abuse material (CSAM). A major federal lawsuit in New Jersey, Jane Doe v. AI/Robotics Venture Strategy 3 Ltd. (the platform’s operator, registered in the British Virgin Islands and believed to be run from Belarus), alleges that the service facilitated the creation and distribution of hyper-realistic fake nudes of a minor derived from her social media photos. The suit invokes the TAKE IT DOWN Act to demand image removal, data destruction, prohibitions on AI training with the images, damages, and a potential shutdown. Supported by Yale Law clinics, the case highlights severe harms including bullying, harassment, and emotional distress.

Investigative reports from Der Spiegel, Ars Technica, Bellingcat, The Guardian, and others document the platform’s acquisition of multiple rival nudify services, link its operations to regions of the former Soviet Union (including Belarus), and expose its role in numerous abuse cases worldwide, particularly in schools. ClothOff has been blocked in Italy by the Data Protection Authority for unlawful data processing and restricted in other countries, banned from advertising on Meta platforms, and had its Telegram bot removed, yet it continues to operate with millions of monthly users amid growing demands for stricter AI regulation worldwide. ClothOff denies liability for user misconduct.

Last updated on December 18, 2025