Someone Made an AI Deepfake Nude of Me: Is It Illegal, and How Do I Get It Taken Down Fast?
Finding an AI deepfake nude of yourself can hit like a punch. You might feel confused about what the law allows, scared of making a wrong move, and flooded with bad advice from social media. We wrote this to give you calm, plain-English steps so you can act fast, protect yourself, and keep your options open.
Table of Contents
- What changed recently (2025 to 2026)
- First: Are you safe and is anyone threatening you?
- If this is blackmail or sextortion, do this (and don’t do this)
- If you’re being stalked, harassed, or doxxed
- The fast takedown playbook (first 60 minutes)
- Preserve evidence (without spreading it)
- Report it where it’s hosted (the fastest win)
- Stop re-uploads: use hashing tools
- If you were under 18 in the image: use NCMEC Take It Down
- If you were 18+: use StopNCII.org
- How to get it removed from the internet (not just one site)
- Mirrors, re-uploads, and aggregator sites
- Social platforms vs adult sites: what to include in a report
- Escalation ladder if you’re ignored
- Is it illegal? Plain English, by region
- UK: deepfake nudes as intimate image abuse
- US: federal law plus state laws
- EU: platform duties and reporting under the DSA
- If you’re not sure where the uploader is located
- If you’re under 18 (or the image involves a minor)
- Why this happens (without turning this into a how-to)
- Recovery and prevention (so it doesn’t keep happening)
- Account security and privacy hardening
- What to tell your employer or school (short script)
- Mental health support
- What outcomes to expect (realistically)
- FAQs
- Is it illegal to make an AI deepfake nude of someone?
- Is it illegal to share or threaten to share deepfake nudes?
- What should we do in the first hour after we find it?
- How do we get a deepfake nude taken down fast?
- Can we remove it without sending the image to anyone?
- What is StopNCII and does it work for deepfakes?
- What is NCMEC Take It Down and who is it for?
- Should we pay if someone is threatening to post it?
- Disclaimer
This article focuses on non-consensual intimate imagery (NCII), meaning sexual images shared without consent. People also call it intimate image abuse, image-based sexual abuse (IBSA), “fake nudes,” “deepfake nude images,” or “AI-generated nudes.”
What changed recently (2025 to 2026)
The law and platform duties have moved quickly.
In the UK, the government has said it is fast-tracking rules that make it illegal to create, or request the creation of, deepfake intimate images of adults without consent, and it has linked this work to enforcement under the Online Safety Act.
In the US, the federal TAKE IT DOWN Act (S.146) creates new national rules that cover both authentic and computer-generated intimate images and requires a notice-and-removal process for certain platforms.
In the EU, the Digital Services Act (DSA) strengthens “flag illegal content” systems and requires platforms to respond and explain decisions, with appeal options.
First: Are you safe and is anyone threatening you?
If someone is threatening you, treat it as a safety issue first. Don’t argue, don’t pay, and don’t send more images. Save the messages, lock down accounts, and report the threat and the content. When threats include money or demands, that’s often sextortion, and police can help.
If this is blackmail or sextortion, do this (and don’t do this)
Sextortion is a type of blackmail where someone threatens to share sexual images unless you pay or do something you don’t want to do.
Do:
- Save the threat messages, usernames, and timestamps.
- Stop replying once you’ve saved evidence.
- Report the account on the platform and report the threat to police.
Don’t:
- Don’t pay. Payment often leads to more demands.
- Don’t send new photos “to prove it’s you.”
- Don’t post the deepfake to “explain” it. That can spread it.
If you’re being stalked, harassed, or doxxed
Doxxing means sharing your private details (like address or workplace) to scare or target you. Treat it as urgent if you feel unsafe, if someone knows where you live, or if the person is escalating.
Save the evidence and ask your platform to act on harassment and impersonation too. Many removals happen faster when you report the full pattern, not just one image.

The fast takedown playbook (first 60 minutes)
Speed matters. So does evidence. We want both.
Preserve evidence (without spreading it)
Start a simple log. Think of it like keeping receipts when something goes wrong with a delivery.
Capture:
- The exact URL of the post or page.
- Usernames, profile links, and channel names.
- Timestamps, dates, and time zone if shown.
- Screenshots that show the image plus the page header and URL bar.
If you know how, save a copy of the page source or archive the page, but don’t take risks or break site rules to do it. Your goal is a clean record that shows what existed and where.
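For the log itself, anything works: a notes app, a spreadsheet, or, if you’re comfortable with a little code, a small script. This is only a sketch; the filename, columns, and example values are all placeholders you’d replace with your own.

```python
import csv
from datetime import datetime, timezone
from pathlib import Path

# Hypothetical filename; keep the log somewhere private and backed up.
LOG_FILE = Path("evidence_log.csv")

def log_sighting(url: str, username: str, note: str = "") -> None:
    """Append one sighting to the evidence log with a UTC timestamp."""
    new_file = not LOG_FILE.exists()
    with LOG_FILE.open("a", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        if new_file:
            # Write the header row the first time the log is created.
            writer.writerow(["logged_at_utc", "url", "username", "note"])
        writer.writerow([datetime.now(timezone.utc).isoformat(), url, username, note])

# Example entry (hypothetical URL and handle):
log_sighting("https://example.com/post/123", "uploader_handle",
             "first sighting, screenshot saved")
```

The point is not the tool; it’s that every sighting gets a timestamp, a URL, and a note, recorded once and never edited afterward.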
Report it where it’s hosted (the fastest win)
The hosting site or platform is often the quickest “stop the bleeding” point.
Use the site’s report flow, then also email if they list:
- abuse@
- support@
- legal@
Use the words platforms understand:
- “non-consensual intimate imagery (NCII)”
- “intimate image abuse”
- “AI deepfake nude” or “deepfake nude images”
- “posted without consent”
- “requesting removal”
If the site asks whether the content is “real,” don’t get stuck there. Lead with consent. Many rules focus on lack of consent, not whether the image started from a real nude.
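Putting those phrases together, a short report might look like the sketch below. The template text is an example only; fill in your own details, and keep the tone factual.

```python
# A minimal takedown-request template, assuming you fill in your own details.
TEMPLATE = """Subject: Removal request - non-consensual intimate imagery (NCII)

To whom it may concern,

I am reporting non-consensual intimate imagery posted without my consent.
The content is an AI deepfake nude depicting me.

URL: {url}
Uploader: {username}
First seen: {first_seen}

I did not consent to the creation or sharing of this content.
I am requesting removal of this post and any copies.
"""

# Hypothetical example values:
print(TEMPLATE.format(
    url="https://example.com/post/123",
    username="uploader_handle",
    first_seen="2026-01-15 14:32 UTC",
))
```

Short, factual, and consent-led beats long and emotional: moderators process these faster when the request maps directly onto their policy language.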
Stop re-uploads: use hashing tools
A hash (or hash value) is a “digital fingerprint”: a short code computed from an image. Matching copies produce the same code, so platforms can spot repeats without needing to see your image again.
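To see the idea in principle, here is a small sketch using Python’s standard cryptographic hash. This is an illustration of “same input, same fingerprint” only; tools like StopNCII and Take It Down use image-matching hashes built for this purpose, which can also catch lightly edited copies.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a SHA-256 'digital fingerprint' of the given bytes."""
    return hashlib.sha256(data).hexdigest()

# Byte-identical copies always produce the same hash, so a platform can
# recognise a re-upload without ever storing or viewing the image itself.
original  = b"...example image bytes..."
re_upload = b"...example image bytes..."
edited    = b"...example image bytes (cropped)..."

print(fingerprint(original) == fingerprint(re_upload))  # True: exact copy matches
print(fingerprint(original) == fingerprint(edited))     # False: any change breaks the match
```

That last line is why dedicated image-matching hashes exist: a plain cryptographic hash breaks on any edit, while perceptual hashes are designed to still match a resized or lightly cropped copy.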
Two tools matter most:
If you were under 18 in the image: use NCMEC Take It Down
Take It Down is for nude, partially nude, or sexually explicit images or videos made when you were under 18. You can stay anonymous and you don’t have to send the image to anyone. The tool makes a hash value on your device, and only the hash is shared.
That “under 18” line matters even if the image is synthetic media. In the UK, indecent images of under-18s can be illegal even if they are fake or “pseudo-photographs.”
If you were 18+: use StopNCII.org
StopNCII.org creates hashes (digital fingerprints) of selected images on your device. It says it does not download your images and shares only the hash with participating companies.
This can help with repeated uploads across participating platforms. It’s not magic, but it can cut the spread.
How to get it removed from the internet (not just one site)
One deletion rarely ends it. Re-uploads happen. Mirrors happen. Group chats happen.
We push a three-lane approach:
- Remove at the source (host or platform).
- Block repeats (hashing).
- Reduce visibility (search and social).
Mirrors, re-uploads, and aggregator sites
“Aggregator” sites copy content from elsewhere. “Mirrors” do the same to dodge takedowns.
When you report, include:
- The first URL you found.
- Any mirror URLs you found later.
- A note that the same image is spreading.
Platforms act faster when they can see a pattern.
Social platforms vs adult sites: what to include in a report
Social platforms often care about harassment, impersonation, bullying, and NCII.
Adult sites often focus on consent, age, and “illegal content.” Keep it simple:
- “This is me.”
- “I didn’t consent.”
- “It’s an AI deepfake nude.”
- “Remove it and any copies.”
Escalation ladder if you’re ignored
If you get no reply, escalate in order. Keep your log updated.
- abuse@ email and the report form
- legal@ email
- hosting provider or CDN (content delivery network)
- domain registrar
- regulator or law enforcement
In the EU, the DSA requires easy flagging systems and a response path, plus reasons for removals and appeal options. That gives you a stronger “you must respond” posture with EU-facing services.
Is it illegal? Plain English, by region
Laws vary by place and by age. The short answer is often “yes,” but the details matter.
UK: deepfake nudes as intimate image abuse
In the UK, it can be illegal to share or threaten to share intimate images without permission, and this can include deepfake images. It can also be illegal to create, or ask someone to create, fake intimate images without permission.
Recent government statements also point toward making it illegal to create, or request the creation of, deepfake intimate images of adults without consent, with links to Online Safety Act enforcement priorities.
US: federal law plus state laws
The TAKE IT DOWN Act (S.146) is a federal law aimed at non-consensual intimate imagery, including computer-generated depictions. It requires a notice-and-removal process for certain covered platforms, with a 48-hour removal expectation after a valid notice.
States may also have their own NCII or “digital forgery” rules. That’s why a local lawyer can be helpful if you’re considering a report or claim.
EU: platform duties and reporting under the DSA
The DSA gives users clearer ways to report illegal content and requires platforms to respond and explain decisions, with ways to appeal.
That doesn’t mean every report succeeds. It does mean platforms should not ignore you.
If you’re not sure where the uploader is located
Focus on where the content is hosted and where it’s seen. Takedowns usually start with the platform, not the person.
If you later identify the uploader, you can then think about police reports or civil steps. Don’t wait for that to start removal.

If you’re under 18 (or the image involves a minor)
Move fast. Keep adults in the loop. Use the right tools.
Start with Take It Down and keep your evidence log. If classmates are sharing, the school can act on safeguarding and discipline even while platforms and police handle the wider piece.
Why this happens (without turning this into a how-to)
A deepfake is a type of synthetic media. It uses generative AI to make an image look real when it isn’t.
Many “nudification” or “undressing” websites take an ordinary photo and generate a sexual version. They often spread through links, saved screenshots, and re-uploads across different services.
The key point: you didn’t cause this by having photos online. The fault sits with the person who made or shared the content.
Recovery and prevention (so it doesn’t keep happening)
Lock down access. Reduce new material. Plan what you’ll say.
Account security and privacy hardening
Turn on two-factor login. Change passwords. Check for unknown logins.
Limit who can download, tag, or share your content. Remove old public photos if you think someone is pulling “seed images” from your profile.
What to tell your employer or school (short script)
Keep it plain. Keep it factual.
“I’m dealing with non-consensual intimate imagery made with AI. I’m working on takedown requests. If anything appears, please don’t share it. Please send me the URL and a screenshot of the page header only.”
Mental health support
This can feel like sexual abuse because it uses your likeness in a sexual way without consent. If you feel panicked, numb, or unsafe, ask for support from someone you trust.
What outcomes to expect (realistically)
Some takedowns happen in hours. Others take days. Mirrors can stretch it to weeks.
Expect progress, not perfection. Your log, your reports, and hashing tools give you the best chance of cutting spread quickly.
FAQs
Is it illegal to make an AI deepfake nude of someone?
Often yes, but the exact rule depends on where you live and the ages involved. In the UK, police guidance treats creating fake intimate images without permission as illegal in many cases. In the US, federal and state laws may apply, including newer rules covering computer-generated depictions.
Even where “creation” is tricky to prove, sharing or threatening to share often triggers clearer offences.
Is it illegal to share or threaten to share deepfake nudes?
In many places, yes. UK police guidance says it’s illegal to share or threaten to share intimate images without permission and this includes deepfake images. Threats linked to demands can also be blackmail or sextortion. In the US, newer federal rules also target posting or threats around NCII and deepfakes.
Save the threat messages. They often matter more than the image itself.
What should we do in the first hour after we find it?
Act fast, but don’t spread it. Save URLs, usernames, and timestamps, then take screenshots that show the page and address bar. Start a simple log. Report the post to the hosting site first, then use hashing tools to help block repeat uploads on participating platforms.
If threats are involved, treat it as safety first and report it.
How do we get a deepfake nude taken down fast?
Start where it’s hosted. Send a clear takedown request that says “non-consensual intimate imagery” and “posted without consent,” with links and screenshots. Then report copies on each platform where it appears. If you’re under 18 in the content, use Take It Down. If you’re 18+, use StopNCII.org to create a hash.
Fast wins come from clean evidence and repeat reporting, not long arguments.
Can we remove it without sending the image to anyone?
Sometimes, yes. Take It Down says you won’t have to send your images or videos to anyone and the image doesn’t leave your device because it creates a hash value locally. StopNCII.org also says it generates hashes on your device and shares only the hash, not the image itself.
That privacy detail can make reporting feel less overwhelming.
What is StopNCII and does it work for deepfakes?
StopNCII.org is a tool that creates a hash, meaning a digital fingerprint, from an intimate image on your device. It shares the hash with participating companies so they can spot and remove matching uploads. It can help with repeated posting, but it won’t cover every site and it won’t replace direct takedown requests.
Use it alongside reports to the host and major platforms.
What is NCMEC Take It Down and who is it for?
Take It Down is a free service for images or videos made when you were under 18. It can help remove or stop sharing on participating public or unencrypted platforms. It creates a unique hash value on your device, and only that hash is shared. You can stay anonymous.
If the content relates to adulthood, the service points adults toward StopNCII.org.
Should we pay if someone is threatening to post it?
Paying is risky and often makes things worse. Police guidance treats blackmail and sextortion as crimes, and payment can lead to more demands. Save the messages, stop replying after you capture evidence, and report the account and the threat. Ask the platform to act on harassment and extortion-linked threats.
If you feel unsafe, involve someone you trust and contact police.
Disclaimer
Mandatory disclaimer: This article is general information, not legal advice. It doesn’t create a lawyer-client relationship. Laws can change, and details matter, so consider getting advice from a qualified lawyer for your situation.


