November 25, 2025
If someone posted (or threatened to post) private photos or an AI “deepfake” of you, it probably felt like your life went sideways in seconds. For California residents who live a lot of life online, that fear is very real.
The TAKE IT DOWN Act is a new federal law aimed at stopping the spread of non-consensual intimate images (often called NCII), including AI-generated deepfakes, and giving victims a faster way to get that content removed.
In this guide, we’ll walk through what the law covers, who it protects, the penalties involved, and how the takedown process works. The goal is to give you clear, calm, and practical information so you can decide what to do next.
The full name is a mouthful: the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act. Lawmakers from both parties backed it, and it passed Congress with overwhelming support.
At its core, the law does two big things:
This matters a lot in a state like California, where social media, content creation, and AI tools are part of everyday life.
Abuse involving deepfakes and “nudify” tools has exploded. One major trust-and-safety company reported that AI-generated non-consensual intimate imagery surged 105% between 2022 and 2024, more than doubling in just two years.
For many victims, especially women, minors, and LGBTQ+ youth, these images lead to:
The TAKE IT DOWN Act is Congress’s attempt to catch up with that reality.
Under the Act, NCII covers more than just “revenge porn.”
It can include:
A key point: even if you once consented to take the photo or send it privately, the law treats public posting without consent as a different act. Consent to create or share in a private context does not equal consent to publish it to the world.
The law also carves out specific exceptions, such as:
The Act protects both adults and minors, but offenses involving minors carry harsher penalties because of the added harm.
If explicit or intimate images of you go online without your consent, you may be protected, whether those images are:
The person behind the upload, or even someone who re-shares it, can face criminal charges if certain conditions are met.
The law is even tougher when the person in the images is under 18:
There’s also protection for people who are now over 18 but whose intimate content was created when they were minors. That’s especially important for college students and young workers in California trying to move forward from something that happened in high school.
Parents or guardians can often file on a minor’s behalf, though there may be situations where a teen files directly to protect their privacy.
The criminal side of the TAKE IT DOWN Act is meant to deter abuse and give prosecutors stronger tools. Here’s a simplified view of the potential penalties:
Publishing real intimate images (adult victim): up to 2 years in federal prison, plus fines.
Publishing real intimate images (minor victim): up to 3 years in prison, plus fines.
Publishing deepfake or AI-generated NCII (adult or minor): similar ranges, up to 2–3 years depending on the victim’s age and the facts.
Threatening to publish NCII (extortion or coercion, such as “send more or I’ll post this”): for adults, up to 2 years in prison. Threats involving minors or deepfakes can carry higher ranges (up to 30 months).
On top of prison and fines, courts can order:
For a California victim, this means the person who harmed you faces real federal consequences, not just a slap on the wrist.
The most victim-focused part of the law is the takedown system. It’s designed to give you a faster, lower-barrier path to getting NCII removed.
Here’s how that process generally works for covered platforms:
Before you submit a request, gather:
You do not need a court order or a lawyer just to file a takedown request.
Platforms must provide clear instructions, usually a form or a dedicated reporting page where you can flag NCII and request removal.
When you submit, be as specific as you can. Include every link or account where you see the content.
Once a platform receives a valid request, it has about 48 hours to:
If a platform drags its feet or ignores the request, it risks enforcement by the Federal Trade Commission (FTC), including civil penalties that can reach roughly $51,000 per violation.
Bad actors sometimes repost under new accounts or move to other sites. You can:
The law applies to a wide range of services, not just big social networks.
Covered platforms include:
Platforms must:
For victims, this means more leverage. For platforms, it creates serious operational pressure, especially because the law allows enforcement even where traditional Section 230 protections would have blocked some claims.
No law is perfect, and experts have pointed out some real concerns.
What does this mean for you?
The TAKE IT DOWN Act is federal, but Californians also have state-level protections, including:
In many cases, both state and federal law can apply. A California case might involve:
This is one of the reasons many people choose to talk to a California-based attorney who understands both systems.
If you wake up to find your photo or a deepfake of you online, here are some immediate steps:
You don’t need a lawyer to submit a basic TAKE IT DOWN request, but legal help can make a big difference when:
A California attorney with experience in digital privacy, juvenile issues, and online harassment can help you:
On DefendCA, you can explore our pages on juvenile defense, cybercrime, and online harassment, and reach out for a confidential consultation if you or your child need help.
If someone shares or fabricates intimate images of you, it can feel like they’ve taken control of your story. The TAKE IT DOWN Act doesn’t fix every problem, but it gives you:
You don’t have to navigate this alone. If you’re in California and dealing with non-consensual images, deepfakes, or online threats, consider talking with a lawyer who understands the new law and how to protect your digital identity.