New Federal TAKE IT DOWN Act Gives Victims an Avenue for the Removal of Deepfake Images

On May 19, 2025, President Trump signed the TAKE IT DOWN Act (the “Act”) into law. TAKE IT DOWN doubles as an acronym for Tools to Address Known Exploitation by Immobilizing Technological Deepfakes On Websites and Networks. The Act targets non-consensual intimate imagery (“NCII”) by making it a federal crime to publish NCII of others. The Act also provides a mechanism to force “covered platforms” to remove any NCII hosted or shared on the platforms within 48 hours of receiving notice. In practical terms, the Act allows victims to have explicit images, whether taken or created by artificial intelligence (“AI”), removed from certain covered platforms.

The following is a summary of the Act’s key components.

Covered Platforms

The Act defines “covered platforms” as websites, online services, or mobile applications that serve the public and primarily provide a forum for user-generated content, such as social media platforms. It does not reach private forums or encrypted peer-to-peer services such as Signal. The definition also excludes broadband internet access service providers, e-mail providers, and websites, online services, online applications, or mobile applications that consist primarily of non-user-generated content preselected by the provider of the website, service, or application.

Criminal Liability for the Knowing Publication of NCII or Threats to Publish

The Act imposes criminal penalties for knowingly publishing, or threatening to publish, NCII, including AI-generated images that depict real people. These penalties include fines, forfeiture, restitution, and up to two years’ imprisonment.

The Act prohibits a knowing publication of NCII of an identifiable individual if (1) the visual depiction was obtained or created under circumstances in which the person knew or reasonably should have known the identifiable individual had a reasonable expectation of privacy; (2) what is depicted was not voluntarily exposed by the identifiable individual in a public or commercial setting; (3) what is depicted is not a matter of public concern; and (4) the publication is intended to cause harm or causes harm to the identifiable individual.

There are stricter criminal prohibitions for publishing authentic or AI-generated NCII of a minor, carrying a maximum prison sentence of three years. The Act excludes images already covered by existing criminal provisions applicable to depictions of children, meaning these special protections under the Act are meant to extend primarily to teenagers.

Individuals who intentionally threaten to publish NCII for the purpose of intimidation, coercion, extortion, or to create mental distress face penalties similar to those who knowingly publish NCII, except that threats involving AI-generated imagery carry slightly shorter maximum sentences.

Notice and Removal of NCII

The Act also requires covered platforms to establish a notice and removal process for NCII. Specifically, within one year of the Act’s effective date, covered platforms must establish a mechanism through which an identifiable individual, or their agent, can notify the platform of NCII depicting the individual and request its removal.

Covered platforms must also provide clear and conspicuous notice of their notice and removal process. The notice must be written in plain, easy-to-read language and must describe the platform’s responsibilities, including how an individual can submit a notification and request for removal.

Upon receiving a valid removal request, a covered platform must remove the NCII and make reasonable efforts to identify and remove any known identical copies no later than 48 hours after receiving the request. Failure to do so may trigger enforcement by the Federal Trade Commission (“FTC”), which would treat the violation as an unfair or deceptive act or practice that could result in civil penalties.

Exception for Individuals Acting in Good Faith

The Act exempts law enforcement, intelligence agencies, medical professionals, and others who disclose NCII reasonably and in good faith for a legitimate purpose, such as a document production associated with a legal proceeding or medical treatment, or to seek support or help after receiving an unsolicited intimate visual depiction.

If you would like assistance with, or have any questions about, complying with the TAKE IT DOWN Act, or need assistance reviewing your data protection practices to ensure compliance with the TAKE IT DOWN Act, please contact one of our Data Protection & AI attorneys.