The TAKE IT DOWN Act: A Federal Response to Non-Consensual Intimate Imagery and AI-Generated Deepfakes
On April 28, 2025, Congress passed the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act, or the TAKE IT DOWN Act. Having cleared both chambers with overwhelming bipartisan support (409–2 in the House and by unanimous consent in the Senate), the legislation now awaits the signature of President Donald J. Trump. While many hail the bill as a significant step toward protecting victims of digital sexual abuse, civil liberties advocates are raising red flags about its implications for speech, privacy, and the future of internet governance.
Statutory Purpose and Key Provisions
The TAKE IT DOWN Act is designed to address the proliferation of non-consensual intimate imagery (NCII), including sexually explicit content generated through artificial intelligence, commonly known as “deepfakes.” The Act makes it a federal crime to knowingly publish, or threaten to publish, an intimate visual depiction of an identifiable individual (whether authentic or artificially generated) without the depicted individual’s consent, where publication is intended to cause harm or in fact causes harm. Violations are punishable by fines and imprisonment. The statute reaches both real and AI-fabricated images, covers depictions of minors and adults alike, and imposes heightened penalties where the person depicted is a minor.
Notably, the bill imposes obligations not only on perpetrators, but also on intermediary platforms—websites, apps, and social media companies that host user-generated content. Upon receiving a takedown request from an individual depicted in such content, platforms are required to:
Remove the material within 48 hours of receiving notice.
Delete duplicative or derivative content, including through automated means.
Establish accessible complaint mechanisms for victims.
These obligations apply across the board: whether the image depicts a minor or an adult, and whether or not it was created using AI tools. Notably, the Act does not amend Section 230 of the Communications Decency Act (which shields platforms from liability for most user-posted content); instead, a platform’s failure to comply is treated as an “unfair or deceptive act or practice” enforceable by the Federal Trade Commission.
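The Act prescribes obligations, not implementations; how a platform tracks requests, deadlines, and re-uploads is left entirely to the operator. As a purely hypothetical illustration (none of the names or structures below come from the statute), this Python sketch models a takedown request with a 48-hour deadline and removes byte-identical re-uploads by content hash. Production systems would rely on perceptual hashing, such as PhotoDNA or PDQ, to also catch re-encoded or cropped derivatives.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
import hashlib

TAKEDOWN_WINDOW = timedelta(hours=48)  # statutory removal deadline


@dataclass
class HostedItem:
    """A piece of user-uploaded content as the platform stores it."""
    content_id: str
    data: bytes
    removed: bool = False


@dataclass
class TakedownRequest:
    """A victim's removal request (hypothetical schema, not statutory)."""
    content_id: str
    received_at: datetime


class TakedownQueue:
    def __init__(self, items: list[HostedItem]) -> None:
        self.items = {item.content_id: item for item in items}

    def deadline(self, req: TakedownRequest) -> datetime:
        """The 48-hour clock runs from receipt of the notice."""
        return req.received_at + TAKEDOWN_WINDOW

    def is_overdue(self, req: TakedownRequest, now: datetime) -> bool:
        item = self.items[req.content_id]
        return not item.removed and now > self.deadline(req)

    def remove_with_duplicates(self, req: TakedownRequest) -> list[str]:
        """Remove the reported item plus byte-identical copies.

        Exact SHA-256 matching only: catching re-encoded or cropped
        derivatives would require perceptual hashing, which is out of
        scope for this sketch.
        """
        target = self.items[req.content_id]
        fingerprint = hashlib.sha256(target.data).hexdigest()
        removed_ids = []
        for item in self.items.values():
            if hashlib.sha256(item.data).hexdigest() == fingerprint:
                item.removed = True
                removed_ids.append(item.content_id)
        return removed_ids


if __name__ == "__main__":
    queue = TakedownQueue([
        HostedItem("post-1", b"fake-image-bytes"),
        HostedItem("post-2", b"fake-image-bytes"),  # a re-upload
    ])
    req = TakedownRequest("post-1", datetime.now(timezone.utc))
    print(queue.remove_with_duplicates(req))   # ['post-1', 'post-2']
    print(queue.is_overdue(req, datetime.now(timezone.utc)))  # False
```

Note that the clock here starts at receipt of the victim’s request, mirroring the statutory trigger; the Act’s notice process also contemplates elements such as a signature and identifying information, which this sketch omits.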
Origins and Legislative Motive
The legislation emerged in response to a disturbing trend: the viral spread of AI-generated intimate imagery targeting minors. Two teenage victims, Elliston Berry of Texas and Francesca Mani of New Jersey, became central figures in the bill’s development after classmates used AI tools to create and circulate fabricated nude images of them.
The bill garnered support from across the ideological spectrum, including the American Principles Project, tech companies such as Meta (Facebook, Instagram) and Snap, and First Lady Melania Trump, who personally championed the law as part of her “Be Best” initiative. Industry endorsement was likely eased by the bill’s design: it leaves Section 230 untouched and channels enforcement through the Federal Trade Commission instead.
Legal and Constitutional Concerns
Despite its laudable objectives, the TAKE IT DOWN Act raises several constitutional and operational concerns:
1. First Amendment Implications
The Act’s language concerning “intimate visual depictions” is arguably overbroad and vague, risking suppression of lawful speech. As the Electronic Frontier Foundation (EFF) and other civil society groups note, the definition could encompass:
Journalistic images of public nudity or protest;
Law enforcement images used to identify suspects;
Artistic or educational content involving nudity;
Legal and consensual adult pornography misreported as NCII.
The absence of clear statutory carve-outs or procedural safeguards against misidentification may leave the law vulnerable to challenge under the First Amendment’s overbreadth doctrine, which invalidates statutes that chill a substantial amount of protected expression relative to their plainly legitimate sweep.
2. Lack of Procedural Due Process
By mandating takedowns within 48 hours of receiving a complaint—with no judicial oversight or verification standard—the law incentivizes preemptive removal without investigation. Smaller platforms may lack the resources to distinguish legitimate claims from bad-faith requests aimed at silencing critics or stifling political expression.
3. Compelled Monitoring and Encryption Risks
To comply with the takedown regime, platforms may be incentivized to proactively scan content for violations, including in encrypted environments. Such pressure risks undermining end-to-end encryption, a cornerstone of digital privacy and cybersecurity. And to the extent the government effectively compels monitoring of private communications, the scheme may also implicate the Fourth Amendment and federal privacy statutes.
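To make the tension concrete, here is a minimal sketch of why server-side scanning and end-to-end encryption are incompatible. It assumes the third-party cryptography package, with Fernet symmetric encryption standing in for a genuine end-to-end protocol; the blocklist and byte strings are invented for illustration.

```python
import hashlib
from cryptography.fernet import Fernet  # pip install cryptography

# Hypothetical blocklist of known-NCII fingerprints (plaintext hashes).
BLOCKLIST = {hashlib.sha256(b"known-abusive-image-bytes").hexdigest()}


def server_side_scan(stored_blob: bytes) -> bool:
    """All an intermediary holds for an E2EE message is ciphertext,
    so hashing it can never match a plaintext fingerprint list."""
    return hashlib.sha256(stored_blob).hexdigest() in BLOCKLIST


# The sender encrypts on-device; only the endpoints ever hold the key.
key = Fernet.generate_key()
ciphertext = Fernet(key).encrypt(b"known-abusive-image-bytes")

# False: matching can only happen client-side, before encryption,
# which is precisely the weakening of end-to-end guarantees that
# critics of mandated scanning warn against.
print(server_side_scan(ciphertext))
```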
FTC Enforcement and Practical Limits
The Act tasks the Federal Trade Commission with enforcement, a novel approach but one with real practical limits. The agency’s staff and budget are modest relative to the scale of online content, and its legal toolkit has narrowed in recent years. The FTC may therefore struggle to handle the volume and complexity of investigations required within the Act’s short timeframes.
Further, enforcement under Section 5 of the FTC Act (covering unfair or deceptive practices) creates legal ambiguity, since platforms may contest whether delayed takedowns truly constitute “unfair” business practices under established precedent.
Toward Balanced Reform
While the TAKE IT DOWN Act represents a serious attempt to address the rise in AI-generated sexual abuse, a more balanced and constitutionally sound approach would:
Refine definitions of NCII to exclude public interest content and clearly protected speech;
Implement procedural safeguards, such as penalties for fraudulent takedown requests and a counter-notice or appeal process (a sketch of such a flow follows this list);
Establish a federal civil remedy for victims without placing primary liability on platforms;
Strengthen enforcement of existing state and federal privacy and harassment laws.
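Nothing like the following appears in the Act; it is a hypothetical sketch of the counter-notice flow proposed above, loosely modeled on the DMCA’s Section 512 procedure, showing how a takedown, an appeal path, and a fraudulent-request penalty might fit together as a simple state machine.

```python
from enum import Enum, auto


class Status(Enum):
    FILED = auto()        # victim submits a removal request
    REMOVED = auto()      # platform takes the content down
    COUNTERED = auto()    # uploader disputes the claim
    REINSTATED = auto()   # claim not substantiated; content restored
    SANCTIONED = auto()   # request found fraudulent; penalty attaches


# Which follow-up states each state may legally move to.
VALID_TRANSITIONS = {
    Status.FILED: {Status.REMOVED},
    Status.REMOVED: {Status.COUNTERED, Status.SANCTIONED},
    Status.COUNTERED: {Status.REINSTATED, Status.SANCTIONED},
}


def advance(current: Status, new: Status) -> Status:
    if new not in VALID_TRANSITIONS.get(current, set()):
        raise ValueError(f"illegal transition: {current} -> {new}")
    return new


state = Status.FILED
state = advance(state, Status.REMOVED)     # takedown within 48 hours
state = advance(state, Status.COUNTERED)   # uploader appeals
state = advance(state, Status.REINSTATED)  # claimant fails to substantiate
print(state)  # Status.REINSTATED
```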
Congress would also do well to support technical and legal resources for victims, including funding for content removal tools and law enforcement training, rather than relying heavily on automated moderation systems and legal mandates.
Conclusion
The passage of the TAKE IT DOWN Act reflects Congress’s increasing willingness to regulate digital harms, especially those amplified by artificial intelligence. However, in its current form, the legislation threatens to infringe on fundamental rights, particularly free expression and due process, while placing significant compliance burdens on platforms large and small.
As the law moves toward implementation, courts, regulators, and, potentially, future legislative amendments will each play a decisive role in balancing victim protection against the preservation of lawful speech and digital privacy.