President Trump Signs “Take It Down Act” to Combat Deepfake Porn and Online Exploitation

President Donald Trump has signed into law the Take It Down Act, a major bipartisan effort to combat one of the internet’s most disturbing and rapidly growing threats: the spread of non-consensual intimate imagery, including AI-generated deepfakes and revenge porn.

Passed with overwhelming support, 409 to 2 in the House and by unanimous consent in the Senate, the new law marks a rare moment of unity in Washington and a significant step toward protecting individuals, particularly women and children, from digital sexual abuse.

The bill gained momentum in large part due to the efforts of First Lady Melania Trump, who revived her Be Best initiative to advocate for children’s safety online. She lobbied lawmakers, hosted roundtables with survivors, and made public appeals to pass the measure.

At the White House signing ceremony, President Trump invited her to add her signature to the legislation, saying, “She deserves to sign it.” The gesture was symbolic, since First Ladies hold no formal legislative authority, but it underscored her visible role in bringing the bill to life.

The Take It Down Act makes it a federal crime to knowingly publish or threaten to share sexually explicit images or videos of someone without their consent. This includes content generated or altered with artificial intelligence, such as deepfakes.

Platforms that host user-generated content, including social media sites, are now legally required to remove such material within 48 hours of a verified takedown request from a victim. They must also make reasonable efforts to delete duplicates and reposts. If children are depicted, violators can face up to three years in prison; for adult victims, the penalty can reach two years.

The law authorizes the Federal Trade Commission to enforce compliance, treating failure to act within the 48-hour window as an unfair or deceptive practice. It includes provisions to balance enforcement with free speech protections, specifying that images must meet a “reasonable person” standard to be considered indistinguishable from real ones. Exceptions are allowed for educational, medical, or law enforcement use, provided they are carried out in good faith.

Despite the law’s promise, it has drawn criticism from some digital rights advocates and legal scholars who warn of several shortcomings.

Most notably, the law is reactive: it steps in only after harm has already occurred. There are no requirements for platforms to proactively detect or block this type of content before it spreads. And because the law applies only to public-facing platforms, it leaves a blind spot for content shared within private forums, encrypted channels, or peer-to-peer networks, the very places where much of this material originates and circulates undetected.

Victims also bear the burden of initiating the takedown process, often while navigating the emotional trauma and humiliation of their exploitation. They must identify the content, prove it was shared without consent, and submit personal information: steps that can deter or re-traumatize those already harmed.

Critics also point to vague language that could be misused to silence lawful expression or censor legitimate content, especially in contexts involving LGBTQ+ representation, art, or journalism.

Another concern is that the law may be too lenient on perpetrators. If offenders claim the content caused no demonstrable harm or was shared as a matter of public concern, they might evade liability. The law does bar consent obtained through direct threats or coercion, but it does not account for more subtle forms of manipulation that often occur in unequal relationships.

Still, for many affected persons and advocates, the law offers a long-overdue acknowledgment of how devastating image-based sexual abuse can be, and a tool to fight back. Melania Trump called the act a “national victory” that will help protect children from the cognitive and emotional dangers of today’s digital landscape. President Trump added, “What’s happening online is just so horribly wrong. Today, we’re making it totally illegal.”

While the Take It Down Act is not perfect, it lays the groundwork for future reforms. Legal scholars and privacy advocates argue that lawmakers must go further, creating systems that prevent abuse before it happens and holding both platforms and offenders more fully accountable. In an age when a realistic AI-generated image can be made in minutes, swift and enforceable protections are not just a legal necessity; they are a moral imperative.
