Revenge Porn Bill Passes Congress with Melania Trump’s Support

Your Life Buzz
Apr 29

In a rare show of bipartisan unity, Congress has passed the Take It Down Act, a bill aimed at stopping the spread of non-consensual intimate imagery and deepfake pornography online.

The legislation sailed through the House in a 409–2 vote and cleared the Senate unanimously. It now heads to President Donald Trump's desk.

Melania Trump Pushes for Action on Revenge Porn

The Take It Down Act—officially titled the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act—makes it a crime to distribute non-consensual intimate images (NCII), whether they are real or created with AI.

Under the bill, websites must remove flagged content within 48 hours of a victim’s request. Platforms that fail to act could face heavy fines or criminal penalties.

First Lady Melania Trump publicly backed the bill, tying it to her longtime Be Best initiative, which promotes online safety.

"Today's bipartisan passage of the Take It Down Act is a powerful statement that we stand united in protecting the dignity, privacy, and safety of our children," Melania Trump said.

Deepfake Pornography: Unexpected Weapon of Abuse

In just a few years, deepfake technology has gone from harmless internet entertainment to a serious tool for exploitation.

By 2021, teenagers, especially girls, were discovering their faces edited into pornographic videos they had no part in creating.

The fake videos spread quickly across porn sites and social media, looking so real that even moderators often missed them. With no clear laws to lean on, victims had almost no way to fight back.

Gaps in the Law Left Victims Powerless

The rise of AI-generated abuse exposed serious holes in federal law. Victims had no reliable way to get images taken down, and prosecutors struggled to build cases.

Because deepfakes aren’t necessarily made from hacked or stolen images, existing revenge-porn laws often didn’t apply.

Public hearings showed just how widespread the problem had become—girls as young as thirteen were being targeted, with abusers using nothing more than photos scraped from social media.

Law enforcement was swamped. Many cases involved anonymous users and platforms that lacked the tools—or the urgency—to respond.

States Stepped In First, But Protections Fell Short

By late 2023, states like California and New York had passed laws to crack down on deepfake pornography. But because each state took a different approach, protections varied widely, and in some parts of the country victims were still left with no legal safety net.

The pressure for a nationwide response kept growing. Eventually, it pushed Congress to draft what would become the Take It Down Act.

A Federal Shift in How the Law Treats Digital Exploitation

The Take It Down Act puts deepfake porn and non-consensual intimate imagery in the same legal category as other forms of sexual exploitation. It gives victims a way to fight back—and holds platforms and offenders accountable, even when the content is AI-generated.

Senators Ted Cruz and Amy Klobuchar, who co-sponsored the legislation, say the law finally brings federal protections up to speed with the realities of the internet.

"As the dark side of technology advances, these unspeakable evils become part of the culture, and the law has to keep up," said Cruz during a Senate briefing.

The bill rapidly gained traction with Melania Trump's endorsement and strong bipartisan backing.

It includes strict obligations for online platforms. Repeat noncompliance could lead to enforcement by the Federal Trade Commission and Department of Justice.

In April 2025, the bill passed Congress with near-unanimous support; it now awaits the President's signature.

Tech Giants and Civil Liberties Groups Clash Over Revenge Porn Bill

Much of the tech industry has lined up behind the bill. Meta, Snap, and other companies have voiced support, calling the law a balanced approach to platform accountability.

Not everyone is on board, however. Civil liberties groups, including the Electronic Frontier Foundation, warn it could lead to government overreach. They argue that the broad language might censor legal content, weaken encryption, or force platforms to over-police user posts.

What the 'Take It Down Act' Means for Victims of Non-Consensual Imagery

For victims of non-consensual intimate imagery, the Take It Down Act finally offers a clear path to justice. It gives them the right to demand removal of explicit or AI-generated content that uses their likeness without consent.

"This is just the beginning," said Klobuchar. "The digital world is changing fast, but we can't leave people behind when it comes to safety."

The bill—passed with overwhelming bipartisan support and backed by First Lady Melania Trump—now awaits President Trump’s signature. If signed into law, it would mark a long-overdue shift in how the U.S. confronts digital exploitation.
