
“Take It Down Act”: A new U.S. federal law against non-consensual deepfake pornography


At a time when artificial intelligence is advancing by leaps and bounds, the risks and challenges that come with it are growing too. One of the most alarming is the use of the technology to create sexually explicit fake images without the consent of the person depicted. This phenomenon, known as deepfake porn, has affected thousands of women and teenagers, and even public figures like Taylor Swift. But a new federal law finally promises to change the rules of the game.

On May 19, President Donald Trump signed the Take It Down Act into law, legislation that makes it a federal crime to publish, or threaten to publish, non-consensual intimate images, including those generated with artificial intelligence. What was once regulated by only a handful of states is now a national measure with real consequences for those who break the law.

What is the Take It Down Act?

The full name of the law is the “Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act” (hence the acronym, TAKE IT DOWN). In short, it seeks to protect people, especially women and minors, from the digital abuse that occurs when intimate images, whether real or artificially created, are published or disseminated without their consent.

The law imposes fines and even prison sentences on those who commit this type of crime. It also requires websites, digital platforms and apps to remove reported images within 48 hours of a victim’s request, and to take steps to ensure the content does not resurface in copies or altered versions.

Melania Trump and the power of the first lady

One of the most visible champions of the law was the first lady, Melania Trump, who used her political clout to push the bill through Congress. Although her signature on the document was symbolic (first ladies are not elected officials), she took part in the official ceremony in the White House Rose Garden, and her involvement was key to the bill’s bipartisan approval.

In her speech at the ceremony, Melania described artificial intelligence and social media as “the digital candy of the new generation,” warning that, however harmless they may seem, these tools can be weaponized in ways that harm children’s emotional and cognitive health.

Her involvement aligns with her “Be Best” campaign, an initiative she launched during Trump’s first term focused on children’s well-being, social media use and the fight against opioid abuse.

A real problem affecting thousands

The Take It Down Act is not an isolated response: it is a reaction to a growing and very real problem. In January 2024, for example, sexually explicit fake images of singer Taylor Swift, created with artificial intelligence, went viral on social media. The scandal forced X (formerly Twitter) to temporarily block searches for her name.

But this is not a problem that affects only celebrities. Many teenagers across the United States have been victims of this type of content. One of the visible faces of the campaign for the law was Elliston Berry, a young woman from Texas who was targeted with a deepfake a schoolmate generated from one of her Instagram photos. The first lady invited her to the presidential address to Congress, and she also attended the bill signing.

“Every day I lived in fear that those images would resurface,” Berry said in an interview. “With this law, I won’t have to live with that fear anymore.”

What does the law say and why is it important?

The Take It Down Act marks a turning point in U.S. law. Until now, federal law protected only minors against sexual deepfakes; there was no clear regulation protecting adults.

Now, anyone can report the misuse of intimate images and demand their removal.

The law also obliges technology platforms to act quickly, something that until now was far from uniform. Although some companies, such as Meta, Google and TikTok, had already set up forms for requesting the removal of this type of content, not all platforms cooperated or responded with the necessary urgency. Under the new law, that is no longer optional.

Even companies like Apple and Google have begun removing from their stores apps and services that manipulate images to simulate nudity. The message is clear: there is no longer room to look the other way.

Criticism and concerns

Although the law passed almost unanimously (409 votes in favor and only 2 against in the House of Representatives), it is not without critics. Some digital rights groups and free speech advocates warn that the legislation may be overly broad and could lead to unintended censorship, sweeping in legal content such as LGBTQ+ material or consensual pornography.

Others fear it could be used as a pretext to monitor private communications or impose limits on free speech. The challenge now will be to enforce the law without trampling fundamental freedoms, while keeping the focus on protecting victims.

A necessary step in the face of a new type of violence

In the words of Ilana Beller of Public Citizen: “Non-consensual intimate deepfakes are a clear harm with no social benefit. It’s the kind of content that simply shouldn’t exist.”

And while no law is a silver bullet, the Take It Down Act represents a huge step toward a safer and more responsible internet. It is, as Melania Trump said, a national victory and, more importantly, a way of telling victims: you are not alone.