Donald Trump has officially signed the Take It Down Act into law, giving victims of deepfake pornography the right to take legal action against those responsible.
With artificial intelligence tools becoming more widespread, it’s become alarmingly easy for bad actors to generate fake but convincing explicit images of real people, including celebrities.
One major case that sparked outrage was when AI-generated sexual images of Taylor Swift went viral last year. The incident led to public outcry, with fans demanding laws to stop this kind of digital abuse.
Many of these AI-manipulated images end up being shared on social media or uploaded to adult sites, causing serious emotional harm to the people depicted, especially when they never gave permission in the first place.
Now, under this new act signed by Trump, victims finally have legal tools to respond and hold platforms and perpetrators accountable.
What is the Take It Down Act?
The Take It Down Act was created to tackle the fast-growing problem of non-consensual explicit content, especially when fueled by artificial intelligence.
Congress explained the purpose of the new law as follows: “This bill generally prohibits the nonconsensual online publication of intimate visual depictions of individuals, both authentic and computer-generated, and requires certain online platforms to promptly remove such depictions upon receiving notice of their existence.”
How does it work?
Although the conversation has largely focused on AI-generated images, the law applies more broadly. It covers any intimate photo or video shared without the subject's consent, commonly known as "revenge porn," and makes publishing it a federal offense.
Under the new law, anyone involved in making or spreading these images faces serious consequences. That includes required restitution for victims and the possibility of jail time, fines, or both.
"Threats to publish intimate visual depictions of a subject are similarly prohibited under the bill and subject to criminal penalties," the bill continues.
The law also puts pressure on tech companies and platforms. They'll now be required to remove reported non-consensual content within 48 hours of receiving a victim's notice, or face potential penalties.
In addition, platforms will have roughly a year to set up proper systems that let users report this type of content and request its removal.