President Donald Trump, seated next to first lady Melania Trump and joined by lawmakers and victims of AI deepfakes and revenge porn, holds a copy of the Take It Down Act during a signing ceremony in the Rose Garden of the White House on May 19. Photo: Getty Images
Now over 100 days into his second term, President Trump has signed among the fewest bills of any president at this point in a term. But on Monday, Trump signed the bipartisan Take It Down Act, which criminalizes distribution of nonconsensual intimate imagery, including AI-generated, deepfake “revenge porn.” The new law makes distribution of this material a federal crime punishable by prison time; it also requires online platforms to establish request-and-removal systems that allow victims to have such photos taken down within 48 hours. The legislation comes at a crucial moment: AI-based cyber sexual harassment is rapidly on the rise, and until now, no federal law addressed this crisis. Yet the Take It Down Act has been met with a fairly mixed reception.
“Take It Down’s success rests entirely on the shoulders of survivors,” Jenna Sherman, campaign director at the feminist organization UltraViolet, told Jezebel. “Survivors wrote the bill, championed the bill, and created the conditions for the bill’s passing. For that reason, UltraViolet is first and foremost meeting this moment with gratitude, pride, and celebration.” Sherman also celebrated that the bill “sets a precedent for the legal system moving quicker to keep pace with technology, especially when it comes to abuse.” But still, Sherman says her organization “is simultaneously meeting this moment with concern and caution.”
The new legislation could also “be used as a weapon for the MAGA regime, and any future administrations that abuse powers,” she said. “What’s more, we are concerned about selective interpretation of this law—it’s not lost on us that some of the tech industry’s biggest offenders of platforming and even profiting off non-consensual sexual deepfakes include some of Trump’s current closest allies.” Sherman also warned that “the law’s lack of adequate safeguards against false reporting could expose users to censorship of content that platform CEOs or politicians simply don’t like.”
To Sherman’s point, when President Trump signed the bill, he explicitly said, “I’m going to use that bill for myself, too, if you don’t mind, because nobody gets treated worse than I do online. Nobody.” Earlier this week, The 19th noted that Trump is notorious for his sweeping attacks on free speech and threats to his critics, creating concern “that the bill could be used to remove critical political speech, especially in the context of a wider crackdown by the current administration.” Trump’s statement “demonstrates the risk of political leaders using this bill for greater power and control rather than for protecting survivors,” Sherman said. In a similar vein, laws like the Kids Online Safety Act (KOSA), another bipartisan bill, claim to address unsafe, obscene material online for the protection of children. But advocates warn that bill will more likely be used to censor information and resources about LGBTQ identity and abortion access.
Trump’s invocation of survivors to push his own agenda—all while being, himself, a legally recognized sexual assailant and serially accused rapist—is entirely in line with the rest of his agenda, sexual violence researcher Dr. Nicole Bedera told Jezebel. This, after all, is the same president whose horrific campaigns against trans people and immigrants are both premised around the bigoted narrative that he’s protecting cis women and girls. “Trump using sexual abuse survivors to justify attacking his enemies—that’s what he does all the time,” Bedera said.
In some ways, Take It Down presents “a promising first step” on the matter of protecting people from cyber sexual abuse and nonconsensual deepfake porn—if the legislation does everything it claims to, Bedera said. But as it currently exists, there are concerning holes.
The law builds on existing federal code involving nonconsensual sexual images, defining these images as including “the uncovered genitals, pubic area, anus or post-pubescent female nipple of an identifiable individual” as well as graphic sexual intercourse, masturbation, and “graphic or simulated lascivious exhibition” of anuses, genitals, or the pubic area. But as I wrote earlier this week, concerningly, the law states that this imagery must be “indistinguishable from an authentic visual depiction” of an identifiable individual—a considerable threshold that could exclude many victims. Bedera also noted that, all too often, legislation aimed at helping victims “tends to be really ambiguous and give people in power a huge amount of discretion in if and how they implement these laws.” When application of these laws is left up to entities like online platforms, which get to decide their own request-and-removal systems for deepfake sexual abuse material, Bedera fears that those entities “will promote their own interests, which all too often means protecting abusive men.”
“There’s a lot of potential to do good if this law is in the right hands, but it needs to be far more specific about these requirements,” Bedera said.
As a sexual violence researcher, Bedera has conducted interviews with numerous survivors of cyber sexual abuse who have struggled to get nonconsensually posted sexual images of themselves, both real and AI-generated, taken down—but upon reporting this content, they find that “the definitions tech companies have for what they will or won’t remove are so narrow.” The Take It Down Act, Bedera argues, reflects the same problem: The law’s stringent emphasis on nudity and highly graphic sexuality as requirements for material to be taken down is likely to exclude many victims.
“This framework is so focused on graphic nudity when it should be focused on consent—if someone’s posting images of you, certainly sexual images, without your consent, it shouldn’t matter the degree of nudity. That’s violating and should just be taken down,” Bedera said. Many of the victims she’s interviewed said they’re being impersonated, harassed, or subjected to sexualized online disinformation campaigns about themselves—but not always with photos that involve full nudity. Consequently, they aren’t getting the help and support they need from online platforms.
Meanwhile, the people who often do find their posts taken down are women, and sometimes sex workers, who consensually post images of themselves, only to have that content flagged for removal on these same platforms, even when it adheres to the platforms’ standards of conduct. “There are real risks to free speech that come with this law’s enactment,” Sherman said. “Because this law does not contain adequate safeguards against false reporting, it could be used to censor content that platform CEOs or politicians simply don’t like. … And who are the people putting out sexual content or information about reproductive health? Primarily women, gender-expansive people, and sex workers, who are also most likely to be survivors themselves.” Organizations like the Electronic Frontier Foundation have also raised concerns about the new law and its potential impacts on sex workers. The law’s 48-hour takedown requirement, the organization says, may not give platforms enough time to verify whether reported content is actually nonconsensual, which could lead them to remove sexual content that was consensually posted for adult audiences.
The Take It Down Act’s criminal provisions take effect immediately, but online platforms will have one year to set up the required request-and-removal systems. As we collectively continue to try to understand the new law’s layers and complexities, I recommend holding on to resources like the Cyber Civil Rights Initiative, which offers a free, 24/7 hotline for victims of image-based sexual abuse at 1-844-878-2274, as well as a list of contacts to reach out to if you’re trying to remove such images from online platforms.
In any case, while Sherman praised some aspects of Take It Down, she didn’t mince words about the circumstances under which it’s been signed: “Trump is… a political figure mired in sexual assault scandals. This paradox sends a concerning, ironic message that laws actually don’t matter if you have enough power to evade them.”
If you or someone you know are experiencing sexual abuse or domestic violence and seeking support, you can reach the National Domestic Violence Hotline online or at their free 24/7 phone line at 1-800-799-7233.