The Take It Down Act is on its way to President Donald Trump’s desk after receiving overwhelming support in the House with a 409-2 vote. The bill requires social media companies to remove flagged nonconsensual sexual content, including images generated by AI. Trump has committed to signing it.
The bill is one of the few pieces of online safety legislation to clear both chambers amid ongoing concerns over deepfakes and child safety, but critics worry the law could be misused against content the administration or its supporters disapprove of. The legislation criminalizes the distribution of nonconsensual intimate images, whether authentic or AI-generated, and compels social media platforms to remove such content within 48 hours of it being reported. In an address to Congress this year, Trump joked that he intended to use the bill himself, pointing to his own unfavorable treatment online.
The surge of AI tools capable of producing realistic images has escalated worries about the spread of harmful deepfake content, particularly in schools, where it creates new avenues for bullying and harassment. Critics acknowledge the importance of addressing these concerns but fear that the approach taken by the Take It Down Act could be exploited to cause harm in unintended ways.
The Cyber Civil Rights Initiative (CCRI), which was created to combat image-based sexual abuse, said it can’t cheer the Take It Down Act’s passage. “While we welcome the long-overdue federal criminalization of NDII [the nonconsensual distribution of intimate images], we regret that it is combined with a takedown provision that is highly susceptible to misuse and will likely be counter-productive for victims,” the group writes. It fears that the bill, which puts enforcement in the hands of the Federal Trade Commission — whose Democratic minority commissioners Trump fired in a break with decades of Supreme Court precedent — will be selectively enforced in a way that ultimately only props up “unscrupulous platforms.”
“Platforms that feel confident that they are unlikely to be targeted by the FTC (for example, platforms that are closely aligned with the current administration) may feel emboldened to simply ignore reports of NDII,” they write. “Platforms attempting to identify authentic complaints may encounter a sea of false reports that could overwhelm their efforts and jeopardize their ability to operate at all.”
Because of the quick turnaround platforms have to remove content flagged as nonconsensual intimate imagery, the Electronic Frontier Foundation (EFF) warns that smaller platforms in particular “will have to comply so quickly to avoid legal risk that they won’t be able to verify claims.” Instead, they’ll likely turn to flawed filters to crack down on duplicates, the group writes. It also cautions that end-to-end encrypted services, including private messaging systems and cloud storage, are not exempted from the bill, posing a risk to the privacy technology. Since encrypted services can’t monitor what their users send to one another, the EFF asks, “How could such services comply with the takedown requests mandated in this bill? Platforms may respond by abandoning encryption entirely in order to be able to monitor content—turning private conversations into surveilled spaces,” including ones that abuse survivors commonly turn to.
Even so, the Take It Down Act quickly garnered a wide base of support. First Lady Melania Trump has become a leading champion of the bill, but it’s also seen backing from parent and youth advocates, as well as some in the tech industry. Google’s president of global affairs Kent Walker called the passage “a big step toward protecting individuals from nonconsensual explicit imagery,” and Snap similarly applauded the vote. Internet Works, a group whose members include medium-sized companies like Discord, Etsy, Reddit, Roblox, and others, praised the House vote, with executive director Peter Chandler saying the bill “would empower victims to remove NCII materials from the Internet and end the cycle of victimization by those who publish this heinous content.”
Rep. Thomas Massie (R-KY), one of two members (both Republican) who voted against the bill, wrote on X that he couldn’t support it because “I feel this is a slippery slope, ripe for abuse, with unintended consequences.”