TAKE IT DOWN Act? Yes, Take The Act Down Before It's Too Late For Online Speech

Federal legislation that would protect people from having explicit images of themselves posted and shared online without their consent is set to become law in the USA after passing the House on Monday.

But advocacy groups warn that the loosely worded fine print will cause collateral damage to internet companies and free speech rights.

The rationale for the TAKE IT DOWN Act - protecting people from the non-consensual disclosure of intimate imagery (NDII) online - has overwhelming support. The trouble is the lack of safeguards to prevent people from using the legislation to try to remove protected speech they don't like, and to bombard online companies with frivolous takedown requests.

Rep Thomas Massie (R-KY) was one of only two House members to vote against the bill, which passed 409-2. "I’m voting NO because I feel this is a slippery slope, ripe for abuse, with unintended consequences," he said on X.


TAKE IT DOWN is an acronym for Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act. Beyond prohibiting NDII, it requires the removal of flagged images within 48 hours. President Trump is expected to sign it into law now that it's passed both halves of Congress.

Jason Kelley, activism director for the Electronic Frontier Foundation (EFF), explained in a briefing note Monday that the bill has major flaws.

"The takedown provision in TAKE IT DOWN applies to a much broader category of content – potentially any images involving intimate or sexual content – than the narrower NCII definitions found elsewhere in the bill," he said. "The takedown provision also lacks critical safeguards against frivolous or bad-faith takedown requests."

The 48-hour response requirement compounds the problem of bad-faith takedown requests. As Kelley points out, the short window leaves websites, apps, and other online services with very little time to determine whether a challenged image is actually illegal, so many are expected to err on the side of caution. The likely result is preemptive restrictions on posting images, or overly broad censorship that sweeps up lawful visual material.

In a letter sent to lawmakers prior to the Senate's approval of the bill in February, the Center for Democracy & Technology, the EFF, and ten other advocacy groups said, "In its current form, the bill creates a notice and takedown (NTD) mechanism that would result in the removal of not just nonconsensual intimate imagery but also speech that is neither illegal nor actually NDII. This mechanism is likely unconstitutional and will undoubtedly have a censorious impact on users' free expression."

The Digital Millennium Copyright Act takedown mechanism, available to copyright holders, continues to see widespread abuse. There's no reason to believe TAKE IT DOWN removal requests will be any different given the bill's expansive scope.

President Trump has already stated that he intends to use the law for his own benefit. At his March 4, 2025, joint address to Congress, Trump said he looked forward to signing the bill, adding, "And I’m going to use that bill for myself too, if you don’t mind — because nobody gets treated worse than I do online. Nobody."

Encryption in firing line

The legislation also poses a threat to encryption by creating content removal obligations for service providers that don't have access to end-to-end encrypted user content. Although email services are exempt, direct messaging services, cloud storage services, and other private communication services are not.

"As currently drafted, the bill would create takedown obligations for providers of E2EE services that do not have access to the content that users share, store, or generate," the CDT letter says. "The TAKE IT DOWN Act, therefore, either would create an obligation to take down content to which a provider has no access — an impossible obligation — or incentivize content filtering that would break encryption."

The Cyber Civil Rights Initiative (CCRI), an advocacy group that was not a signatory to the CDT letter, on Monday expressed similar reservations about the legislation.

"CCRI repeatedly raised its concerns about the notice and removal provision with federal lawmakers in the hopes that significant revisions would be made to the bill prior to passage," the group said in a statement.

"While some of our suggested revisions were made, the takedown provision as passed by Congress today remains unconstitutionally vague, unconstitutionally overbroad, and lacking adequate safeguards against misuse."

All US states, except for South Carolina, have passed so-called "revenge porn" laws to address NDII. ®
