
House Passes ‘Take It Down Act,’ the First Major AI-Related Federal Regulation

April 29, 2025

The U.S. House of Representatives passed the Take It Down Act Monday by an overwhelming bipartisan vote of 409 to 2, sending the first major piece of federal AI-related regulation to the president’s desk. The bill, originally introduced by Sens. Ted Cruz (R-Tex.) and Amy Klobuchar (D-Minn.), passed the Senate unanimously in February, and President Trump has vowed to quickly sign it into law.

The bill makes it a federal crime to share non-consensual sexually explicit images, including AI-generated deepfakes, and requires platforms to remove such images within 48 hours of a victim’s report. Nearly every state already has a similar law on its books, including 20 that explicitly target AI deepfakes. The passage of the Take It Down Act, however, gives federal prosecutors a weapon to use against purveyors of so-called revenge porn and deepfakes. First Lady Melania Trump campaigned heavily for the bill’s passage, as did several victims, including many children.

Despite its nearly unanimous support on Capitol Hill, the bill is not without its critics. The Electronic Frontier Foundation warned that the 48-hour takedown clock may not allow platforms, particularly smaller ones, sufficient time to verify that an image or video is, in fact, non-consensual and meets the law’s definition of sexually explicit. As a result, according to EFF, platforms are likely to turn to automated filters that are prone to flagging legal content, such as fair-use commentary and news reporting.

EFF also warned that platforms that provide end-to-end message encryption could be served takedown notices they cannot comply with, given the design of their systems, without breaking the encryption that is the core of their service offering.

Even the Cyber Civil Rights Initiative, an advocacy group that developed the model revenge-porn law adopted by many states, voiced reservations about the federal bill.

“While supportive of the bill’s criminal provision relating to authentic nonconsensual intimate images… CCRI has serious reservations about S. 146’s reporting and removal requirements,” the group said in a statement in March. “Encouraging speedy removal of nonconsensual intimate imagery from platforms is laudable, but the provision as written is unconstitutionally vague, making it difficult for individuals and platforms to understand what conduct is prohibited or required. The provision is also unconstitutionally overbroad, extending well beyond unlawful imagery.”

The Take It Down Act is separate from the NO FAKES Act, which was originally introduced last year in the previous Congress but never came to a vote in either the full House or Senate. It was re-introduced in the Senate in early April by Sens. Chris Coons (D-Del.) and Marsha Blackburn (R-Tenn.) and in the House by Reps. Madeleine Dean (D-Pa.) and Maria Salazar (R-Fla.).

Unlike Take It Down, NO FAKES (“Nurture Originals, Foster Art, and Keep Entertainment Safe”) is aimed primarily at protecting creators and performers by establishing a new intellectual property right in an individual’s voice and likeness. Like Take It Down, however, NO FAKES would require platforms to remove content containing unauthorized or deepfake instances of a voice or likeness once notified by the rights holder.

While it has broad bipartisan support in both houses of Congress, passage of the NO FAKES Act is likely to be a heavier legislative and constitutional lift than Take It Down, due to its provision establishing a new federal intellectual property right not directly tied to promoting invention and creative expression.