A bill aimed at combating the rise of deepfake pornography has been incorporated into the year-end government funding package revealed on Tuesday, potentially clearing the way for its swift enactment. Known as the Take It Down Act, the legislation seeks to criminalize the distribution of nonconsensual intimate imagery, including content created using artificial intelligence (AI). The bill also mandates that online platforms remove such content within 48 hours of notification.
The Senate passed the bill earlier this month, but it had yet to be taken up in the House. Its inclusion in the continuing resolution, which must pass by Friday to prevent a government shutdown, has significantly boosted the measure’s prospects. Supporters describe the bill as an essential step toward curbing the harmful misuse of deepfake technology.
A Shield for Victims of Exploitation
The legislation has been hailed as a crucial tool to support victims like Molly Kelley of Otsego, Minnesota, whose life was upended when a friend posted an explicit AI-generated video of her online. “My initial shock turned to horror when I learned that the same person had targeted about 85 other women, some of whom I know personally,” Kelley shared in her testimony. Her experience underscores the widespread and deeply personal harm inflicted by such technology.
Advocates, including Senator Amy Klobuchar, stress the urgency of addressing the issue. “It is estimated that one in 12 American adults have had some type of image distributed without their consent,” Klobuchar stated. Nearly half of those victims reported being stalked or harassed online after the images surfaced.
Tech Industry Support
Major players in the tech industry, including Google, Meta, TikTok, and Bumble, have expressed support for the legislation. The bill establishes clear accountability by making it illegal to knowingly publish nonconsensual intimate imagery, including AI-generated deepfakes, on social media platforms. It also strengthens the legal mechanisms for prosecuting offenders, making these violations easier to address in court.
The widespread backing highlights a growing consensus on the need for decisive action against the misuse of deepfake technology. As social media and AI continue to evolve, lawmakers and industry leaders alike recognize the need for measures that protect individuals from exploitation.
Countdown to Passage
With just days remaining before the government funding deadline, the Take It Down Act stands on the brink of becoming law. Its inclusion in the year-end funding deal underscores its status as a legislative priority.