Economists Agree That Stronger Legal Liability for Online Platforms Would Reduce Disinformation

By: Will Macheel (ProMarket)
The US Supreme Court recently heard arguments on two cases concerning Section 230 of the 1996 Communications Decency Act. The act shields tech companies from being held accountable for user-generated content, while granting publishers the freedom to remove objectionable content. The cases examine whether tech companies should be held responsible for terrorist attacks facilitated through their platforms.
Concerns about disinformation and perceived political bias in content moderation have led both liberal and conservative politicians to push for limits on Section 230’s protections. However, that very immunity from liability has fueled the growth of social media giants like Facebook, Google, and Twitter.
More than half of the economists surveyed by the Clark Center believe that stronger legal liability for online platforms would help reduce disinformation…