How the House Antitrust Bills Preserve Platforms’ Editorial Discretion and Spur Consumer Choice

August 26, 2021

By: Lisa Macpherson & John Bergmayer (Public Knowledge)

In late June, the House Judiciary Committee voted to approve, on a bipartisan basis, a six-part package of legislation designed to restrict dominant digital platforms from leveraging their power to disadvantage competitors or unfairly promote their own lines of business. We hope the bills will eventually have counterparts or companion legislation in the Senate.

Our purpose here is to assess the impact of these bills on the platforms’ content moderation policies and practices. We want to be clear that this package of legislation is not primarily focused on content moderation; it is focused instead on promoting competition and addressing the power of the largest platforms by creating new antitrust and pro-competition laws that suit their unique dynamics. The bills are based on an extensive investigation into the platforms’ market dominance and anti-competitive practices, not specifically their content moderation practices. That said, we would want any evaluation or evolution of the bills to include their potential impact on the role of platforms in hosting and managing user content, and, by extension, on the quality and safety of information available to consumers.

First, in aggregate, the six House antitrust bills would reduce the dominant role each platform plays in hosting and amplifying user-created content. With more competition in the social media, search, e-commerce, app store, and other platform markets, consumers could vote with their feet (or, more accurately, their fingers) for the platforms whose content moderation policies they value. That inherently means the power of the dominant platforms in political and social discourse would be diminished, and the power of the individual enhanced. Importantly, we can't know that more competition will necessarily produce a "race to the top" in content moderation that protects user safety. In fact, it may give rise to new platforms that cater to those who seek out harmful but legal content — though the reach of such content could be diminished, since its distribution would be more fragmented, increasing the likelihood that only users with an interest in content of that nature would see it…