
More Than 20 States Now Have Privacy Laws. Is Your Company Keeping Up? 

March 17, 2026

The privacy rules that governed most U.S. companies just five years ago are now dangerously out of date, and regulators are starting to act on it.


For years, most American companies set their privacy rules around two big laws: the European Union’s General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA). Those were the standards, and most legal teams built their compliance programs around them. That era is over.

Since 2020, more than 20 U.S. states have passed their own comprehensive privacy laws, and more are expected to follow. The result is a complex web of rules that vary by state but all point in the same direction: companies must do more to protect the personal data they collect.

A new analysis by Barnes & Thornburg, a national law firm, lays out what has changed and what companies should do about it. The firm warns that organizations whose privacy programs were built around the standards of 2018 to 2020 are almost certainly operating with significant gaps. The rules have changed. Most compliance programs have not.

Barnes & Thornburg notes that regulatory agencies are bringing more cases, under more statutes, with sharper focus on specific data practices that have attracted public concern. Artificial intelligence is a major flashpoint. Regulators are scrutinizing how companies use biometric data, whether AI systems are processing personal information in ways users never agreed to, and how consumer data is being used to train algorithms.


“The cost of noncompliance — fines, litigation exposure, operational disruption, and reputational harm — is becoming more concrete, and the regulatory appetite for enforcement continues to grow,” the firm writes.

One of the biggest practical changes involves how states define “sensitive” personal data. Historically, that category meant things like Social Security numbers, health records, financial account information and biometric identifiers. Those still qualify. But recent state laws have added new categories that many companies are not prepared for.


Information about union membership, certain online activity, and aspects of a person’s personal life now qualify as sensitive data in several states. That matters because sensitive data typically requires opt-in consent from the user, a much higher bar than the opt-out mechanisms most companies already have in place. Human resources departments are particularly exposed: old job application forms and employee databases often contain newly sensitive data that was collected without the consent those states now require.
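The gap described above can be made concrete with a short sketch. The category names and the split between legacy and expanded definitions below are illustrative assumptions, not legal guidance or any state's actual statutory list:

```python
# Hypothetical sketch: flagging records that need opt-in consent under
# expanded state definitions of "sensitive" data. Category names are
# illustrative assumptions, not any statute's actual terms.

# Categories long treated as sensitive...
LEGACY_SENSITIVE = {"ssn", "health_record", "financial_account", "biometric"}

# ...plus the newer additions some state laws now include.
EXPANDED_SENSITIVE = LEGACY_SENSITIVE | {
    "union_membership", "online_activity", "personal_life",
}

def needs_opt_in(field_category: str, has_opt_in_consent: bool) -> bool:
    """Return True if a stored record falls in a sensitive category
    but was collected without the opt-in consent now required."""
    return field_category in EXPANDED_SENSITIVE and not has_opt_in_consent

# An old HR record, collected before the expanded definitions took effect:
print(needs_opt_in("union_membership", has_opt_in_consent=False))  # True
```

A scan like this over legacy HR databases is one way to surface the records the article describes: data that was lawfully collected under opt-out rules but now sits in an opt-in category.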

The protection of teenage users is another rapidly shifting area. For years, special rules applied to children under 13, and everyone 18 and older was treated as an adult. The middle ground — teenagers — was largely unregulated.

That is changing fast. A growing number of states are extending heightened privacy protections to anyone under 18. Apple and Google have added a layer of urgency by implementing new age-verification tools in their app stores. Under these systems, developers receive information about user ages directly from the platform and are expected to apply different consent rules and data restrictions based on that age information. Non-compliance risks losing access to the app store entirely, which is a far more immediate consequence than most regulatory timelines.
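The logic developers are expected to apply can be sketched generically. The bracket names and policy fields here are assumptions for illustration only; they do not represent Apple's or Google's actual APIs, which supply age signals in their own formats:

```python
# Illustrative sketch, not a real platform API: mapping a platform-supplied
# age bracket to an app's consent posture. All names are assumptions.
from dataclasses import dataclass

@dataclass
class ConsentPolicy:
    personalized_ads: bool          # may the app serve targeted ads?
    requires_parental_consent: bool # must a parent approve data collection?

def policy_for_age_bracket(bracket: str) -> ConsentPolicy:
    if bracket == "under_13":
        # Long-standing children's-privacy rules: strictest treatment.
        return ConsentPolicy(personalized_ads=False, requires_parental_consent=True)
    if bracket == "13_to_17":
        # The formerly unregulated middle ground, now covered in more states.
        return ConsentPolicy(personalized_ads=False, requires_parental_consent=False)
    # 18 and older: adult defaults.
    return ConsentPolicy(personalized_ads=True, requires_parental_consent=False)
```

The design point is that the teen bracket gets its own branch rather than falling through to the adult default, which is exactly the assumption older apps tend to bake in.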

Barnes & Thornburg argues that the old approach of responding to each new state law one at a time is no longer workable. A better strategy, the firm says, is to build a program grounded in the basic principles that almost every state law shares: transparency about what data are collected, proper consent mechanisms, clear contracts with vendors who handle personal data, and an up-to-date inventory of what data the company actually holds.
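A minimal sketch of what such a living inventory might record, assuming simple illustrative field names (the record shape and the one-year review threshold are assumptions, not the firm's recommendation):

```python
# Minimal sketch of a data inventory, with hypothetical field names.
from dataclasses import dataclass
from datetime import date
from typing import List, Optional

@dataclass
class DataAsset:
    name: str                 # e.g. a database or form
    categories: List[str]     # kinds of personal data it holds
    vendor: Optional[str]     # third party that processes it, if any
    consent_basis: str        # "opt_in", "opt_out", or "none"
    last_reviewed: date       # when the entry was last audited

def stale_assets(inventory: List[DataAsset], today: date,
                 max_age_days: int = 365) -> List[DataAsset]:
    """Flag inventory entries not reviewed within the allowed window."""
    return [a for a in inventory if (today - a.last_reviewed).days > max_age_days]

inventory = [
    DataAsset("hr_applications", ["union_membership"], None, "none", date(2020, 6, 1)),
    DataAsset("newsletter_list", ["email"], "MailVendorCo", "opt_out", date(2025, 11, 3)),
]
print([a.name for a in stale_assets(inventory, date(2026, 3, 17))])  # ['hr_applications']
```

Keeping `last_reviewed` on every entry makes the staleness the next paragraph warns about mechanically detectable instead of invisible.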

That last point matters more than it might seem. Many companies completed privacy audits several years ago and have not revisited them since. In a landscape that has changed this dramatically, a 2020 data inventory is simply a snapshot of a company that no longer exists.