
Biden Administration Looks To Create Rules For AI/ChatGPT

April 11, 2023

The Biden administration is addressing the regulation of artificial intelligence in response to the proliferation of AI tools such as ChatGPT, which have drawn attention from regulators worldwide. The Commerce Department has solicited public input on how to establish rules that ensure AI systems perform as advertised and reduce potential harm.


The Commerce Department is considering the development of an auditing process to ensure the trustworthiness of AI-powered technology amid an AI arms race in Silicon Valley. The department has proposed new assessments and protocols to prevent negative consequences and confirm the accuracy of business statements, similar to financial audits.

“For these systems to reach their full potential, companies and consumers need to be able to trust them,” said Alan Davidson, the administrator of the Commerce Department’s National Telecommunications and Information Administration, in a news release.

Related: Biden Calls On Big Tech To Insure AI Safety

In recent weeks, the government’s interest in AI has accelerated as consumer advocates and technologists alike descend on Washington, aiming to influence the debate. As companies compete to bring new AI tools to market, policymakers are struggling to foster innovation in the tech sector while limiting public harm.

Many policymakers express a desire to move quickly on AI, having learned from the slow process of assembling proposals for social media.