One of America’s top financial watchdogs is warning companies not to make false artificial intelligence claims.
Securities and Exchange Commission (SEC) Chair Gary Gensler referred to this practice as “AI washing” at a conference Tuesday (Dec. 5), comparing it to “greenwashing,” the term for when businesses overstate their environmental records.
“Don’t do it,” said Gensler, whose comments at the conference hosted by news outlet The Messenger were reported by The Wall Street Journal (WSJ). “One shouldn’t greenwash, and one shouldn’t AI wash.”
The explosion in AI applications this year has sparked worries that companies’ marketing claims might overstate their products’ actual capabilities, the report said.
And while Gensler didn’t touch on what the SEC might be doing to crack down on “AI washing,” other regulators are monitoring the problem.
The Federal Trade Commission (FTC) said earlier this year that it had issued guidance about claims regarding the capabilities of AI-powered products.
“Developers of these tools can potentially be liable if technologies they are creating are effectively designed to deceive,” FTC Chair Lina Khan said at the time.
The rapid pace at which AI’s capabilities are evolving has led some observers to argue that America has a chance to lead the way by supporting innovation while remaining clear-eyed about the technology’s risks.
“Anybody thinking about regulation needs to start with an outside-in approach,” Sarkissian said. “If you’re looking at concentric circles, go with the farthest thing and create the final boundary.”
Meanwhile, this week saw reports that the European Union’s efforts to regulate AI face a roadblock, as negotiators struggle to reach a consensus on the treatment of foundation models, specifically generative AI systems like ChatGPT.
A trio of European countries — Germany, France and Italy — reached an agreement last month on how AI should be regulated, saying they favored binding voluntary commitments from AI providers in the EU.