Lawmakers Seek New Regs Aimed At Tech Algo Biases
U.S. lawmakers have proposed a bill that would require tech companies to detect and remove discriminatory biases embedded in their technologies.

The Algorithmic Accountability Act of 2019 would grant new enforcement power to the U.S. Federal Trade Commission (FTC) and require tech companies with annual revenue above $50 million to assess whether race, gender or other biases are embedded in their computer models. The rules would also apply to data brokers and to businesses holding data on more than one million consumers.

“Computers are increasingly involved in the most important decisions affecting Americans’ lives — whether or not someone can buy a home, get a job or even go to jail,” Democratic Senator Ron Wyden said in a press release, according to Reuters. “But instead of eliminating bias, too often these algorithms depend on biased assumptions or data that can actually reinforce discrimination against women and people of color.”

As part of the announcement, lawmakers cited a Reuters report revealing that Amazon had abandoned an automated recruiting engine found to discriminate against women. In addition, the U.S. government has accused Facebook of allowing advertisers to engage in racial discrimination, in an alleged violation of the Fair Housing Act.

Senator Cory Booker and Representative Yvette Clarke, both Democrats, introduced the bill alongside Wyden; it could face a difficult path through the Republican-controlled Senate.

“To hold algorithms to a higher standard than human decisions implies that automated decisions are inherently less trustworthy or more dangerous than human ones, which is not the case,” said Daniel Castro, vice president of the Information Technology & Innovation Foundation (ITIF).

“This would only serve to stigmatize and discourage AI use, which could reduce its beneficial social and economic impacts,” Castro added.

The Internet Association, which counts Amazon, Facebook, Google and other major tech companies as members, did not immediately comment on the proposed legislation.