Apple’s Siri artificial intelligence (AI) assistant sends audio to human “graders,” including recordings of sexual encounters, medical information, drug deals and other private moments captured without users’ knowledge.
The data, Apple said, “is used to help Siri and dictation … understand you better and recognize what you say,” the Guardian said.
A whistleblower, who asked to remain anonymous, raised concerns about the lack of disclosure that accidental activations can pick up sensitive information, the news outlet said.
“A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID,” Apple told the Guardian. “Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”
The company added that only a very small random subset, less than 1% of daily Siri activations, is used for grading, and that those recordings are typically only a few seconds long.
None of the major voice assistants are as private as you might think. A recent study found more than 60% of Americans had used a voice assistant, The Wall Street Journal reported. These always-listening assistants have become a fixture of phones and living rooms and are coming to hotels and cars.
Apple wants to stay ahead of its competitors and continues to try to improve Siri. The company hired AI executive John Giannandrea away from Google last year and promoted him to Apple’s executive team, where he now reports directly to Apple CEO Tim Cook.
In March, Apple announced it had acquired machine learning startup Laserlike to boost its AI efforts, including Siri. The acquisition closed late last year but was only recently discovered. Laserlike used machine learning to gather information from across the web and deliver user-specific results to an eponymous app. The results could be searched like a news feed and shared, and the app could even recommend other sites based on a user’s browsing habits.