Apple Employees Voice Concern Over New Child Safety Scans, Report Says


Some Apple employees have added to the chorus of criticism over the tech giant’s plan to scan U.S. customers’ phones and devices for child pornography, Reuters reported Thursday (Aug. 12).

The pushback is notable because Apple’s corporate culture is usually tight-lipped.

Employees have been airing objections on an internal Slack channel, posting more than 800 messages about the plan since it was announced last week.

The chief worry is that the feature could be exploited by governments looking for other, unrelated material to use for persecution or censorship, workers said in a thread that stretched over several days.

Other security changes at Apple have drawn protests before, but the volume and intensity of this response stand out, according to Reuters’ sources.

That said, core security personnel did not appear to be prominent among the complainants. Reuters wrote that many of them have said they think Apple’s new policy is a reasonable way to stave off illicit material, and some employees added that they hope the scanning is a step toward encrypting iCloud for customers who want it.

PYMNTS data shows that around 57 percent of survey respondents said they had considered leaving a platform over concerns about the security of their data, and 40 percent reported being warier of sharing personal information than they were a year earlier.

Read more: Big Tech Push Trust Boundaries With Consumer Security And Privacy Tweaks

In its announcement, Apple said it also planned to automatically scan people’s phones for illegal content.

“Apple’s method of detecting known CSAM (child sexual abuse material) is designed with user privacy in mind,” Apple’s announcement said. “Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC (National Center for Missing and Exploited Children) and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.”
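For readers curious what “on-device matching using a database of known image hashes” looks like in broad strokes, the Swift sketch below is a heavily simplified, hypothetical illustration. It checks a local file’s hash against a stored set of known hashes; Apple’s actual system uses a perceptual hash (NeuralHash) and cryptographically blinds the database rather than a plain SHA-256 lookup, and the type names and sample hash here are placeholders, not Apple’s code.

```swift
import Foundation
import CryptoKit

// Hypothetical, simplified on-device matcher. The hash set stands in for the
// "unreadable set of hashes" Apple describes storing on devices. A real system
// would use a perceptual hash and keep the database blinded, so this plain
// SHA-256 lookup is only a conceptual illustration.
struct OnDeviceMatcher {
    let knownHashes: Set<String>   // hex-encoded hashes of known images

    // Hash a local file and check it against the stored database.
    func matches(fileAt url: URL) -> Bool {
        guard let data = try? Data(contentsOf: url) else { return false }
        let digest = SHA256.hash(data: data)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return knownHashes.contains(hex)
    }
}

// Example usage with a placeholder database entry (SHA-256 of an empty file).
let matcher = OnDeviceMatcher(knownHashes: [
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
])
let flagged = matcher.matches(fileAt: URL(fileURLWithPath: "/tmp/photo.jpg"))
print(flagged ? "match found" : "no match")
```

The key design point Apple emphasizes is that the comparison happens on the device against pre-distributed hashes, rather than by uploading images for scanning in the cloud.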