Apple Under Fire, Tweaks Plan For Flagging Child Porn

Apple, which has drawn an onslaught of criticism from privacy experts over its soon-to-launch system for detecting images of child sex abuse on devices running its operating systems, now says it will flag only images that officials in multiple countries have identified as problematic, Reuters reported.

The move, which Apple unveiled in a briefing with reporters, according to Reuters, may be an effort to blunt criticism that a government such as China's could abuse the technology.

In explaining the technology earlier this month, Apple wrote that it was designed to protect user privacy.

At the core of the criticism of Apple has been the contention from some security experts that, by building content-screening technology into the operating systems of its devices, Apple would open the door to government demands for information about material on those devices unrelated to child abuse or other crimes, Reuters reported.

Apple also said in the briefing, according to Reuters, that no activity will be brought to the attention of human reviewers until at least 30 apparent matches have been detected on a single device.
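For readers curious how those two safeguards interact, the Swift sketch below is a rough, hypothetical illustration only: it requires a hash to appear in the databases of at least two jurisdictions before it counts, and it escalates to human review only after 30 such matches. All names here are invented for illustration; Apple's actual system, per its published technical summary, uses the NeuralHash perceptual-hashing scheme with cryptographic threshold secret sharing, not plain string sets.

```swift
import Foundation

// Hypothetical sketch of the two safeguards described above (not Apple's code):
// (1) only hashes present in multiple jurisdictions' databases are eligible;
// (2) nothing is referred to human review until 30 eligible matches accumulate.
struct MatchScreener {
    let reviewThreshold = 30
    let jurisdictionDatabases: [Set<String>]  // per-jurisdiction sets of known-image hashes

    // A hash counts only if at least two jurisdictions independently list it.
    func isEligible(_ hash: String) -> Bool {
        jurisdictionDatabases.filter { $0.contains(hash) }.count >= 2
    }

    // Tally eligible matches on a device; escalate only past the threshold.
    func shouldEscalate(deviceImageHashes: [String]) -> Bool {
        deviceImageHashes.filter(isEligible).count >= reviewThreshold
    }
}

// Example: "h1" is listed by both jurisdictions, but a single match
// is far below the 30-match threshold, so nothing is escalated.
let screener = MatchScreener(
    jurisdictionDatabases: [["h1", "h2"], ["h1"]].map(Set.init)
)
print(screener.shouldEscalate(deviceImageHashes: ["h1"]))  // false
```

The point of the multi-jurisdiction requirement, as Apple framed it, is that no single government could unilaterally insert a hash that would trigger a report.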

One of the company’s critics, Stanford University security expert Riana Pfefferkorn, tweeted in response to news of Apple’s apparent course correction: “The fact that they have added 2 explainers this week alone and continued giving presentations where they announce brand-new stuff they never mentioned even 7 days ago means that our pushing is having an effect.”

Apple also has drawn fire from within its own ranks for creating a possible privacy problem for users of iPhones, iPads and Macintosh computers.

Read more: Apple Employees Voice Concern Over New Child Safety Scans, Report Says