Will Cathcart, the CEO of WhatsApp, has blasted Apple’s plan to deploy image-detection technology that would identify child abuse images in iOS photo libraries, arguing that the Apple software could scan all of a user’s private photos, a blatant breach of privacy.
Cathcart added that Apple has long needed to do more to combat child sexual abuse material (CSAM) and that WhatsApp will not allow such Apple tools to function on its platform, “but the approach they are taking introduces something very concerning into the world”.
“I read the information Apple put out yesterday and I’m concerned. I think this is the wrong approach and a setback for people’s privacy all over the world. People have asked if we’ll adopt this system for WhatsApp. The answer is no,” he posted in a Twitter thread late on Friday.
“Instead of focusing on making it easy for people to report content that’s shared with them, Apple has built software that can scan all the private photos on your phone — even photos you haven’t shared with anyone. That’s not privacy.”
Apple announced plans on Thursday to implement the new technology in iOS, macOS, watchOS, and iMessage, and highlighted key details of the project.
According to The Verge, new versions of iOS and iPadOS due this autumn in the United States will include “new applications of cryptography to help limit the spread of CSAM online while designing for user privacy”.
Cathcart, however, described it as an Apple-built surveillance system that could easily be used to scan private content for anything the company or a government decides to regulate.
“Countries where iPhones are sold will have different definitions of what is acceptable,” he added.
Other child safety organizations are expected to be included as hash sources as the service grows, according to Apple.
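At its core, the approach Apple describes compares hashes of a user’s photos against a database of known CSAM hashes supplied by child-safety organizations. A minimal sketch of that matching step is below; note that this is purely illustrative, not Apple’s actual system — Apple uses a perceptual hash (NeuralHash) that tolerates resizing and re-encoding, whereas this sketch substitutes a simple cryptographic hash, and the flagged-image bytes are made up.

```python
import hashlib

# Hypothetical database of known-bad image hashes supplied by
# child-safety organizations (the entry here is invented for the sketch).
KNOWN_HASHES = {
    hashlib.sha256(b"example-flagged-image").hexdigest(),
}

def matches_known_hash(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the known-hash set.

    A real system uses a perceptual hash so that near-duplicates of a
    flagged image still match; SHA-256 is used here only to keep the
    example simple, so it matches exact byte-for-byte copies only.
    """
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES
```

This is also the crux of Cathcart’s objection: the matching code itself is neutral, and what it flags depends entirely on which hashes are loaded into the database.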
He asked: “Will this system be used in China? What content will they consider illegal there and how will we ever know? How will they manage requests from governments all around the world to add other types of content to the list for scanning?”
In an internal memo obtained by 9to5Mac, Sebastian Marineau-Mes, Apple’s software vice president, acknowledged that some individuals are “worried about the implications” of the new child-protection features, but said the firm will “maintain Apple’s deep commitment to user privacy.”