
Apple removes the contentious CSAM detection tool from its website but maintains that its goals have not altered.


Last Updated on 20/12/2021 by Ulka

Apple has removed all references to the contentious child sexual abuse material (CSAM) detection technology, first announced in August, from a webpage about its child safety features. The change, which MacRumors spotted, appears to have occurred between December 10th and December 13th. Despite the website change, the company says its plans for the feature have not changed.

On the page headed “Expanded Protections for Children,” two of the three safety features released earlier this week with iOS 15.2 are still listed. References to the more contentious CSAM detection, whose launch was delayed after an outcry from privacy groups, have been removed.

When contacted for comment, Apple spokesperson Shane Bauer said that the company’s position hasn’t changed since September, when it first announced that the CSAM detection feature would be delayed. “We have decided to take additional time over the coming months to collect input and make changes before deploying these critically important child safety features,” the company said in a September statement, citing feedback from customers, advocacy groups, researchers, and others.

Importantly, Apple’s statement does not claim that the feature has been completely discontinued. Apple’s website still has documents detailing how the feature works.

When it was first introduced, Apple’s CSAM detection tool sparked controversy because it involves comparing hashes of iCloud Photos against a database of hashes of known child sexual abuse material. Apple argues that this approach lets it report users who upload known child abuse imagery to the authorities without compromising the privacy of its customers in general. It also says that user data encryption is unaffected and that the analysis can be performed on-device.
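To make the hash-matching idea concrete, here is a minimal, purely illustrative Swift sketch. It is not Apple’s implementation: the real system uses a perceptual “NeuralHash,” threshold secret sharing, and cryptographic safety vouchers, none of which are shown here. The names `knownHashes`, `contentHash`, and `shouldFlag` are invented for this example, and SHA-256 is used only to keep the snippet runnable.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for the on-device database of known-CSAM hashes.
// Values here are placeholders, not real data.
let knownHashes: Set<String> = [
    "examplehash1",
    "examplehash2"
]

/// Computes a simple content hash for a photo about to be uploaded.
/// Apple's actual design uses a perceptual hash that tolerates resizing
/// and re-encoding; SHA-256 is used here only for illustration.
func contentHash(of photoData: Data) -> String {
    let digest = SHA256.hash(data: photoData)
    return digest.map { String(format: "%02x", $0) }.joined()
}

/// Returns true if the photo's hash matches an entry in the local database.
/// In the real system, matches are only revealed to Apple after a threshold
/// number of matches is crossed; that machinery is omitted from this sketch.
func shouldFlag(_ photoData: Data) -> Bool {
    knownHashes.contains(contentHash(of: photoData))
}

// Usage: check a photo on-device before it is uploaded to iCloud Photos.
let photo = Data("sample image bytes".utf8)
print(shouldFlag(photo)) // prints "false" for this placeholder data
```

The point of the sketch is simply that the comparison happens against a fixed list of known hashes on the device itself, which is the basis for Apple’s claim that general user privacy is preserved.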


However, some worry that Apple’s technique could undermine the company’s end-to-end encryption. Critics referred to the technology as a “backdoor” that governments around the world could pressure Apple into expanding beyond CSAM. Apple, for its part, has stated that it will “not comply with any government’s request” to expand the system beyond its current scope.

While Apple has yet to announce a new rollout date for the CSAM detection feature, it has released two of the other child-protection features it announced in August. One is designed to warn children when they receive photos containing nudity in Messages, while the other provides additional information when users search for terms related to child exploitation through Siri, Spotlight, or Safari Search. Both arrived with iOS 15.2, which was released earlier this week and prompted the update to Apple’s website.

Ulka
Ulka is a tech enthusiast and business politics columnist at TheDigitalhacker. She writes about geopolitics, business politics, and country economics in general.
