

Apple’s new tool for CSAM detection, NeuralHash, works by converting photos on a user’s iPhone or Mac into a unique string of numbers and letters, known as a hash, hence the name NeuralHash. Simply put, the tool runs on the user’s device and can identify whether a user is uploading known child abuse imagery to iCloud, without decrypting the images until a match threshold is met and a sequence of checks to verify the content has been cleared. The news of the technology was met with mixed feelings, especially from privacy advocates and security experts, even as many users, accustomed to Apple’s standards of privacy and security compared with other companies, are enthusiastic about the rollout.

“These tools will allow Apple to scan your iPhone photos for photos that match a specific perceptual hash, and report them to Apple servers if too many appear.” – Matthew Green, August 5, 2021
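To make the flow concrete, here is a minimal sketch of threshold-based hash matching. It is not Apple’s implementation: the SHA-256 call stands in for Apple’s neural-network-based perceptual hash, the in-memory hash set stands in for the blinded database Apple describes, and all names and values are placeholders chosen only to keep the example self-contained and runnable.

```swift
import Foundation
import CryptoKit

/// Placeholder "perceptual hash". Apple's NeuralHash uses a neural network so
/// that visually similar images map to the same value; SHA-256 is used here
/// purely so the sketch runs without any image-processing dependencies.
func imageHash(_ imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// Hypothetical database of known-bad hashes. In Apple's design this is
// shipped to devices in blinded (encrypted) form; a plain set is used here
// only for illustration.
let knownHashes: Set<String> = [
    imageHash(Data("known-bad-image-1".utf8)),
    imageHash(Data("known-bad-image-2".utf8)),
]

// Stand-in for the photos queued for upload to iCloud Photos.
let queuedPhotos: [Data] = [
    Data("holiday-photo".utf8),
    Data("known-bad-image-1".utf8),
]

// Apple has said the real threshold is much higher; 1 is used here only so
// this toy example produces output.
let matchThreshold = 1

// Count how many queued photos match a known hash.
let matchCount = queuedPhotos.filter { knownHashes.contains(imageHash($0)) }.count

if matchCount >= matchThreshold {
    print("Threshold reached (\(matchCount) matches) – flag for manual review")
} else {
    print("Below threshold – nothing is decrypted or reported")
}
```

In Apple’s published design the comparison runs against the blinded database and each result is wrapped in a cryptographic safety voucher, so neither the device nor Apple learns whether any individual photo matched until the threshold is crossed; the sketch above shows only the threshold logic in plain form.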
The news of Apple’s new technology first came to light through a series of tweets by Matthew Green, a professor of cryptography at Johns Hopkins University in Baltimore, Maryland. Other protective tools include a feature that will intervene whenever a user attempts to search for CSAM-related content or terms through Search and Siri.

These features also include filters that block sexually explicit photos sent or received through the iMessage app on accounts registered to children.
Apple has confirmed that it will roll out the new technology, NeuralHash, to scan iCloud Photos for images of child abuse. In an article by TechCrunch, Apple stated that CSAM detection is one of several new tools and features being rolled out with the aim of protecting children who use its services from harm. According to the tech giant, the technology will allow the company to spot and report known child sexual abuse material (CSAM) to law enforcement in a way it claims will protect user privacy while scanning for abusive material.
