Apple Will Scan Users’ Devices for Abuse Images

Apple announced it will be adding software to iPhones and other iOS devices that will automatically scan for images of child exploitation, but privacy watchdog groups warn the move opens a backdoor for government use.

Apple will automatically scan phones for child abuse images

Apple announced on Thursday that it will be rolling out software to all iPhones that will allow the company to automatically scan a user’s photos for potential child exploitation images before they are uploaded to iCloud Photos.

The system, which the Financial Times reported is called “neuralMatch,” will initially be rolled out only in the United States. However, Apple noted in a blog post that the software will “evolve and expand over time.”

Will apply to other Apple devices as well

Apple also said it will apply the technology to its other devices through the upcoming versions of its operating systems, iOS and iPadOS, both due to be released later this year. The company said these updates will include “new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy.”

How it works

According to Apple, the system compares pictures to a database of known illegal images compiled by the US National Center for Missing and Exploited Children (NCMEC) and other child safety organizations.

The technology searches for matches against already known material, the BBC reported. The company says this works by translating images into “hashes,” numerical codes that can be “matched” to an image on an Apple device.
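As a rough illustration of what hash-based matching means, the sketch below hashes an image file and checks it against a list of known hashes. This is a minimal example using a standard cryptographic hash and a placeholder entry, not Apple’s proprietary NeuralHash system.

```python
import hashlib

# Placeholder set of hashes of known images. In the real system, such
# lists are compiled by child-safety organizations like NCMEC; this
# entry is just the SHA-256 of an empty file, used for illustration.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def image_hash(path: str) -> str:
    """Hash the raw bytes of an image file with SHA-256."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def is_known_image(path: str) -> bool:
    """Flag the image if its hash appears in the known-hash list."""
    return image_hash(path) in KNOWN_HASHES
```

A byte-level hash like this only flags exact copies of a file; handling altered copies requires the fuzzier matching described next.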

In addition, Apple says the technology can also detect edited but visually similar versions of original images. If the software finds a match on a user’s device, a human reviewer will then assess it and, if the material is verified, report the user to law enforcement, Apple said.
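Tolerating edits calls for a perceptual hash, which maps visually similar images to nearby codes so that a small Hamming distance between codes signals a likely match. Below is a minimal sketch using the open-source imagehash library; the library choice, the placeholder hash, and the distance threshold are illustrative assumptions, not details of Apple’s system.

```python
from PIL import Image
import imagehash

# Placeholder perceptual hash of a "known" image (illustrative only).
KNOWN_PHASHES = [imagehash.hex_to_hash("d1c8c0e0f0e8c4c2")]

# Assumed tolerance; a real system would tune this very carefully to
# balance catching edited copies against false positives.
MAX_DISTANCE = 5

def matches_known(path: str) -> bool:
    """Return True if the image is perceptually close to a known hash."""
    candidate = imagehash.phash(Image.open(path))
    # Subtracting two ImageHash values gives the Hamming distance
    # between their 64-bit codes.
    return any(candidate - known <= MAX_DISTANCE for known in KNOWN_PHASHES)
```

Because perceptual hashes survive resizing, recompression, and small crops, a match can fire even when the file bytes differ, which is why Apple pairs automated matching with human review before any report is made.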

Watchdogs: Opens a backdoor for government use, intrusion into users’ private lives

The announcement had privacy watchdog groups sounding the alarm that the move goes further than scanning for these images. They argue the technology effectively opens a backdoor for government intrusion into iOS users’ private lives, the Daily Wire reported.

“Apple intends to install software on American iPhones to scan for child abuse imagery, according to people briefed on its plans, raising alarm among security researchers who warn that it could open the door to surveillance of millions of people’s personal devices,” the Financial Times reported. “The automated system would proactively alert a team of human reviewers if it believes illegal imagery is detected, who would then contact law enforcement if the material can be verified.”