Apple will scan iPhones for images of child sex abuse.

Apple has released information about a system that will detect child sexual abuse material (CSAM) on customers’ devices.

Before storing an image in iCloud Photos, the technology will look for matches of previously known CSAM.

According to Apple, if a match is found, a human reviewer will assess the flagged images and, if the match is confirmed, report the user to law enforcement.

However, privacy concerns have been raised that the technology could be expanded to scan phones for prohibited content or even political speech.

Experts are concerned that authoritarian governments will use the technology to spy on their citizens.

Apple stated that new versions of iOS and iPadOS, which will be released later this year, will include “new cryptography applications to help limit the spread of CSAM online, while designing for user privacy.”

The system compares images to a database of known child sexual abuse images compiled by the National Center for Missing and Exploited Children (NCMEC) and other child safety organisations.

These images are then converted into “hashes,” which are numerical codes that can be “matched” to an image on an Apple device.

According to Apple, the technology will also detect edited but similar versions of original images.

“Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes,” Apple said.
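
To make the general idea concrete, here is a minimal Swift sketch of what on-device hash matching looks like in principle. It is an illustration only, not Apple's actual system (which uses its NeuralHash algorithm and a cryptographic private set intersection protocol); the hash values and the perceptualHash function below are placeholders.

```swift
import Foundation

// Placeholder database of known CSAM hashes, as supplied by NCMEC and other
// child safety organisations (illustrative values only).
let knownHashes: Set<String> = ["a1b2c3", "d4e5f6"]

// Placeholder for a perceptual hash: a fingerprint intended to stay stable
// under small edits such as resizing or recompression. A real perceptual
// hash summarises visual features; this stand-in just hashes the raw bytes.
func perceptualHash(of imageData: Data) -> String {
    return String(imageData.hashValue, radix: 16)
}

// Before an image is uploaded to iCloud Photos, the device compares its
// hash against the known list and only flags the image on a match.
func shouldFlagBeforeUpload(_ imageData: Data) -> Bool {
    return knownHashes.contains(perceptualHash(of: imageData))
}
```

The key design point the sketch captures is that the comparison happens on the device itself, against a list of existing hashes, so unflagged photos are never inspected.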

The company claimed the system had an “extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account”.

Apple says that it will manually review each report to confirm there is a match. It can then take steps to disable the user’s account and report them to law enforcement.

The company says that the new technology offers “significant” privacy benefits over existing techniques – as Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account.

However, some privacy experts have voiced concerns.

“Regardless of what Apple’s long term plans are, they’ve sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content,” Matthew Green, a security researcher at Johns Hopkins University, said.

“Whether they turn out to be right or wrong on that point hardly matters. This will break the dam – governments will demand it from everyone.”

BBC
