Apple postpones a technology that would check iPhones for child sex abuse content.


Following outrage from privacy groups, tech behemoth and iPhone maker Apple revealed on Friday that it is postponing a controversial technology that would scan users’ iPhones for photographs of child sexual abuse.

The company announced the system last month with the goal of improving child safety. Apple claims it can detect content already known to the National Center for Missing and Exploited Children by checking photographs on users’ phones as they are uploaded to iCloud servers.

Apple announced on Friday that it will take additional time to develop the technology in response to privacy groups’ concerns that the system could set a dangerous precedent.

“Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material,” Apple said in an update posted to its website.

“Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”


Apple’s technology would give parents tools to help their children navigate the internet safely, would report illegal sexual material to law enforcement, and would allow Siri and Search to block related topics.

Apple said previously that the program does not compromise users’ privacy because the scans see the images as sets of numbers, analogous to digital fingerprints, through a process called hashing.
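The fingerprint-matching idea described above can be sketched in a few lines. This is only an illustration of hash comparison, not Apple’s actual method: Apple’s system uses a perceptual hash (NeuralHash) that tolerates resizing and re-encoding, whereas the cryptographic SHA-256 digest below flags exact byte matches only, and the hash database here is invented for the example.

```python
import hashlib

# Hypothetical database of fingerprints for known images.
# (Invented for illustration; this is the SHA-256 of the bytes b"test".)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(image_bytes: bytes) -> str:
    """Reduce image bytes to a fixed-size hex digest -- its 'fingerprint'."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_content(image_bytes: bytes) -> bool:
    """Compare the fingerprint against the known-content database.

    The scanner never inspects the picture itself, only the digest --
    the 'sets of numbers' the article refers to.
    """
    return fingerprint(image_bytes) in KNOWN_HASHES

print(matches_known_content(b"test"))   # prints True (digest is in the set)
print(matches_known_content(b"other"))  # prints False
```

The key property is that matching happens on digests alone, so the comparing party never needs to view the photos, though critics note the same mechanism could match against any database of hashes.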


