Following outrage from privacy groups, iPhone maker Apple said on Friday that it is postponing a controversial technology that would scan users’ iPhones for photographs of child sexual abuse.
Apple introduced the system last month with the goal of improving child safety. The company claims it can detect content already known to the National Center for Missing and Exploited Children by checking photographs that users’ phones upload to iCloud servers.
Apple announced on Friday that it will take additional time to develop the technology in response to privacy groups’ concerns that the system could set a dangerous precedent.
“Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” the company said.
Apple’s technology would give parents tools to help their children navigate the internet safely, flag illegal sexual material to law enforcement, and allow Siri and Search to block related topics.
Apple said previously that the program does not compromise users’ privacy because the scan never views the images themselves; instead, it compares them as sets of numbers, analogous to digital fingerprints, generated through a process called hashing.
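The fingerprint comparison described above can be illustrated with a short sketch. Note that this is only an illustration of the general idea: it uses an ordinary cryptographic hash (SHA-256) and a hypothetical in-memory database, whereas Apple’s actual system uses a perceptual hash (NeuralHash) designed so that visually similar images produce matching fingerprints.

```python
import hashlib

# Hypothetical database of fingerprints for known illegal images.
# In the real system, such hashes come from the National Center for
# Missing and Exploited Children, not from raw bytes like this.
known_content_hashes = {
    hashlib.sha256(b"known-flagged-image-bytes").hexdigest(),
}

def fingerprint(image_bytes: bytes) -> str:
    """Map image bytes to a fixed-size hex digest (the 'fingerprint')."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_content(image_bytes: bytes) -> bool:
    # The scan compares fingerprints, never the images themselves.
    return fingerprint(image_bytes) in known_content_hashes

print(matches_known_content(b"known-flagged-image-bytes"))  # True: fingerprint matches
print(matches_known_content(b"an-ordinary-vacation-photo"))  # False: no match
```

The key privacy argument is that only these opaque digests are compared: a matching fingerprint reveals that a photo corresponds to a known entry in the database, while a non-matching photo reveals nothing about its content.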