Apple probes iPhones for child sex images
Europost
The technology giant Apple announced it would launch a system to scan for child sexual abuse material (CSAM) on US customers' devices. Before an image is stored in iCloud Photos, the scanning programme will check it against already known CSAM. Apple confirmed that if a match is found, a human reviewer will then assess the image and report the user to law enforcement, the BBC reported.
There are, however, privacy concerns that the new technology could be expanded to scan phones for other prohibited content or even political speech. Experts worry that it could be used by authoritarian governments to spy on their citizens. Apple said that new versions of iOS and iPadOS, scheduled for release this year, will have "new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy". The system works by comparing images against a database of known child sexual abuse images compiled by the US National Center for Missing and Exploited Children (NCMEC) and other child safety organisations.
Those images are translated into "hashes", numerical codes that can be "matched" to an image on an Apple device. The company noted that the technology would also catch edited but similar versions of original images.
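To make the matching idea concrete, here is a minimal sketch of hash-list matching in Swift. It is purely illustrative: it uses a standard SHA-256 cryptographic hash from Apple's CryptoKit framework, which only catches byte-identical files, whereas Apple's system relies on a perceptual hash designed to also match edited versions of an image. The hash value and function names below are placeholders, not real NCMEC entries or Apple APIs.

import Foundation
import CryptoKit

// Illustrative only: a cryptographic hash stands in for the perceptual hash
// Apple actually uses, so this would match only byte-identical files.
func hashHex(of data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// Placeholder values; the real list would come from NCMEC and other organisations.
let knownHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

// On-device check performed before an image is uploaded to iCloud Photos.
func shouldFlag(imageData: Data) -> Bool {
    knownHashes.contains(hashHex(of: imageData))
}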
"Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes," Apple said. The firm claimed the system had an "extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account".
The company also said that the new technology offers "significant" privacy benefits over existing techniques - as Apple only learns about users' photos if they have a collection of known CSAM in their iCloud Photos account.