Apple technology will scan iPhones for child sex abuse images

Apple technology will scan iPhones for child sex abuse images. Credit: Pixabay

APPLE technology being introduced in new models of the iPhone in the US will be able to scan devices for child sex abuse images

In a bid to counter the spread of child sexual abuse material (CSAM) in the US, Apple announced details today, Friday, August 6, of its new technology system that can locate such material on a user’s device.

The technology reportedly searches the device’s content for matches against already-known CSAM; if a match is found, a human reviewer examines the flagged material, assesses the content and, if necessary, reports the user to law enforcement authorities.

The technology is designed for this specific purpose, but some experts have already voiced concern that such software could easily be reconfigured to scan for hate speech or other prohibited content, or even be used as a tool by authoritarian governments to spy on individuals or on the population as a whole.
A database reportedly already exists, compiled by American child-safety organisations – such as the US National Center for Missing and Exploited Children (NCMEC) – containing images of known child sexual abuse. Apple’s technology can allegedly compare images by ‘hashing’ them: each image is converted into a numerical code, which allows the software to ‘match’ an image against those in the database, including edited versions of them.
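
Apple has not published its matching algorithm here, so the sketch below is only a generic illustration of the hashing idea described above, using a simple ‘average hash’ rather than Apple’s actual system; the function names and hash values are illustrative assumptions.

```python
# A minimal sketch of hash-based image matching, assuming a simple "average
# hash" (aHash). This is NOT Apple's actual algorithm; it only illustrates the
# idea of converting images into numerical codes and comparing those codes.
from PIL import Image  # requires the Pillow library

def average_hash(path: str, size: int = 8) -> int:
    """Shrink to size x size, greyscale, then threshold each pixel against the
    mean brightness to produce a 64-bit numerical code for the image."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    code = 0
    for p in pixels:
        code = (code << 1) | (1 if p >= mean else 0)
    return code

def hamming_distance(a: int, b: int) -> int:
    """Count the bits on which two codes differ."""
    return bin(a ^ b).count("1")

# Hypothetical database of codes for known images (illustrative values only).
known_hashes = {0x8F3C42A17D5E09B2}

def matches_database(path: str, max_distance: int = 5) -> bool:
    """Treat an image as a match if its code is within a few bits of a known
    code, so lightly edited copies (resized, recompressed) can still match."""
    code = average_hash(path)
    return any(hamming_distance(code, known) <= max_distance for known in known_hashes)
```

Comparing codes rather than raw pixels is what lets a system of this kind recognise an image even after it has been resized or recompressed, which is the behaviour the report describes.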
A spokesperson for Apple said that new versions of iOS and iPadOS, due to be released later this year, will have “new applications of cryptography, to help limit the spread of CSAM online, while designing for user privacy. Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes”. The company claims this new technology has “an extremely high level of accuracy, and ensures less than a one in one trillion chance per year of incorrectly flagging a given account”.
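
Based only on the quoted description, the flow might look something like the sketch below: a hash comparison runs on the device before the photo is stored in iCloud Photos, and only matched images are queued for human review. All function names here are hypothetical placeholders, not real Apple APIs.

```python
# Hedged sketch of the described flow: an on-device check runs before a photo
# is stored in iCloud Photos. Function names are hypothetical placeholders.

def matches_known_csam_hashes(path: str) -> bool:
    """Stand-in for the on-device hash comparison (see the sketch above)."""
    return False  # placeholder result for illustration

def store_photo(path: str) -> None:
    if matches_known_csam_hashes(path):
        # A human reviewer would then assess the flagged content.
        print(f"{path}: matched a known hash; queued for human review")
    else:
        print(f"{path}: no match; stored in iCloud Photos as normal")

store_photo("holiday.jpg")
```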
Some privacy experts have voiced concerns over this move by Apple, with Matthew Green, a security researcher at Johns Hopkins University, commenting: “Regardless of what Apple’s long-term plans are, they’ve sent a very clear signal. In their very influential opinion, it is safe to build systems that scan users’ phones for prohibited content. Whether they turn out to be right or wrong on that point hardly matters. This will break the dam — governments will demand it from everyone”, as reported by bbc.com.

Written by

Chris King

Originally from Wales, Chris spent years on the Costa del Sol before moving to the Algarve where he is a web reporter for The Euro Weekly News covering international and Spanish national news. Got a news story you want to share? Then get in touch at editorial@euroweeklynews.com
