Apple Will Soon Scan Your iPhone Pictures For Evidence Of Child Sexual Abuse
Apple will soon implement technology that scans photos uploaded to iCloud for evidence of child sexual abuse, according to a report from The Financial Times and cryptography professor Matthew Green.
The technology, which will roll out with iOS 15 in the U.S., will reportedly use a hashing algorithm known as neuralMatch to scan an iPhone user's pictures for images that match known pictures of child abuse. If enough of a user's photos produce "perceptual hashes" that match the hashes of known abuse images, the account would be flagged to Apple's servers.
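To make the matching step concrete, here is a minimal sketch of how threshold-based perceptual-hash matching works in general. Apple has not published neuralMatch's internals, so the hash width, the Hamming-distance cutoff, and the reporting threshold below are all illustrative assumptions, not Apple's actual parameters.

```python
# Minimal sketch of threshold-based perceptual-hash matching.
# All names and numeric parameters are illustrative assumptions;
# Apple has not disclosed how neuralMatch actually works.

from typing import Iterable, Set

HAMMING_CUTOFF = 4      # assumed: max differing bits for a "match"
REPORT_THRESHOLD = 10   # assumed: matches needed before an account is flagged

def hamming_distance(a: int, b: int) -> int:
    """Count the bits where two 64-bit perceptual hashes differ."""
    return bin(a ^ b).count("1")

def matches_known_image(photo_hash: int, known_hashes: Set[int]) -> bool:
    """A photo 'matches' if its hash is within HAMMING_CUTOFF bits
    of any hash in the database of known abuse imagery."""
    return any(hamming_distance(photo_hash, known) <= HAMMING_CUTOFF
               for known in known_hashes)

def should_flag_account(photo_hashes: Iterable[int],
                        known_hashes: Set[int]) -> bool:
    """Flag the account for manual review once the number of
    matching photos crosses REPORT_THRESHOLD."""
    matches = sum(1 for h in photo_hashes
                  if matches_known_image(h, known_hashes))
    return matches >= REPORT_THRESHOLD
```

Unlike a cryptographic hash, a perceptual hash changes only slightly when an image is resized, cropped, or recompressed, which is why near matches within a small Hamming distance count rather than exact equality.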
While this could be a boon for preventing child abuse, the ethical and technological risks were apparent to many as soon as the news broke.
As Matthew Green explains, a technology that can scan a phone for evidence of child abuse can just as easily be pointed at anything an authoritarian government deems objectionable.
Edward Snowden also shared an article critical of the new technology. In the Financial Times piece, security engineering professor Ross Anderson said, "It is an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of . . . our phones and laptops."
As for fears of innocuous images being misread and a user being wrongly flagged, Apple insists that flagged content goes through manual review: human reviewers at Apple will judge whether the images neuralMatch flags indeed constitute child abuse.