Those images—if they’re deemed illegal—will be reported to the National Center for Missing and Exploited Children. The software won’t be applied to videos, Apple added.
“Apple’s expanded protection for children is a game-changer,” John Clark, the president and CEO of the National Center for Missing and Exploited Children, said in a statement on Thursday about the initiative. “The reality is that privacy and child protection can coexist.”
But some security experts and researchers, who stressed they support efforts to combat child abuse, said the program could present significant privacy concerns.
Ross Anderson, professor of security engineering at the University of Cambridge, described Apple’s proposed system as “an absolutely appalling idea,” according to the Financial Times. “It is going to lead to distributed bulk surveillance of … our phones and laptops,” he remarked.
When news of the proposal broke on Wednesday evening, Johns Hopkins University professor and cryptographer Matthew Green echoed those concerns.
“This sort of tool can be a boon for finding child pornography in people’s phones,” Green wrote on Twitter. “But imagine what it could do in the hands of an authoritarian government?”
Green said that “if you believe Apple won’t allow these tools to be misused [crossed fingers emoji] there’s still a lot to be concerned about,” noting that such “systems rely on a database of ‘problematic media hashes’ that you, as a consumer, can’t review.”
Green told The Associated Press that he’s concerned more authoritarian governments could pressure Apple to scan for other types of information.
Microsoft created PhotoDNA to assist companies in identifying child sexual abuse images on the internet, while Facebook and Google have implemented systems to flag and review potentially illegal content.
The Epoch Times has contacted Apple for comment.