More than a dozen prominent cybersecurity experts on Thursday criticized plans by Apple and the European Union to monitor people’s phones for illicit material, calling the efforts ineffective and dangerous strategies that would embolden government surveillance.
In a 46-page study, the researchers wrote that the proposal by Apple, aimed at detecting images of child sexual abuse on iPhones, as well as an idea put forward by members of the European Union to detect similar abuse and terrorist imagery on encrypted devices in Europe, used “dangerous technology.”
“It should be a national-security priority to resist attempts to spy on and influence law-abiding citizens,” the researchers wrote.
The technology, known as client-side scanning, would allow Apple — or, in Europe, potentially law enforcement officials — to detect images of child sexual abuse on someone’s phone by scanning images uploaded to Apple’s iCloud storage service.
When Apple announced the planned tool in August, it said a so-called fingerprint of the image would be compared against a database of known child sexual abuse material to search for potential matches.
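In broad strokes, such a “fingerprint” is a perceptual hash: a short bit string that stays stable under small changes to an image and is compared against a database of known hashes. The sketch below is purely illustrative and is not Apple’s actual NeuralHash system; the 8×8 average-hash scheme, the Hamming-distance threshold, and all function names are simplifying assumptions.

```python
# Illustrative sketch of perceptual-hash matching -- NOT Apple's actual
# NeuralHash. We compute a 64-bit "average hash" over an 8x8 grayscale
# thumbnail and compare it to a database of known hashes by Hamming
# distance. The threshold of 4 bits is an arbitrary choice for the demo.

def average_hash(pixels):
    """pixels: 8x8 grid of grayscale values (0-255) -> 64-bit int.
    Each bit records whether a pixel is at or above the image's mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches(image_hash, known_hashes, threshold=4):
    """Flag the image if its hash is within `threshold` bits of any
    hash in the (hypothetical) known-material database."""
    return any(hamming(image_hash, h) <= threshold for h in known_hashes)

# Toy demonstration: a tiny edit to one pixel leaves the hash close
# enough to the original that the image still matches the database.
img = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
db = [average_hash(img)]
tweaked = [row[:] for row in img]
tweaked[0][0] = (tweaked[0][0] + 3) % 256  # slight edit
print(matches(average_hash(tweaked), db))  # still flagged as a match
```

The robustness to small edits is the point of using a perceptual hash rather than a cryptographic one, and it is also where the dispute lies: the researchers argue both that edits can be crafted to slip past such matching and that the matching machinery itself, once deployed, could be pointed at any database a government supplies.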
But the plan sparked an uproar among privacy advocates and raised fears that the technology could erode digital privacy and eventually be used by authoritarian governments to track down political dissidents and other enemies.
Apple said it would reject any such requests by foreign governments, but the outcry led it to pause the release of the scanning tool in September. The company declined to comment on the report released on Thursday.
The cybersecurity researchers said they had begun their study before Apple’s announcement. Documents released by the European Union and a meeting with E.U. officials last year led them to believe that the bloc’s governing body wanted a similar program that would scan not only for images of child sexual abuse but also for signs of organized crime and indications of terrorist ties.
A proposal to allow the photo scanning in the European Union could come as soon as this year, the researchers believe.
They said they were publishing their findings now to inform the European Union of the dangers of its plan, and because the “expansion of the surveillance powers of the state really is passing a red line,” said Ross Anderson, a professor of security engineering at the University of Cambridge and a member of the group.
Aside from surveillance concerns, the researchers said, their findings indicated that the technology was not effective at identifying images of child sexual abuse. Within days of Apple’s announcement, they said, people had pointed out ways to avoid detection by editing the images slightly.
“It’s allowing scanning of a personal private device without any probable cause for anything illegitimate being done,” added another member of the group, Susan Landau, a professor of cybersecurity and policy at Tufts University. “It’s extraordinarily dangerous. It’s dangerous for business, national security, for public safety and for privacy.”