It’s tempting to set aside the privacy rights of people who may be involved in child pornography. Most people have no sympathy for that kind of criminal offense, and many feel that policing child pornography should be aggressive.
The problem with doing so is that they’re our rights, too. When we denigrate the rights of the unpopular, we risk diminishing all of our rights. And there are often fundamentally important reasons for those rights to exist.
That’s why it is so troubling that Apple Inc. plans to use an artificial intelligence tool to scan all U.S. iPhones for images of child sexual abuse. If the AI tool flags an image as child sexual abuse, it will forward the file to a human reviewer. If the reviewer confirms that conclusion, the account will be disabled and the National Center for Missing and Exploited Children (NCMEC) will be notified.
The company also announced that it will use AI to scan users’ encrypted messages for sexually explicit content, although the detection system is meant to target only child pornography.
Finally, Apple’s messaging app will begin using on-device machine learning to detect and blur sexually explicit photos on children’s phones and then warn their parents. The software will also stand ready to “intervene” when users search for topics related to child pornography.
Innocent people will be surveilled and accused
There are a number of good reasons to oppose these moves. First, it is almost certain that innocent people will be caught up in this scanning and wrongfully accused of possessing child pornography.
For one thing, the system is unlikely to outperform the human review systems already used to identify child pornography. As a result, it will probably wrongly accuse people of possessing child pornography when what they actually have are innocent images of children bathing or otherwise unclothed. Even police and prosecutors sometimes mistake such images for child pornography, even though the children depicted are not being exploited and the images are not sexual.
The NCMEC database already contains so many perfectly legal images that were mistakenly added to it that the organization warns police that registered images are only “suspected” child pornography and must be examined independently to determine whether they are actually illegal. Under Apple’s plan, many more legal images could be wrongly added to the database, leading to improper search warrants and the arrest of innocent people.
Another worry is that law enforcement or some governments might begin to require Apple to scan for other things they object to. That kind of mission creep is quite foreseeable. Once Apple has the ability to break end-to-end encryption for one reason, law enforcement and governments will undoubtedly ask or require the company to break it for other reasons, including the identification and surveillance of dissidents.
Ultimately, this opens your phone to government spying in addition to Apple’s own surveillance. The recent history of private and government hacking shows that once Apple creates a “backdoor” through encryption, that backdoor can and will be used by others.
These new tools threaten all of our rights.