Apple’s slippery slope in scanning for child abuse photos

Yesterday Apple announced a photo-scanning tool to detect child abuse material in the United States: three measures whose operation we have explained in depth, and whose details have not escaped controversy. The heart of the debate is the possibility that these mechanisms could be repurposed for any other type of content.

The most obvious misuse would be to curb political dissent and threaten individual freedoms in authoritarian countries. Regimes such as Saudi Arabia, China and Venezuela are the ones that could benefit from these new tools, but what are the chances of that becoming a reality?

A first step toward scanning any content on the iPhone?

All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan not just children’s accounts, but anyone’s.

In this paragraph, the Electronic Frontier Foundation criticizes Apple’s plan as, in its words, building a backdoor into our private lives. It condenses many of the issues critics raise with this measure.

The argument goes: if Apple can scan the photos on a device before they are encrypted, review the resulting alert and notify an agency, what stops it from doing the same with messages, links, websites visited and other private content? All the more so when the database of hashes to be checked is supplied by a third party, in this case the Child Sexual Abuse Material (CSAM) database maintained by the National Center for Missing & Exploited Children (NCMEC). In other countries, there may well be an organization that, alongside child pornography, slips in photos of dissident slogans, protests, posters and other subversive content. It does not seem far-fetched.
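To make the mechanism concrete, here is a minimal sketch in Python of generic hash-list matching. It is not Apple’s actual system, which uses a perceptual NeuralHash and a private set intersection protocol rather than exact file digests, and the file names (hash_list.txt, the photo directory) are hypothetical. It illustrates exactly the point critics make: the scanner only checks membership in a list it is handed, and has no way of knowing what that list targets.

```python
import hashlib
from pathlib import Path

# Hypothetical hash list supplied by a third party (in the real system,
# NCMEC's CSAM database). The scanner sees only opaque digests; nothing
# below can tell a CSAM hash from the hash of a protest photo.
KNOWN_HASHES = set(Path("hash_list.txt").read_text().split())

def digest(photo: Path) -> str:
    """Reduce a photo to a fixed-size, opaque SHA-256 digest."""
    return hashlib.sha256(photo.read_bytes()).hexdigest()

def scan_library(photo_dir: Path) -> list[Path]:
    """Return the photos whose digest appears in the supplied list."""
    return [p for p in photo_dir.glob("*.jpg") if digest(p) in KNOWN_HASHES]

# Whoever controls hash_list.txt decides what gets flagged; extending the
# scan to "additional types of content" is just a longer list.
```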

Whoever controls this list can search your phone for whatever content they want, and you really have no way of knowing what’s on that list because it’s invisible to you (and just a bunch of opaque numbers, even if you hack your phone to get the list).

– Matthew Green (@matthew_d_green) August 5, 2021
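Green’s “opaque numbers” remark follows from the one-way property of cryptographic hashes, which a toy Python snippet can illustrate (the byte strings are placeholders):

```python
import hashlib

a = hashlib.sha256(b"protest-photo-bytes").hexdigest()
b = hashlib.sha256(b"protest-photo-byteZ").hexdigest()
print(a)  # 64 hex characters, whatever the input size
print(b)  # a completely different digest for a near-identical input

# There is no inverse operation: the only way to learn what a digest
# refers to is to already hold the file and hash it yourself, so even a
# leaked hash list reveals nothing about the content it targets.
```

Apple’s NeuralHash differs in one respect: as a perceptual hash it is designed so that visually similar images produce matching values, unlike the exact digests above, but the list entries remain just as opaque to the user.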

Johns Hopkins University cryptography professor Matthew Green has been very critical of the measures Apple is about to take. And he points precisely to this weak point: even if we trust Apple, whoever maintains this database is the one who really holds the power. Others, like Ross Anderson, professor of security engineering at the University of Cambridge, point out that the measures simply turn mass surveillance of citizens into distributed surveillance that runs on devices rather than in the cloud.

These systems have been in operation for over a decade

On the other side of the coin, the truth is that scanning photos for child abuse material is nothing new. Yes, this time it happens on the user’s own device instead of in the cloud, but tech companies have been examining photos for child abuse material by hashing for more than a decade.

We are faced with a sensitive issue, both because of the type of crime being prosecuted and because of Apple’s public stance in favor of privacy.

Google, for example, has been using hashing to identify child pornography since 2008. Microsoft developed the well-known PhotoDNA hashing technology in 2009 and uses it across all of its cloud services; it is also used by Twitter, Gmail, Facebook, Adobe, Reddit and Discord, as well as by NCMEC itself. Apple has been analyzing photos uploaded to iCloud with the same techniques since at least 2019.

AIUI, the database cannot be changed arbitrarily and has been around for over a decade without this sort of thing happening to photos in the cloud. I think this describes a theoretical risk that ignores how authoritarian governments actually don’t need excuses.

– Charles Arthur (@charlesarthur) August 5, 2021

As tech journalist Charles Arthur, who covered these technologies during his time at The Guardian, recounts, they have been in use for over a decade, and at no point have authoritarian governments abused them. As Arthur explains:

For this scenario to work, the government needs to know which photo is on the person’s phone and get a version of it into the database. It’s ridiculous. How would they get them to create the hash? They’d just arrest them on a pretext. That’s what dictators do.

– Charles Arthur (@charlesarthur) August 5, 2021

An authoritarian government does not need excuses to arrest a suspect; it simply arrests him, because that is how dictatorships operate. For a government to manufacture a pretext this way, it would have to know in advance which photo is on the user’s device, and then get that photo uploaded into the database.

Pursuing child pornography should be a priority for tech companies: reporting offenders to the authorities and discouraging the use of their services to distribute this material. Everything indicates that this new method is one more means of combating this scourge. Based on what we have seen so far, there are no recorded cases of these tools being abused, and abusing them does not seem practical for an authoritarian regime.
