LONDON (Aug. 6, 2021) - Apple's plans to find child sexual abuse material (CSAM) on US customers' devices 'fundamentally break the promise of end-to-end encryption', according to an expert at the professional body for IT.
Adam Leon Smith, Chair of BCS, the Chartered Institute for IT's Software Testing group, said tracking the images 'seemed a good idea' but could easily open the door to governments monitoring people's sharing of political memes or text messages.
Governments could monitor political content
Leon Smith said: 'On the surface this seems like a good idea, it maintains privacy whilst detecting exploitation. Unfortunately, it is impossible to build a system like this that only works for child abuse images.
'It is easy to envisage Apple being forced to use the same technology to detect political memes, or text messages. Governments won't need to buy spyware from NSO anymore.
'Fundamentally this breaks the promise of end-to-end encryption, which is exactly what many governments want (except for their own messages of course).'
'It will also not be very difficult to create false positives. Imagine if someone sends you a seemingly innocuous image on the internet that ends up being downloaded and reviewed by Apple and flagged as child abuse. That's not going to be a pleasant experience.
'As technology providers continue to degrade encryption for the masses, criminals and people with legitimately sensitive content will just stop using their services. It is trivial to encrypt your own data without relying on Apple, Google and other big technology providers.'
Before an image is stored in iCloud Photos, the technology will search for matches against known abuse material; any matches are then passed to a human reviewer, Apple said.
The iPhone maker said the new detection tools have been designed to protect user privacy and do not allow the tech giant to see or scan a user's photo album.
A range of experts, including security researchers at Johns Hopkins University, have expressed privacy concerns about the plans.