11th Aug 2021

Tech News: Apple To Scan Phones For Inappropriate Content

Apple has announced that iPhone photos will be scanned for known Child Sexual Abuse Material (CSAM) before they are uploaded to iCloud Photos, in a move intended to protect children and to help stop the spread of CSAM online.

How?

Apple’s new versions of iOS and iPadOS, due to be released later this year, will include a new system designed to detect known CSAM using a cryptographic technology called private set intersection. The system performs on-device matching against a database of known CSAM image hashes provided by the National Center for Missing & Exploited Children (NCMEC) and other child safety organisations. That database is stored on the device as an unreadable set of hashes, and when the system finds a match it encodes the result in a cryptographic ‘safety voucher’ that accompanies the photo. Apple says that the system’s threshold is set to provide “an extremely high level of accuracy”, which should ensure that there is less than a one in one trillion chance per year of incorrectly flagging a given account.

In practice, this means that an automatic on-device matching process against known CSAM hashes is performed on every photo before it enters iCloud Photos storage.
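As a rough illustration of the idea only (this is not Apple’s actual implementation, which uses its NeuralHash perceptual hash and private set intersection cryptography so that the device never sees the result in the clear), the flow can be pictured as a simple check against a set of known hashes, with the result attached to the photo as a ‘voucher’. The image_hash() stand-in below is hypothetical.

```python
# Simplified sketch of on-device matching against known hashes.
# NOTE: not Apple's implementation. A real system would use a perceptual
# hash (Apple's is called NeuralHash) and would encrypt the result inside
# a "safety voucher"; here an ordinary SHA-256 digest and a plain set are
# used purely to show the flow.
import hashlib
from pathlib import Path

def image_hash(photo_path: Path) -> str:
    # Hypothetical stand-in for a perceptual image hash.
    return hashlib.sha256(photo_path.read_bytes()).hexdigest()

def match_before_upload(photo_path: Path, known_hashes: set[str]) -> dict:
    # The check runs locally, before the photo is uploaded, and the result
    # travels with the photo rather than being acted on immediately.
    is_match = image_hash(photo_path) in known_hashes
    return {"photo": str(photo_path), "voucher_match": is_match}
```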

Manually Reviewed

Apple says that the contents of the safety vouchers can only be decrypted and manually reviewed by Apple once an account exceeds a threshold number of known CSAM matches (i.e. only when the automated system is sure of the matches).
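The threshold idea can be pictured with the short sketch below. Again, this is only an illustration: in Apple’s design the vouchers are cryptographically sealed until the threshold is crossed, and the exact threshold value was not given in the announcement, so the number used here is purely illustrative.

```python
# Simplified gate for manual review: nothing becomes reviewable until an
# account's number of matching vouchers crosses a threshold.
REVIEW_THRESHOLD = 30  # illustrative value only; not a figure from Apple's announcement

def ready_for_manual_review(vouchers: list[dict]) -> bool:
    # Count vouchers that recorded a match (see the earlier sketch).
    matches = sum(1 for v in vouchers if v["voucher_match"])
    return matches >= REVIEW_THRESHOLD
```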

If There’s A Match

If Apple’s manual review confirms that there is a match (i.e. the photos are known CSAM), Apple says that it will disable the user’s account and send a report to NCMEC.

What If There’s A Mistake?

Apple says that if a user feels that their account has been mistakenly flagged, they can file an appeal to have their account reinstated.

Criticism

The announcement of the new system has been criticised on the grounds that a system able to scan users’ private photos for prohibited material has broader privacy implications and could even pave the way for government or other surveillance.

Apple Says…

Apple says that the system has “significant privacy benefits over existing techniques since Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account. Even in these cases, Apple only learns about images that match known CSAM.”

What Does This Mean For Your Business?

There is no doubt that any innovation that can genuinely help in the fight against child sexual abuse has to be a good thing, and it is a bold move from Apple to announce the introduction of this system. Apple has gone to great lengths to publicise the fact that the system is very accurate and appears to go as far as it can to protect privacy. Despite Apple’s good intentions, however, there are fears that this kind of system could be misused in future to give agencies, authorities, and governments a ‘back door’ into surveillance of the wider population, in the same way that governments have long wanted back doors into end-to-end encrypted apps like WhatsApp.

WhatsApp itself, for example, has just introduced a ‘View Once’ disappearing-photos feature that has drawn criticism for potentially making it easier to share CSAM on the app. A further benefit for Apple is that the new system helps ensure that its own storage services don’t hold illegal material, helping Apple keep its own house in order legally, professionally, ethically, and morally.
