Apple’s New Surveillance System

Apple will introduce a new surveillance system to the iPhone with the launch of iOS 15 later this year.

According to CNBC, the new system will scan photos uploaded to iCloud for child sexual abuse material (CSAM). Apple will match images on the device against hashes of known CSAM images provided by the US National Center for Missing and Exploited Children (NCMEC) and other child safety organizations, and authorities will be notified when matching content is found.
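To make the idea of hash-list matching concrete, here is a minimal, simplified sketch in Swift: it checks an image's SHA-256 digest against a set of known digests. This is an illustration only, not Apple's implementation; as Apple's own description below notes, its system relies on hashes of known CSAM and a private set intersection protocol rather than a plain digest lookup, and the hash list here is an invented placeholder.

```swift
import Foundation
import CryptoKit

// Toy stand-in for a database of known-image digests (hex-encoded SHA-256).
// In the real system the list would come from NCMEC and other child safety
// groups, and the hash would be perceptual rather than a plain SHA-256.
let knownHashes: Set<String> = [
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"  // SHA-256 of "test"
]

/// Returns true if the image data's digest appears in the known-hash set.
func matchesKnownHash(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}

print(matchesKnownHash(Data("test".utf8)))        // true for this toy example
print(matchesKnownHash(Data("other data".utf8)))  // false
```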

“Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result.”

“The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.”
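The "safety voucher" Apple describes bundles an encrypted match result with each upload. As a rough sketch of that idea only, and not Apple's actual voucher format (its published design also layers in threshold secret sharing so vouchers can only be opened after a threshold number of matches), the snippet below seals a small match-result payload with AES-GCM so the stored voucher is unreadable on its own:

```swift
import Foundation
import CryptoKit

// Hypothetical, heavily simplified "safety voucher": the payload describing
// the match result is sealed with AES-GCM so the server can store it without
// reading it. Field names and the payload format are invented for this sketch.
struct SafetyVoucher {
    let ciphertext: Data  // nonce + encrypted payload + authentication tag
}

func makeVoucher(matched: Bool, imageID: UUID, key: SymmetricKey) throws -> SafetyVoucher {
    let payload = "\(imageID.uuidString),matched=\(matched)"
    let sealedBox = try AES.GCM.seal(Data(payload.utf8), using: key)
    return SafetyVoucher(ciphertext: sealedBox.combined!)  // non-nil for the default nonce size
}

// Usage: generate a device-side key and attach the voucher to an upload.
let key = SymmetricKey(size: .bits256)
let voucher = try! makeVoucher(matched: false, imageID: UUID(), key: key)
print("voucher size: \(voucher.ciphertext.count) bytes")
```

In this toy version the device holds the only key; the point of Apple's scheme is that the server can only learn the match results once enough matching vouchers accumulate.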

Apple has already started testing a system that uses sophisticated cryptography to identify when users upload known child sexual abuse images to its iCloud storage, and it can do so without learning the contents of the photos users have stored.

Apple has argued that its system is more private than Google's and Microsoft's because it uses both its servers and software installed on the device through iOS updates. But many are concerned that the same mechanism could lead to law enforcement checks for images with political content, particularly in countries outside the U.S.

The Electronic Frontier Foundation argued the following:

“If you’ve spent any time following the Crypto Wars, you know what this means: Apple is planning to build a backdoor into its data storage system and its messaging system.”

“Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.”

“That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change.”

However, Apple has countered the criticism by insisting that it will refuse any government demand to add non-CSAM images to the hash list.

“Apple will refuse any such demands. Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups.”

“We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future.”

“Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it.”

Apple is committed to protecting children and reducing the spread of CSAM. Once the update is introduced, users will be alerted that Apple has begun checking photos stored in their iCloud accounts.

Until Next Time…

(Sources)

Photo Credit: Wired

Duckett, C. (2021, August 9). Apple child abuse material scanning in iOS 15 draws fire. ZDNet. https://www.zdnet.com/article/apple-child-abuse-material-scanning-in-ios-15-draws-fire/. 

Leswing, K. (2021, August 9). Apple says it will reject any government demands to use new child sexual abuse image detection system for surveillance. CNBC. https://www.cnbc.com/2021/08/09/apple-will-reject-demands-to-use-csam-system-for-surveillance-.html. 

Villareal, M. (2021, August 31). Apple’s new surveillance system met with fire from critics, tech giants. Natural News. https://www.naturalnews.com/2021-08-31-apple-surveillance-system-met-with-fire.html. 
