Personal security has rapidly become a more important concern in tech over the past few years. Ring doorbells keep watch over houses, every conversation with a police officer is recorded by an onlooker or a bodycam, and Apple has long been at the forefront of this movement, with its stated policy of always putting privacy first.
Apple iOS 15. Source: Apple
So, what’s changed?
Apple recently announced its upgraded software for the iPhone: iOS 15. Amongst the features were background sounds played through your earphones, a better FaceTime experience and the ability to scan text into your phone.
But one add-on has got a lot of people talking.
In a memo posted on the Apple.com website, the company announced that all its products running the iOS 15 upgrade will scan photos in iCloud and iMessages for Child Sexual Abuse Material (CSAM). Detected matches will be reported to the National Center for Missing and Exploited Children, which works with law enforcement as a clearinghouse for CSAM reports.
The memo insists privacy is at the forefront of the company's mind and that the method was designed accordingly. Rather than scanning images in the cloud, the system performs the matching on-device, before an image reaches iCloud, comparing photos against a database of known CSAM hashes provided by child safety organisations.
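Apple's accompanying technical summary describes the matching step in more detail: photos are fingerprinted with a perceptual "NeuralHash" and compared against a blinded database using private set intersection, so matches stay hidden until a threshold is crossed. As a rough sketch only, the Swift snippet below illustrates the basic shape of on-device matching against a known-hash list; the SHA-256 digest, the knownHashes set and the file path are illustrative stand-ins, not Apple's actual implementation.

import Foundation
import CryptoKit

// Simplified stand-in for the on-device check. The real system uses a
// perceptual NeuralHash and a blinded database, so neither the device
// nor Apple sees raw matches below a threshold; SHA-256 and a plain Set
// are used here purely to show the shape of the idea.

// Hypothetical known-hash database (in practice, shipped inside the OS).
let knownHashes: Set<String> = []

// Hash an image's bytes on-device and test it against the database
// before the photo is uploaded to iCloud.
func matchesKnownHashes(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}

// Example: run the check as part of the upload path.
if let photo = FileManager.default.contents(atPath: "photo.jpg"),
   matchesKnownHashes(photo) {
    print("Match: flag this upload for review.")
}

The point of doing the comparison on-device is that the photo itself never has to be inspected server-side; only the outcome of the check ever leaves the phone.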
However, a vocal corner of the internet is not convinced.
Whistle-blower and privacy advocate Edward Snowden said: “No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this.”
He added that if Apple and other tech giants can scan for CSAM today, they may scan for something else tomorrow, calling the phones “iNarcs”.
Head of WhatsApp Will Cathcart also chimed in, pointing out that Apple could have tackled the problem from the other side by making it easy for people sent CSAM to report it, and saying of the scanning system: “That’s not privacy.”
On Twitter, the response wasn’t much better. Most mentions of the photo-scanning feature were negative, while other users focused on the update’s many other features.
@sikcafe said: “That iOS 15 text scan feature is SICK.”
@rachadelasri said: “Apple has been a champion of privacy for so long, end to end encryption was their pride, but no more.”
When the backlash started coming in, Apple clarified a couple of details from the memo: the hash database will be supplied by two separate child safety organisations, so that no government can insert a photo of something it is looking for into citizens’ phones. The company also asserts that the system will not be used by anyone for any purpose other than searching for CSAM, and that an Apple employee will review a photo before it is reported.
However, what seems to bother the public, and what Apple hasn’t addressed, is the idea that a company can decide one day to deploy technology that scans data its users consider private.