Apple Criticized by Cybersecurity Experts for CSAM Technology
Apple has spent years making privacy a unique selling point, backed by privacy-first features and slick marketing campaigns. But that carefully curated, privacy-conscious image is at risk of unraveling following Apple’s controversial plans for client-side scanning (CSS) of customers’ iPhones and iPads for illegal Child Sexual Abuse Material (CSAM).
Several cybersecurity experts have heavily criticized Apple’s plan to implement the new phone-scanning features, which include on-device scanning of users’ iCloud Photos to detect potential CSAM. Before a photo is uploaded to iCloud, it is hashed on the device with a technology called NeuralHash and compared against a database of known child sexual abuse imagery provided by the National Center for Missing and Exploited Children. If the number of matches crosses the scanning tool’s threshold, Apple will conduct a human review and then block the cloud upload, shut down the account, and alert law enforcement.
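Apple has not published NeuralHash’s internals, and its real protocol layers cryptographic protections (private set intersection and threshold secret sharing) on top of the hash comparison. Purely to illustrate the matching pattern described above, here is a minimal Python sketch that substitutes a simple average hash for NeuralHash; every function name and threshold in it is a hypothetical stand-in, not Apple’s actual design.

```python
from PIL import Image  # assumption: Pillow is installed; a real system would use NeuralHash

def average_hash(path: str, size: int = 8) -> int:
    """Toy perceptual hash: one bit per pixel, set when the pixel is brighter than the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Hypothetical parameters -- Apple has not published its thresholds.
HASH_DISTANCE_THRESHOLD = 5    # how close two hashes must be to count as a match
ACCOUNT_MATCH_THRESHOLD = 30   # matches required before an account is escalated

def scan_before_upload(photo_paths, known_bad_hashes) -> bool:
    """Count photos whose hash is near a database entry; True means escalate to human review."""
    matches = 0
    for path in photo_paths:
        h = average_hash(path)
        if any(hamming(h, bad) <= HASH_DISTANCE_THRESHOLD for bad in known_bad_hashes):
            matches += 1
    return matches >= ACCOUNT_MATCH_THRESHOLD
```

The account-level threshold mirrors the design Apple described publicly: no single match triggers a review, only an accumulation of matches does.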
Concerns Over Misuse
While many in the tech community applauded Apple’s effort to protect children, many also voiced concerns about how the technology behind the CSAM features could be repurposed for broader surveillance. Privacy advocates and cybersecurity experts fear the technology could erode digital privacy and eventually be used by authoritarian governments to censor protected speech, threaten people’s privacy and security, and track down political dissidents and other perceived enemies.
According to a report by an influential group of 14 internationally respected security and cryptography researchers, Apple’s new monitoring plans are invasive, ineffective, and reliant upon “dangerous technology.” They warn that the client-side scanning system, if used, “would be much more privacy-invasive than previous proposals to weaken encryption. Rather than reading the content of encrypted communications, CSS gives law enforcement the ability to remotely search not just communications, but the information stored on user devices.”
The researchers also found the system ineffective at identifying images of children being sexually abused: editing an image just slightly was enough to avoid detection.
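That brittleness is easy to demonstrate with the toy hash from the sketch above. The snippet below is an assumption-laden harness, not a reproduction of the researchers’ published attacks on NeuralHash itself: it applies a barely perceptible one-degree rotation to a photo (“photo.jpg” is a placeholder filename) and reports how many of the 64 toy-hash bits flip as a result.

```python
from PIL import Image  # assumption: Pillow is installed

def average_hash(img: Image.Image, size: int = 8) -> int:
    """Same toy perceptual hash as in the earlier sketch, taking an open image."""
    g = img.convert("L").resize((size, size))
    pixels = list(g.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

# "photo.jpg" is a placeholder; any image file will do.
original = Image.open("photo.jpg")
edited = original.rotate(1)  # a barely perceptible one-degree rotation

d = hamming(average_hash(original), average_hash(edited))
print(f"Bits flipped by a one-degree rotation: {d} of 64")
```

If the flipped-bit count exceeds the matching threshold in the earlier sketch, the edited image evades detection while looking identical to a human; researchers reported comparable evasions, and even deliberate collisions, against NeuralHash itself.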
Apple has said it would reject any such requests from foreign governments, but the tech giant doesn’t have a strong track record of upholding such commitments: Apple recently bowed to Russian government pressure and removed opposition leader Alexei Navalny’s tactical voting app. Because Apple complies with local laws in every country where it operates, any government could pass a law requiring tech companies to use capabilities like the CSAM scanning system to look for images it claims are associated with terrorism or political opposition.
The EU is reportedly considering device scanning as part of a new child protection law, and potentially for detecting signs of organized crime and terrorist ties. It is not a giant leap to imagine that, now that the EU knows Apple possesses this capability, it might simply pass a law requiring the company to expand the scope of its scanning. Why reinvent the wheel when a few strokes of a pen can get the job done?
Rolling Out Apple’s CSAM Solution
Apple initially tried to dispel misunderstandings and reassure users, but ultimately gave in to the criticism and announced that it would delay the rollout to make “improvements” before releasing these critically important child safety features.
Other child protections set to be released include expanded guidance in Siri and Search, with additional resources to help children and parents stay safe online and get help in unsafe situations. Siri and Search will also intervene when users search for queries related to CSAM; for example, users who ask Siri how to report CSAM or child exploitation will be pointed to resources on where and how to file a report.
Apple is also introducing new communication safety tools in the Messages app so that parents can play a more informed role in helping their children navigate communication online. The Messages app will warn children and their parents when sexually explicit photos are received or sent; a minimal sketch of that decision flow follows.
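Apple has not published how the on-device Messages classifier works, so the sketch below only captures the decision flow as described: an on-device model scores an incoming photo, and flagged photos are blurred with a warning, optionally notifying parents. The function names, threshold, and stub classifier are all assumptions for illustration.

```python
# Hypothetical sketch of the Messages intervention flow; Apple has not
# published its classifier or thresholds.

EXPLICIT_THRESHOLD = 0.9  # assumed confidence cutoff

def classify_explicit(image_bytes: bytes) -> float:
    """Stand-in for Apple's on-device ML model; always returns 'not explicit' here."""
    return 0.0

def handle_incoming_photo(image_bytes: bytes, is_child_account: bool,
                          parental_alerts_enabled: bool) -> dict:
    """Decide how Messages should present a received photo."""
    if not is_child_account:
        return {"action": "show"}
    if classify_explicit(image_bytes) < EXPLICIT_THRESHOLD:
        return {"action": "show"}
    # Flagged: blur the photo, warn the child, and optionally notify parents.
    return {"action": "blur_and_warn", "notify_parents": parental_alerts_enabled}

print(handle_incoming_photo(b"", is_child_account=True, parental_alerts_enabled=True))
```

Notably, this check runs entirely on the device, which is why Apple presents it as distinct from the iCloud CSAM matching described earlier.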
For more information about Alvarez Technology Group and our IT services, contact us today.