iOS 15: What You Need to Know
The Reality of Apple’s New “Child Safety” Features
A Quick Breakdown
On August 5th, Apple announced three new measures against the dissemination of Child Sexual Abuse Material (CSAM):
Communication safety in Messages: The Messages app will warn children and their parents when receiving or sending sexually explicit photos. This applies to both iMessage and SMS messages.
CSAM detection: Apple will detect known CSAM images on user devices and in iCloud accounts, then report such instances to the National Center for Missing and Exploited Children (NCMEC).
Expanding guidance in Siri and Search: Siri and Search will intervene when users perform searches for queries related to CSAM. These interventions will explain the problematic nature of the content and provide resources to get help.
The Risks of Client-Side Scanning
Client-side scanning effectively breaks end-to-end encryption (E2EE). The purpose of E2EE is to render a message unreadable to any party other than the sender and recipient, but client-side scanning allows a third party to access content in the event of a positive match.
While CSAM is a uniquely sensitive application, there is no way to ensure that such technology will be exclusive to CSAM or that it will not produce false positives.
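To make the objection concrete, here is a deliberately simplified sketch of how a client-side scan can run before encryption ever happens. This is a hypothetical illustration, not Apple's implementation: Apple's actual system uses a perceptual hash (NeuralHash) with threshold secret sharing, whereas this sketch uses an exact cryptographic hash, and all names here are invented for illustration.

```python
import hashlib

# Hypothetical stand-in for the opaque database of known-content hashes
# that would be shipped to the device. In reality this list is supplied
# by NCMEC and is not inspectable by the user.
KNOWN_FLAGGED_HASHES = {
    hashlib.sha256(b"known-flagged-image-bytes").hexdigest(),
}

def scan_before_encrypt(image_bytes: bytes) -> bool:
    """Return True if the image matches the known-hash list.

    Because this check runs on the client *before* the content is
    encrypted, a match can be reported to a third party even though the
    transport itself is end-to-end encrypted. That is the core of the
    objection to client-side scanning.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_FLAGGED_HASHES
```

Note that a real perceptual hash is designed to tolerate crops, recompression, and other small modifications, which is precisely what opens the door to false positives; and nothing in the mechanism itself constrains the hash list to CSAM rather than any other content a government might demand be flagged.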
Government Backdoors
Service providers are incessantly pressured by external bodies to open the doors to our data. When we allow E2EE to break for certain cases, we introduce a problematic precedent: encryption is guaranteed until it isn't. This allows government onlookers to monitor and interfere in private communications.
The Risk for Vulnerable Populations
Parents of children under the age of 13 will be notified when their child has either sent or received sexually explicit imagery. Though this may sound reasonable in theory, there is no way to ensure that this tool will not be applied in a way that causes harm, or that it will only be applied to users under the age of 13. Such an initiative poses a risk to LGBTQ+ youth and individuals in abusive relationships, as it may function as a form of stalkerware.
The Verdict
Apple has taken its first step down a very slippery slope, building a dangerous tool that invites government backdoors as well as misuse by bad actors.