ECPAT: A technical review of child safety technologies
“AWO were an excellent partner from concept through to delivery - thinking along with us, asking the difficult questions, and putting together a great team to help us meet our goal of producing technical insight that would be both accessible and adaptable for our target audiences.”
ECPAT are a global coalition of child safety organisations. They conduct research and advocate for changes to the systemic and social structures that sustain child sexual exploitation.
Detecting child sexual abuse material (CSAM) shared on private channels such as iMessage or WhatsApp presents unique privacy challenges, especially as conventional content scanning cannot straightforwardly be applied to end-to-end encrypted communications. As such, the design of child safety technologies and how they should be used are subject to highly polarised debate.
ECPAT asked AWO to produce a detailed technical study on the current state of child safety technologies.
Researchers at AWO looked into which child safety technologies are currently most prominently used or proposed, and how they work. The final report compared key features of these technologies and set out implications to consider in policy discussions. Broadly, the report covered:
- Case studies of Microsoft’s PhotoDNA, Facebook’s content moderation system, and Apple’s proposed CSAM detection system for iCloud Photos.
  - Apple’s proposed CSAM detection for iCloud Photos would use both on-device and server-side processing: before a photo is uploaded to iCloud, it is hashed on the user’s device and the hash is compared against a database of hash values of known CSAM; if there is a match, Apple is notified (a simplified sketch of this matching step follows the list).
- A high-level assessment of these tools’ compliance with relevant EU data protection legislation.
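To make the hash-matching step referred to above more concrete, the sketch below shows the general shape of perceptual-hash comparison in Python. It is a minimal illustration only: the toy "average hash", the function names, the example pixel grid, and the distance threshold are all assumptions for the sake of the example, standing in for proprietary perceptual hashes such as PhotoDNA or Apple's NeuralHash, and it omits the server-side cryptography in Apple's actual proposal.

```python
# Illustrative sketch of perceptual-hash matching against a list of known hashes.
# The toy "average hash" below stands in for proprietary perceptual hashes
# (PhotoDNA, Apple's NeuralHash); names, threshold, and inputs are hypothetical.

from typing import List, Set


def average_hash(pixels: List[List[int]]) -> int:
    """Toy perceptual hash: one bit per pixel, set if the pixel exceeds the mean.

    `pixels` is an 8x8 grid of grayscale values (0-255); returns a 64-bit integer.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for value in flat:
        bits = (bits << 1) | (1 if value > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


def matches_known_hash(photo_hash: int, known_hashes: Set[int], threshold: int = 5) -> bool:
    """True if the photo's hash is within `threshold` bits of any known hash.

    Allowing a small distance is what lets perceptual hashes tolerate resizing
    or re-compression, unlike exact cryptographic hashes.
    """
    return any(hamming_distance(photo_hash, known) <= threshold for known in known_hashes)


if __name__ == "__main__":
    # Hypothetical usage: in an on-device design, hashing happens before upload
    # and only the match outcome (not the photo itself) matters to the server.
    photo = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]  # fake pixel grid
    known = {average_hash(photo)}  # pretend this hash appears on the known list
    print(matches_known_hash(average_hash(photo), known))  # True
```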