ECPAT is a global coalition of child safety organisations. It conducts research and works to change the systemic and social structures that uphold child sexual exploitation.

Detecting child sexual abuse material (CSAM) shared on private channels such as iMessage or WhatsApp presents unique privacy challenges, especially because current content scanning methods cannot coexist with end-to-end encryption. As a result, the design of child safety technologies, and how they should be used, is the subject of highly polarised debate.

ECPAT asked AWO to produce a detailed technical study on the current state of child safety technologies.

  • Technical analysis

    With in-house and external technical experts, we developed case studies of Microsoft’s PhotoDNA, Facebook’s content moderation system, and Apple’s proposed CSAM detection system for iCloud Photos (see the illustrative sketch after this list).

  • Stakeholder interviews

    We brought our extensive networks inside industry, government and law enforcement to bear, helping to surface and confirm information that is not otherwise publicly available.

  • Data Protection

    We assessed how these tools comply with relevant EU data protection legislation.
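
Tools such as PhotoDNA work by computing a perceptual hash of an image and comparing it against a list of hashes of known CSAM, so that resized or re-encoded copies still match. The sketch below illustrates that general approach only: PhotoDNA itself is proprietary, so a generic perceptual hash (the open-source imagehash library) and a hypothetical, placeholder hash list stand in for the real system.

```python
# Illustrative sketch of hash-list matching, the general technique behind
# PhotoDNA-style detection. Not Microsoft's algorithm: the hash function,
# threshold, and hash list below are stand-ins for demonstration only.
from PIL import Image
import imagehash

# Hypothetical database of perceptual hashes of known illegal images
# (placeholder value, not a real hash of any real content).
KNOWN_HASHES = {imagehash.hex_to_hash("d1d1d1d1d1d1d1d1")}

# Tolerance (in differing bits) for small edits such as resizing or re-encoding.
MAX_DISTANCE = 8

def matches_known_content(path: str) -> bool:
    """Return True if the image's perceptual hash is close to a known hash."""
    candidate = imagehash.phash(Image.open(path))
    # ImageHash subtraction gives the Hamming distance between two hashes.
    return any(candidate - known <= MAX_DISTANCE for known in KNOWN_HASHES)
```

The key design point this captures is that matching is against a curated list of previously verified material, and that it tolerates minor alterations; it cannot identify previously unseen content, which is one of the trade-offs the report examines.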

Researchers at AWO examined which child safety technologies are currently most prominently used or proposed, and how they work. The final report compared the key features of these technologies and the implications to consider in policy discussions.

This report informed ECPAT’s own understanding, and fed into the development of their policy positions and campaigns.

References

  • “AWO were an excellent partner from concept through to delivery - thinking along with us, asking the difficult questions, and putting together a great team to help us meet our goal of producing technical insight that would be both accessible and adaptable for our target audiences.”
