Safety · Google AI Blog
Reaffirming our commitment to child safety in the face of European Union inaction
Today, because of the expiry of the ePrivacy derogation enabling the use of technology to detect child sexual abuse material (CSAM), Europe risks leaving children across the globe less protected from the most abhorrent harm.
Key facts
- While EU institutions rightly expect technology companies to take action on child safety, the April 3 expiry of the derogation clouds the legal certainty that has helped responsible platforms protect their communities and safeguard child victims
- To learn more about how hash-matching and CSAM detection tools work, join the upcoming webinar at 3 PM CET on Friday, April 10
- As EU institutions continue to negotiate an immediate, interim solution and durable framework, signatory companies (Google, Meta, Microsoft, and Snap) reaffirm their continued commitment
- The companies call on EU institutions to conclude negotiations on a regulatory framework as a matter of urgency
Summary
For years, several technology companies have taken voluntary action to detect, remove and report CSAM including, where appropriate, through hash-matching technology — a widely utilized tool to prevent and disrupt real, ongoing harm to victims and survivors. While EU institutions rightly expect technology companies to take action on child safety, the April 3 expiry of the derogation clouds the legal certainty that has helped responsible platforms try to protect their communities, safeguard child victims, and preserve the integrity of their services. As EU institutions continue to negotiate an immediate, interim solution and durable framework, signatory companies (Google, Meta, Microsoft, and Snap) reaffirm their continued commitment to protecting children and preserving privacy, and will continue to take voluntary action on their relevant Interpersonal Communication Services.
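The hash-matching approach mentioned above can be illustrated with a toy sketch. Note this is not any company's actual system: production tools such as Microsoft's PhotoDNA use robust perceptual hashes that tolerate resizing and re-encoding, whereas this example uses a plain cryptographic digest purely to show the match-against-known-hashes pattern; the hash set here is hypothetical.

```python
import hashlib

# Hypothetical set of SHA-256 digests of known files. Real detection
# systems match against curated databases maintained by child-safety
# organizations, using perceptual rather than cryptographic hashes.
KNOWN_HASHES = {
    # This entry is the SHA-256 digest of the bytes b"foo".
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def sha256_digest(data: bytes) -> str:
    """Compute the hex-encoded SHA-256 digest of the given bytes."""
    return hashlib.sha256(data).hexdigest()

def matches_known_hash(data: bytes) -> bool:
    """Return True if the content's digest appears in the known-hash set."""
    return sha256_digest(data) in KNOWN_HASHES

print(matches_known_hash(b"foo"))  # True: digest is in the set
print(matches_known_hash(b"bar"))  # False: digest is not in the set
```

An exact-digest scheme like this only flags byte-identical copies, which is why real systems rely on perceptual hashing to catch lightly modified variants of the same image.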