More than 70 million warning messages have been sent to people attempting to access child sexual abuse material (CSAM) over the past two years, according to the Lucy Faithfull Foundation. These warnings are part of Project Intercept, a collaboration between the charity and major tech firms including Google, TikTok, and Meta.
The approach goes beyond simply blocking content: the warnings highlight that the material is illegal and direct users towards support services aimed at changing their behaviour. Yet despite the scale of the campaign, only about 700,000 people have gone on to access Stop It Now resources, a conversion rate some experts find concerning.
Deborah Denis, the chief executive of the Lucy Faithfull Foundation, said: 'By placing warnings at the moment harmful behaviour is happening, we can disrupt it and divert people towards help to change.' However, experts suggest that while interventions like these are important, they need to be part of a broader strategy to stop illegal content from being created and shared in the first place.
Emma Hardy, Communications Director at the Internet Watch Foundation, said that such innovations must also address end-to-end encrypted services, where it is currently too easy for harmful imagery to spread, and called on tech companies to build new products with safety-by-design principles.