- By The Guardian
- 21 Sep 2023
According to a report by the Internet Watch Foundation (IWF), the most extreme form of child sexual abuse material (CSAM), Category A abuse, constituted 20% of illegal content found online last year. The IWF, a UK-based organization that monitors the distribution of CSAM, identified over 51,000 instances of such content, which includes severe imagery of rape, sadism, and bestiality.
Disturbingly, the 2022 total for Category A imagery was double that of 2020, with criminal sites contributing to the increase by selling videos and images of child abuse. The IWF CEO, Susie Hargreaves, emphasized the immense harm inflicted on real children, who are subjected to sexual torture and exploitation for criminals' profit. She described the situation as truly appalling.
The IWF reported a significant rise in web pages dedicated to profiting from CSAM, with nearly 29,000 such pages identified in 2022. Tragically, some sites treat child sexual abuse content as a "commodity."
Last year, the IWF took action against over 250,000 web pages, a 1% increase on the previous year. A concerning trend is the prevalence of "self-generated" imagery, in which victims are coerced or manipulated into recording their own abuse before it is shared online.
The NSPCC, a child protection charity, expressed deep concern over the figures and urged the government to strengthen the online safety bill by holding senior managers personally accountable for the presence of CSAM on their platforms. The bill's existing senior-manager liability provisions focus on content promoting self-harm and eating disorders, leaving CSAM outside their scope.
The security minister, Tom Tugendhat, called on companies offering heavily encrypted services, such as WhatsApp, to build in safety features capable of detecting abuse. WhatsApp and other encrypted messaging providers oppose provisions in the bill that could compel them to apply content moderation to private messages, which they argue would undermine end-to-end encryption. WhatsApp has said it would leave the UK rather than weaken its encryption.
These alarming statistics highlight the urgent need for stronger measures to combat CSAM distribution and protect children from exploitation and harm on digital platforms.