ISTANBUL
Web servers in the Netherlands host the largest share, about one-third, of all child sexual abuse material online, according to a recent report by a watchdog that tracks and removes such content.
Last year, the UK-based Internet Watch Foundation (IWF) took action to remove more than 255,000 confirmed reports of child sexual abuse material, or CSAM, which refers to imagery depicting the exploitation and sexual abuse of children.
Organizations such as the IWF prefer the term CSAM over "child pornography," as the latter could imply some degree of consent on the children's part.
IWF data shows the EU remains the "global hub" for CSAM, with a staggering 150,419 reports, or 59% of the total, concerning material hosted on computer servers in EU member states, the highest share worldwide.
Among them, the Netherlands topped the list, hosting 82,605 URLs, or 32% of the global total.
But why is the EU, and the Netherlands in particular, hosting the highest amount of such material in the world?
Experts say this is partly because the small European country has web hosting companies and server providers offering services at low cost.
High-speed internet, developed digital infrastructure, and favorable regulations are additional factors.
“The Netherlands has plenty of fast, cheap hosting infrastructure … so naturally you are going to find content wherever you have good technology and infrastructure for hosting,” Michael Tunks, head of policy and public affairs at IWF, told Anadolu.
Dutch laws, which give a lot of weight to freedom of expression, are not stringent enough to allow authorities to crack down on servers, he said.
“The way that constitutionally and legally the Netherlands is set up has meant that it has become a hotbed for the hosting of child sexual abuse material as well,” said Tunks.
Having the servers in the Netherlands does not mean that the content was generated there, and tracking down the source of the material remains a challenging task, he added.
The Dutch government has acknowledged the issue in the past and taken measures to combat online child sexual abuse, including a strategy focused on removing such content, but the numbers remain damning.
Images of extreme child abuse double
The IWF has recorded a worrying spike in images and videos of the most severe kinds of child sexual abuse.
The content, labeled Category A, can include the rape of children, including babies, as well as acts of bestiality or sadism, the report said.
“That’s the most severe forms of content … (and) we’ve seen that double. We’ve taken about 50,000 reports down of that in the last year,” said Tunks.
The IWF warns that as children become more active online, they are increasingly vulnerable to grooming and abuse by strangers “even in their own bedrooms.”
Rise of self-generated content
Another alarming trend is the rise of self-generated content, defined as sexual abuse images and videos created using mobile phones or webcams and then shared online.
Children can be groomed, deceived, or coerced into producing and sharing a sexual image or video of themselves by someone who is not physically present with them, said the IWF report.
Around 78% of the total URLs the IWF identified in 2022 contained self-generated images.
“Generally, that’s hitting the 11 to 13 age range and we’re seeing that get younger. We’re seeing 7- to 10-year-olds and particularly young girls as well,” said Tunks.
“If you have a child, and they have a smartphone or a device that has a camera in it and is connected to the internet, then any child could be at risk of potentially generating those images.”
Lobbying for laws
Last May, the European Commission proposed a new regulation to tackle online child sexual abuse.
It calls on social media platforms, service providers and tech companies to scan for, remove and block abusive material in all content, including personal messages and encrypted data.
“I think it’s a much-needed piece of legislation and it’s really, really important because companies need to continue scanning, detecting and putting in place measures that better protect children online,” said Tunks.
Activists in the Netherlands are pushing the government to support the proposal, but, as in other countries, there is pushback from data watchdogs, tech companies and politicians.
“There’s a lot of resistance within the Netherlands due to privacy issues,” explained Celine Verheijen, project coordinator at Defense for Children-EPACT Netherlands, an NGO focused on children’s rights.
The concerns also cover the scanning of text, including personal messages, and the implications for end-to-end encryption.
Verheijen said they are also advocating for better legislation, for instance, to hold websites accountable for hosting such material and to take these companies to court.