A rising number of child sexual abuse images are being hosted in European Union member states, according to an annual report published by the Internet Watch Foundation (IWF).
The latest report showed that 62 per cent of all child sexual abuse material (CSAM) webpages found by IWF in 2024 were traced to an EU country; the Netherlands was the most used location in the world for hosting criminal content, with over 83,000 URLs linked to the country.
Bulgaria, Romania, Lithuania, and Poland were the next most-used European hosting locations, with all four seeing an increase in the amount of content they hosted.
Overall, more than 291,000 reports were confirmed to contain child sexual abuse imagery, a record level.
Each assessed report could contain anywhere from a single image to tens, hundreds, or even thousands of individual child sexual abuse images or videos, the IWF noted.
“The time for delay is over. Children around the world need our help more than ever to protect them online,” Derek Ray-Hill, Interim CEO of the IWF, said in a statement.
“These figures indicate the desperate need for the EU’s pivotal legislation to tackle the pernicious spread of child sexual abuse online and should act as a clarion call for EU Member States to come together and act on this issue,” he added.
Europe is a driver of demand
Most of the children displayed were aged between 7 and 10, followed by those between 11 and 13.
Young girls were also depicted nearly four times as often as boys.
The IWF also noted an increase in content generated using artificial intelligence (AI), with nearly 39 per cent of it depicting the most extreme examples of abuse.
A 2023 report from International Justice Mission and the University of Nottingham Rights Lab found that nearly half a million – nearly 1 in 100 – Filipino children had been sexually abused and exploited to make content.
“[This] abuse is largely driven by demand from men in the United States, United Kingdom, Australia, Canada, and Europe, who pay as little as £15 [€17.56] to participate in online sexual abuse of children,” Lori Cohen, CEO at Protect All Children from Trafficking (PACT) who was not involved in the IWF report, told Euronews Next in reaction to its findings.
The Tech Coalition, which brings together more than 40 companies, identified over 7,000 pieces of CSAM in 2024, according to the annual report of Lantern, its programme to fight child sexual exploitation.
The coalition counts platforms such as Discord, Meta, and Nintendo among its members.
“In 2024 alone, we saw broader adoption of tools like image and video hashing technologies to identify and remove CSAM more effectively,” Sean Litton, president and CEO of the coalition, said.
“Major tech companies must be held accountable for monitoring and removing child sexual abuse material from their platforms swiftly and effectively,” Cohen said regarding the role of tech platforms.
“They must now make a concerted effort to put proactive measures into practice,” she added.
Awaited legislation in the EU
The EU is in the midst of updating its legislation to prevent and combat child sexual abuse online, as the current legal framework dates back to 2011.
The Commission introduced a proposal in February 2024 to expand the scope of offences to include, for example, content generated by AI, while imposing stricter penalties and strengthening prosecution.
However, debates arose regarding the balance between child protection and digital rights like privacy.
This proposal is currently being discussed in the European Parliament and the Council.
“We need the Child Sexual Abuse Regulation to be adopted without delay to ensure companies undertake proactive detection. We also call for the passage of the Child Sexual Abuse Directive, which would criminalise the production and dissemination of AI-generated child sexual abuse material,” Ray-Hill said.
“The EU cannot become a safe haven for perpetrators to freely abuse children and exchange criminal content,” he added.