By Sophie Hey, Assistant Client Manager, Valideus
On 24 April the Internet Watch Foundation (IWF) released its 2018 Annual Report, Once Upon a Year. The IWF is an independent, self-regulated organisation with a vision to eliminate child sexual abuse online. Its remit is to remove child sexual abuse content hosted anywhere in the world, and non-photographic child sexual abuse images hosted in the United Kingdom.
The IWF was initially set up in 1996 following discussions between the former Department of Trade and Industry, the Home Office, the Metropolitan Police, some internet service providers (ISPs) and the Safety Net Foundation. These discussions were triggered by the Metropolitan Police notifying UK ISPs that newsgroup content they hosted contained indecent images of children; the Metropolitan Police believed this might have constituted a publication offence. The IWF formed part of an effort to combat the hosting of such content in the United Kingdom while also protecting the internet industry from being held criminally liable for providing access to it. In 1996, the United Kingdom hosted 18% of the world’s known child sexual abuse material. Once Upon a Year reports that in 2018, the United Kingdom hosted 0.04% of the known global total. Interestingly, perhaps as a result of the vigilance of Nominet, no such sites were found under a .UK domain.
Once Upon a Year reports an increase in the number of domain names hosting child sexual abuse material: 3,899 domains supporting 105,047 URLs containing such material in 2018, up from 3,781 domains in 2017 and 1,694 in 2014. These 3,899 domains could be traced to 54 countries and were registered across 151 top-level domains, with .COM, .NET, .CO, .RU and .TO accounting for 80% of all webpages identified as containing child sexual abuse images and videos.
The IWF also provides a number of services to help its members make the internet safer for their customers. These services are a series of alert lists which notify members where child sexual abuse content is located, enabling quicker takedowns and more effective blocking. The lists cover a range of content types, recognising that different members play different roles in the online space. The services are as follows: Image Hash List, Takedown Notices, Simultaneous Alerts, URL List, Keywords List, Domain Alerts, Payment Brand Alerts, Virtual Currency Alerts, Newsgroup Alerts, and Non-Photographic Image (NPI) URL List.
One of the problems with finding child sexual abuse material to take down is that offenders often create their own language for finding and hiding child sexual abuse images. The Keywords List identifies words that are being used by people searching for child sexual abuse images online; in December 2018 the list held 453 words associated with child sexual abuse images and videos. The IWF notes that the list enables members to moderate discussion on gaming and social media platforms, filter results on search engines to protect users from accidentally finding criminal content, and check for files or domains that might contain criminal content and need further investigation.
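To make that mechanism concrete, the sketch below shows one way a member platform might apply such a keyword list to flag text for human review. It is a minimal illustration only: the IWF distributes the Keywords List to its members rather than publishing it, so the placeholder terms, the flag_for_review helper and the whole-word matching approach are assumptions made for the example, not anything specified by the IWF.

```python
import re

# Hypothetical placeholder terms standing in for entries on a keyword list;
# the real IWF Keywords List is available to members only.
KEYWORD_LIST = {"exampleterm1", "exampleterm2"}

def flag_for_review(text: str) -> bool:
    """Return True if the text contains any listed keyword
    (case-insensitive, whole-word match), signalling that the
    content should be queued for human moderation."""
    tokens = re.findall(r"[a-z0-9]+", text.lower())
    return any(token in KEYWORD_LIST for token in tokens)

if __name__ == "__main__":
    posts = ["an ordinary chat message", "a message containing exampleterm1"]
    for post in posts:
        print(post, "->", "queue for review" if flag_for_review(post) else "ok")
```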
Once Upon a Year announced that in 2019 Nominet would be working with IWF on a project which may assist in identifying additional terms used by criminals when they are looking for child sexual abuse imagery via search engines.
In addition to the project with Nominet, the IWF has used Microsoft’s PhotoDNA to hash videos as well as still images, so that both can be added to the Image Hash List and known material can be blocked before it resurfaces, preventing re-victimisation. Moving forward, the IWF hopes to use artificial intelligence to complement the work of the human experts who review material.
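For readers unfamiliar with hash lists, the sketch below illustrates the basic lookup: a file is reduced to a digest and checked against a set of digests of known material. It deliberately uses an ordinary SHA-256 hash, which only matches byte-identical copies; PhotoDNA itself computes robust perceptual hashes that survive resizing and re-encoding, and its API is not reproduced here. The KNOWN_HASHES set and helper names are assumptions made for the illustration.

```python
import hashlib
from pathlib import Path

# Hypothetical set of digests standing in for a hash list of known material.
KNOWN_HASHES: set = set()

def file_digest(path: Path) -> str:
    """Compute a SHA-256 digest of a file's bytes (an exact-match stand-in
    for a perceptual hash such as PhotoDNA's)."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def is_known(path: Path) -> bool:
    """Return True if the file's digest appears on the hash list."""
    return file_digest(path) in KNOWN_HASHES
```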