4.1.1. Technology and ISPs

Technology was consistently identified as a leading challenge for investigating and prosecuting CSAM. The rapidly changing nature of technology makes it particularly difficult to adequately prepare or train investigators, especially with respect to specific types of technology or platforms (Seigfried-Spellar 2018). Participants highlighted the difficulties of keeping abreast of new technologies. The plethora of online platforms available for perpetrators to access and traffic CSAM makes it challenging for investigators to keep up, and P2P networks are thought to be responsible for the large growth in the availability of CSAM on the internet (Bissias et al. 2016; Henzey 2011). P2P networks are free and relatively simple to use, so many perpetrators are thought to share CSAM on these platforms (Bissias et al. 2016). While law enforcement does monitor online platforms, the volume of CSAM and the ease with which perpetrators can traffic materials on the internet make it challenging for investigators to fully address the problem. Further, as one platform or technology is discovered, perpetrators move to others, such as social networks, cellular messaging, and the Dark Web (Bissias et al. 2016).

Technology companies and ISPs were also cited as a major challenge when investigating and prosecuting CSAM. In the United States, ISPs are legally required to report instances of child pornography on their platforms (McCabe 2008), and while ISPs are making these reports to law enforcement, challenges remain. Participants highlighted that ISPs often prioritize users' rights and are not always willing to provide timely information to law enforcement, even with warrants. Further, while ISPs are legally required to report CSAM if found, they are not required to look for it. Creating laws that require ISPs to implement server monitoring to combat CSAM would be one approach to addressing the ever-increasing challenges of investigating and prosecuting perpetrators of CSAM. While some companies, such as Google, Microsoft, Facebook, and Twitter, utilize technologies to search for and report CSAM, these companies, along with others (such as Amazon), have continued to be criticized for not doing enough to address the problem (Keller and Dance 2019). The International Centre for Missing and Exploited Children (ICMEC 2018) has recommended enacting legislative and policy language that clearly outlines ISPs' obligations to report CSAM. Further, they recommend legislative consideration of clear, sufficient, and substantial penalties to incentivize companies to be "proactive and responsible" in their reporting of CSAM (ICMEC 2018, p. 11).

Given that law enforcement agencies already feel overwhelmed and unable to process the volume of CSAM, ISPs may pose additional challenges to investigators when their priorities are not perceived to be the same. While perpetrators are becoming increasingly proficient with advances in technology, there have also been technological developments that can support law enforcement. These technologies can help detect and delete CSAM more efficiently, potentially reducing the number of times images or videos of children are shared online (Lee et al. 2020). In addition to speeding up detection and deletion, automated technologies can limit the amount of CSAM that investigators must view, and in turn the vicarious trauma experienced by those who manually search CSAM. Among the primary technological tools used to support investigators are digital fingerprints and image hash databases, which scan user-generated content on various platforms for known abuse images (Bursztein et al. 2019; Lee et al. 2020). Web crawlers, or search bots, are also important technologies being used to combat CSAM; they use pre-defined criteria to automatically browse websites and download data (Lee et al. 2020). Web crawlers have been shown to be successful in identifying CSAM: Project Arachnid, created by the Canadian Centre for Child Protection (CCCP), is one such example and is able to search the Dark Web as well as open web pages (Lee et al. 2020). When researchers have partnered with law enforcement to test CSAM-detection algorithms, these have shown greater accuracy and reliability in detecting such material (Lee et al. 2020). It is necessary for ISPs, technology companies, law enforcement, and other organizations to work in collaboration to ensure technologies are implemented in ways that optimize their capabilities to detect and delete CSAM.
This in turn will yield more thorough investigations and prosecutable cases, while supporting victims and families in comprehensive ways.
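The hash-database workflow described above can be sketched in simplified form. Deployed systems rely on robust perceptual hashes (which tolerate resizing and re-encoding) and on hash lists maintained by child-protection organizations; the plain cryptographic digest and the in-memory hash set below are illustrative assumptions only, meant to show the lookup pattern rather than any real system's method.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Compute a 'digital fingerprint' of a file's bytes.

    A plain SHA-256 digest is used here only to illustrate the
    workflow; production systems use perceptual hashes that survive
    common image alterations.
    """
    return hashlib.sha256(data).hexdigest()

def is_known_image(data: bytes, known_hashes: set) -> bool:
    """Check an uploaded file against a database of known-image hashes."""
    return fingerprint(data) in known_hashes

# Hypothetical hash database seeded from previously identified material.
known_hashes = {fingerprint(b"previously-identified-file-bytes")}

print(is_known_image(b"previously-identified-file-bytes", known_hashes))  # True
print(is_known_image(b"a-new-upload", known_hashes))                      # False
```

Because only hashes are stored and compared, a platform can flag re-uploads of known material without investigators having to view each file, which is the trauma-reduction benefit noted above.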
