IGF 2020 WS #91 Technology and innovation on behalf of abused children


Organizer 1: Intergovernmental Organization, Eastern European Group
Organizer 2: Civil Society, Western European and Others Group (WEOG)
Organizer 3: Technical Community, Eastern European Group

Speaker 1: Martyna Różycka, Intergovernmental Organization, Eastern European Group
Speaker 2: Denton Howard, Civil Society, Western European and Others Group (WEOG)
Speaker 3: Anderson de Rezende Rocha, Civil Society, Latin American and Caribbean Group (GRULAC)


Debate - Auditorium - 90 Min

Policy Question(s)

What are the responsibilities of the different stakeholders, in particular platforms and government agencies, around content governance? What are the benefits and limitations for different stakeholders of using technology to protect children online? What are the risks of using AI for the detection and categorisation of child sexual abuse material?

The workshop will address, inter alia, the following questions:
• Benefits of using AI and the other presented technologies in detecting CSAM or inappropriate behaviour by online predators
• Challenges for automated detection (for example, real-time communication, instant upload, victim identification)
• Limitations of using AI and the other presented technologies in detecting CSAM
• Privacy of victims
• Main beneficiaries of the presented projects
• Cooperation ideas, the role of INHOPE, expectations of/for industry
• Funding and state/political involvement


GOAL 9: Industry, Innovation and Infrastructure
GOAL 16: Peace, Justice and Strong Institutions


The scale of CSAM (Child Sexual Abuse Material) around the world remains tremendous despite the joint efforts of police forces, internet hotlines and industry. 155,240 reports related to Child Sexual Abuse Material were exchanged between INHOPE members in 2018, an increase of almost 80% over 2017. 89% of reports concerned children aged 3–13, and 2% of the victims were under 3 years old. The Internet unfortunately keeps generating new ways of sharing, accessing and producing child sexual abuse imagery, and it gives perpetrators new opportunities for abusing children: child grooming and self-generated content are trends that have grown in recent years. Behind every image there is a real child being abused, possibly at this very moment. Swift action is needed, from deleting the content to identifying victims and predators. This is a global problem requiring global and innovative solutions that take into account differing legal regulations and the crucial role of time in investigations concerning new material.

The issue is undoubtedly very important, but research opportunities in this field are very limited, mainly because of organisational problems with human participation: the content is harmful to the observer, and there is a risk of secondary victimisation of the abused minors. Eradicating CSAM from the Internet requires emerging technologies that facilitate gathering data from the internet and automated verification of content. Industry has already presented some useful solutions, but constant development is essential, as is working collaboration with other parties – internet hotlines, police and state representatives. Security solutions should be made available not only to large platforms but also to small companies and developing countries, because that is the only effective way to protect the whole network.
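As a rough illustration of the automated verification mentioned above, the simplest deployed building block is matching uploads against a hash list of known material supplied by hotlines or law enforcement. The sketch below uses cryptographic SHA-256 matching for simplicity; production systems instead rely on perceptual hashes (such as Microsoft's PhotoDNA), which survive re-encoding and resizing, and the hash list and function names here are hypothetical.

```python
# Minimal sketch of hash-list matching against known material.
# NOTE: real systems use perceptual hashing (e.g. PhotoDNA), not SHA-256;
# a cryptographic hash changes completely if the file is re-encoded.
import hashlib

# Hypothetical hash list, as would be supplied by a hotline or LEA partner.
# (This entry is the SHA-256 of the bytes b"foo", used purely for testing.)
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def sha256_hex(data: bytes) -> str:
    """Return the SHA-256 digest of raw file bytes as a hex string."""
    return hashlib.sha256(data).hexdigest()

def is_known_material(data: bytes, known: set = KNOWN_HASHES) -> bool:
    """True if the file's hash appears in the known-material list."""
    return sha256_hex(data) in known
```

In an upload pipeline, a check like `is_known_material()` would run before publication, with matches blocked and routed to a human reviewer and reported to the relevant hotline rather than silently dropped.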
The workshop aims to present an overview of the current status of selected research projects on the use of the newest technologies, such as artificial intelligence, in fighting CSAM on the Internet and on building cooperation among different stakeholders. It also provides an opportunity to discuss limitations, opportunities and necessary policy development with respect to the shared responsibility of academia, industry and NGOs for eradicating CSAM from the Internet.

Expected Outcomes

The major expected outcome of the workshop is to raise awareness of the use of technology in fighting CSAM on the Internet. The workshop will also serve as a platform for international, multi-stakeholder partnership building, possibly in the form of a working group to establish collaboration between stakeholders and to define the policy, organisational and privacy requirements for using AI and other emerging technologies for these purposes.

This session will involve a 10-minute introductory presentation outlining the main issues, followed by brief presentations of ongoing projects (50 min) and a discussion panel (30 min) including attendees’ participation. Questions from the audience will be fielded by the on-site moderator and rapporteur. An online participation app will also be in place to ensure the most popular questions are answered during the workshop. The workshop is easily adaptable to a fully online format.

Relevance to Internet Governance: Protecting children who are victims of real abuse, and of the secondary trauma connected with the online presentation of their exploitation, is a shared responsibility of governments, the private sector and civil society. It is crucial that governments establish a supportive policy environment, unlimited by borders, for fighting CSAM, and provide financial support for the development of innovative technological solutions. Industry should be aware of the problem and, at the very least, provide a mechanism for reporting illegal content on its servers, or build automated systems to cooperate effectively with LEAs and relevant NGOs such as internet hotlines. The proposed workshop aims to present a broad perspective on the issue, providing an opportunity to consider necessary policy adjustments, propose new best practices, or even inspire the creation of international research programmes.

Relevance to Theme: The issue of CSAM on the Internet is a very delicate matter, requiring that multiple aspects of privacy and the protection of victims and users be taken into consideration to ensure swift and effective action against perpetrators who groom or send inappropriate material to children, and against individuals who share the content on the Internet. Cooperation among the different types of stakeholders representing industry, LEAs, NGOs and academia should be built on trust and a common understanding of the goals, limitations and needs of every party involved. SDG target 9.c calls on states to significantly increase access to information and communication technology and strive to provide universal and affordable access to the Internet in least developed countries by 2020. Violence against children, including sexual violence, is a problem affecting every region and society. Noticing abuse and identifying the crime scene is crucial for bringing quick help to the victim, but given the huge scale of new images and videos, this is impossible for human moderators alone, due to their limited capacity.

Online Participation


Usage of IGF Official Tool.