IGF 2023 Open Forum #58 Child online safety: Industry engagement and regulation

Time
Tuesday, 10th October, 2023 (09:00 UTC) - Tuesday, 10th October, 2023 (10:00 UTC)
Room
WS 8 – Room C-1
Issue(s)

Child Online Safety

Panel - 60 Min

Description

This hybrid session, facilitated by UNICEF, will explore different models of industry engagement and regulation to address online child sexual abuse and exploitation. Tech companies are vital stakeholders in protecting children from online child sexual exploitation and abuse, even if they are not directly involved in perpetrating such acts. The United Nations Committee on the Rights of the Child calls on governments to require ‘all businesses that affect children’s rights in relation to the digital environment to implement regulatory frameworks, industry codes and terms of services that adhere to the highest standards of ethics, privacy and safety in relation to the design, engineering, development, operation, distribution and marketing of their products and services’.

The session will take the form of a moderated discussion to encourage participation from both the online and onsite audience. This interactive panel aims to foster collaboration and the exchange of ideas, experiences, and innovative strategies to combat online child sexual exploitation and abuse. The panel features expert speakers from different sectors and from around the globe, representing:

  • Australia's eSafety Commissioner
  • Children and Families Agency - Japan
  • Japan Internet Provider Association/ Internet Contents Safety Association
  • Ghana's Cyber Security Authority 
  • BSR (Business for Social Responsibility)

An online moderator will monitor chat and questions to ensure participation from online attendees. AV arrangements will include a screen behind panelists to ensure online speakers are visible to onsite participants. A live stream of the stage will also be available. 

Organizers

UNICEF
Afrooz Kaviani Johnson, UNICEF, HQ (New York)
Josianne Galea Baron, UNICEF, HQ (Geneva)

Speakers

Julie Inman Grant, eSafety Commissioner, Australia

Tatsuya Suzuki, Children and Families Agency of Japan (online)

Toshiaki Tateishi, Japan Internet Provider Association/ Internet Contents Safety Association

Albert Antwi-Boasiako, Director-General, Cyber Security Authority, Republic of Ghana

Dunstan Allison-Hope, Vice President, Human Rights, BSR (Business for Social Responsibility)

Onsite Moderator

Afrooz Kaviani Johnson, UNICEF

Online Moderator

Josianne Galea Baron, UNICEF

Rapporteur

Afrooz Kaviani Johnson, UNICEF

SDGs

5.2
16.2
16.7
17.16

Targets: Implementation of regulatory frameworks, industry codes and terms of service can contribute towards creating a safer online environment for children, particularly girls. This aligns with SDG 5.2, which aims to eliminate all forms of violence against women and girls, and SDG 16.2, which seeks to end all forms of child abuse, exploitation and trafficking. By collaborating with regulators, governments, and stakeholders, the tech industry can participate in shaping policies and regulations that prioritize child online safety and foster a collective approach to addressing this issue, in line with SDG 16.7, which focuses on ensuring responsive, inclusive, participatory, and representative decision-making at all levels. Additionally, industry engagement aligns with SDG 17.16, which emphasizes the importance of enhancing global partnerships for sustainable development.

Key Takeaways

Online child sexual exploitation is a grave violation of human and child rights. Threats are continuously escalating and changing.

Self-regulatory measures are broadly perceived as inadequate. Significant regulatory and cultural changes are on the horizon, demanding greater responsibility and action from businesses.

Call to Action

Governments and companies must remain vigilant and responsive to the ever-evolving threat landscape. Continued exchange of learning and experience in collaborative and co-regulatory models across different jurisdictions is necessary.

Companies should embed online child sexual abuse and exploitation into broader human rights due diligence, including impact assessments.

Session Report

IGF 2023 Open Forum #58: Child online safety – Industry engagement and regulation



Context

This hybrid session, facilitated in person by Ms Afrooz Kaviani Johnson and online by Ms Josianne Galea Baron, explored different models of industry engagement and regulation to address online child sexual exploitation and abuse (CSEA).

Panel discussion

Ms Julie Inman Grant, eSafety Commissioner, Australia, discussed the suite of regulatory tools her office uses to combat online CSEA. Key among Australia’s tools are its complaints schemes, which facilitate the removal of harmful content to prevent re-traumatization and allow trend analysis to inform systemic change. Additionally, the Basic Online Safety Expectations, which detail the steps that social media and other online service providers must take to keep Australians safe, enable the Commissioner to demand transparency, with penalties for non-compliance. Australia’s tools also include mandatory codes for various sections of the online industry in relation to illegal and restricted content, including child sexual abuse material (CSAM). The Commissioner emphasized that even the largest companies are not doing enough and stressed the need for global pressure on companies to enhance safety measures. ‘Safety by Design’ was highlighted as a fundamental systemic initiative to support industry in better protecting and safeguarding citizens online.

Mr Tatsuya Suzuki, Director, Child Safety Division of the Children and Families Agency, Japan, presented how the newly formed Children and Families Agency is working with the private sector to combat online CSEA. The national framework acknowledges the essential role of private sector voluntary actions to ensure children’s safety online. It respects the balance between eradicating harmful content and ensuring freedom of expression. The Agency’s strategies, detailed in the 2022 National Plan for the Prevention of Sex Crimes against Children, involve public-private collaborations. The Plan for Measures Concerning Child Sexual Exploitation 2022 outlines these government-led actions. In July 2023, a prevention package was presented to the Cabinet Office, emphasizing joint efforts with relevant ministries to address child exploitation. 

Mr Toshiaki Tateishi, Japan Internet Provider Association/ Internet Contents Safety Association, discussed Japan’s private sector initiatives against online CSEA. The Internet Content Safety Association (ICSA) compiles a list of websites known for child abuse material based on data from the National Police Agency and the Internet Hotline Centre. An independent committee reviews this data, and upon confirmation, the ICSA distributes a blocking list to ISPs and mobile network operators, preventing access to these sites. The Safer Internet Association (SIA) contributes by operating a hotline for reporting illegal content, conducting research, advising on policy, and leading educational initiatives. These associations coordinate with providers, both domestic and international, to reduce and remove illegal and harmful content.

Dr Albert Antwi-Boasiako, Director-General, Cyber Security Authority, Republic of Ghana, emphasized Ghana’s approach to championing industry responsibility and innovation. Recognizing that self-regulation is insufficient, Ghana advocates for ‘collaborative regulation’ rather than traditional top-down mandates. This strategy acknowledges that companies often overlook the risks children face online. Ghana’s Cybersecurity Act mandates industry action to protect children, encompassing content blocking, removal, and filtering. This law requires further specification through a legislative instrument, which is currently being crafted in consultation with the private sector and civil society. The Act includes administrative and criminal penalties, crucial for enforcement in developing nations, and allows for fines to fund the regulatory institutions. Dr Antwi-Boasiako noted that success hinges on widespread awareness and understanding of the issues at stake.

Mr Dunstan Allison-Hope, Vice President, Human Rights, BSR (Business for Social Responsibility), highlighted the critical role of human rights due diligence (HRDD), including impact assessments, in combating online CSEA. HRDD based on the UN Guiding Principles on Business and Human Rights (UNGPs) can form a key part of a company’s obligations to address online CSEA. The benefits of this approach include a comprehensive review of human rights impacts, special attention to vulnerable groups such as children, and a structured framework for action tailored to each company’s position in the technology stack. With regulations now echoing the UNGPs, voluntary measures are becoming mandatory. He urged companies to embed children’s rights into their broader HRDD processes. While this significant regulatory change is especially prominent in Europe, he encouraged companies to take a global approach to achieve the desired child rights outcomes.

Interactive discussion

The discussion opened with the question of how to balance children’s right to protection with their right to access information, especially age-appropriate and accurate sexual and reproductive health information. The conversation took cues from the UN Committee on the Rights of the Child, General comment No. 25 (2021). Although the internet was not built for children, they are significant users, leading to a call both to minimize harm and to amplify benefits. Australia’s consultations on approaches to age assurance spotlighted this need, pushing companies to look beyond age-gating. A human rights-based approach was emphasized as a way to navigate tensions between different rights. Strategies such as DNS blocking alone were deemed inadequate; holistic approaches, like Australia’s ‘3Ps’ model of Prevention, Protection, and Proactive, systemic change, were highlighted as crucial. One significant challenge lies in raising awareness and promoting help-seeking behaviours among children and young people.

Conclusion

Both regulators and companies, along with civil society, are currently navigating extremely challenging dilemmas. Whether through regulation, self-regulation, or ‘collaborative regulation’, there is a significant shift happening in the regulatory landscape. This shift presents an opportunity to firmly integrate the issue of online CSEA into these evolving processes.

Further resources

United Nations Children’s Fund (2022) ‘Legislating for the digital age: Global guide on improving legislative frameworks to protect children from online sexual exploitation and abuse’ UNICEF, New York.