IGF 2021 WS #183
Ensuring Child Safety on Online Platforms

Organizer 1: Vineet Kumar, CyberPeace Foundation

Speaker 1: Janice Verghese, Technical Community, Asia-Pacific Group
Speaker 2: Nitish Chandan, Civil Society, Asia-Pacific Group
Speaker 3: Vineet Kumar, Civil Society, Asia-Pacific Group

Moderator

Vineet Kumar, Civil Society, Asia-Pacific Group

Online Moderator

Vineet Kumar, Civil Society, Asia-Pacific Group

Rapporteur

Vineet Kumar, Civil Society, Asia-Pacific Group

Format

Round Table - Circle - 60 Min

Policy Question(s)

Ensuring a safe digital space: How should governments, Internet businesses and other stakeholders protect citizens, including vulnerable citizens, against online exploitation and abuse?

CPF would like to address the following issues as part of our roundtable discussion:
- Identify concerns related to usability and effectiveness of the interventions proposed by platforms to ensure child safety;
- Identify the liability and accountability of platforms in ensuring a safe space for children;
- Identify initiatives to promote digital literacy that can be adopted to ensure the safety of children on online platforms;
- Identify the role played by other stakeholders including the government, media, parents and teachers in ensuring the safety of children on digital platforms along with ideating interventions that can be implemented by each of the stakeholders.

SDGs

4. Quality Education
16. Peace, Justice and Strong Institutions


Targets: CyberPeace Foundation’s work on Internet governance and cybersecurity is aligned with five UN Sustainable Development Goals: achieve gender equality and empower all women and girls; build resilient infrastructure and promote industry and innovation; make cities inclusive, safe, resilient and sustainable; promote peace, justice and strong institutions; and revitalize the global partnership for sustainable development. This discussion assists us in ensuring safety and stability in cyberspace by focusing on digital literacy and on gaps in the existing regulatory mechanisms.

Description:

The pre-teen and teenage years are crucial periods for the development of children's foundational skills. While television continues to dominate the hours children spend in front of a screen, the use of digital media has increased tremendously in the last couple of years, especially during the pandemic. Digital media platforms have become increasingly popular among young children, who use them to access a wide variety of content, including educational videos, cartoons and music videos, across age groups. However, these platforms have drawn considerable criticism from governments, parents and civil society organisations over concerns about commercial advertising aimed at children, exposure to inappropriate advertisements and videos, flawed content filtering, incorrect content labelling, and a failure to create a safe and secure environment for children to grow and explore.

It is important to note that any discussion of children and their safety is incomplete without including parents. A parent's attitude, whether as a scaffolder or a gatekeeper, plays a huge role in how children access services on the internet. Social media platforms like YouTube have launched options such as parent-supervised experiences, in which tweens and teens use a supervised Google Account, currently in beta testing. It is imperative to discuss the interventions proposed by platforms, identify the gaps within them, and address those gaps.

At CyberPeace Foundation, our commitment to ensuring peace in cyberspace is echoed in our various initiatives and finds synergy with the Internet Governance Forum's issue area of Trust, Security and Stability for the forum's Annual Meeting. In our session at IGF 2021 on Ensuring a Safe Digital Space, CyberPeace Foundation will discuss the larger implications of technology on children and their autonomy in an interactive roundtable with experts. Existing research has identified gaps in the algorithms platforms use to flag inappropriate content, making it important to understand how these algorithms and their filters work. It is equally important to understand children's usage and viewing patterns, the role digital technologies play in their development and learning, and the types of content they predominantly consume. Parental supervision also depends on the child's age group and the type of content they are likely to be exposed to. This roundtable discussion will assist stakeholders in designing holistic and actionable interventions to ensure the safety of children in digital spaces.

Expected Outcomes

We will conduct this roundtable with experts to examine the interventions and policies adopted by social media platforms to ensure the safety of children and to identify the gaps therein. Based on the recommendations from the discussion, we will prepare a report to share with the Government of India and other stakeholders.

The panellists will have forty minutes to respond to questions prepared by the team at CPF. The remaining twenty minutes will be reserved for audience questions.

Online Participation

Usage of IGF Official Tool.