IGF 2022 WS #183 Digital Wellbeing of Youth: Self-generated sexualised content

Tuesday, 29th November, 2022 (12:35 UTC) - Tuesday, 29th November, 2022 (14:05 UTC)

Organizer 1: Civil Society, Western European and Others Group (WEOG)
Organizer 2: Civil Society, Western European and Others Group (WEOG)
Organizer 3: Civil Society, Western European and Others Group (WEOG)

Speaker 1: Chloe Setter, Civil Society, Western European and Others Group (WEOG)

Speaker 2: Gioia Scappucci, Intergovernmental Organization, Western European and Others Group (WEOG)

Speaker 3: Hazel Bitaña, Civil Society, Asia-Pacific Group

Speaker 4: Sonia Livingstone, Civil Society, Western European and Others Group (WEOG)

Speaker 5: Tebogo Kopane, Technical Community, African Group

Speaker 6: Stella Anne Teoh Ming Hui, Civil Society, Asia-Pacific Group


Round Table - Circle - 90 Min

Policy Question(s)

1. Which answers do governmental strategies, national policies and legislation around the world provide for young people's online behaviour, and how can states/governments better support children in exercising their rights, in particular in accordance with Art. 64, 81–82 and 118 of General Comment No. 25?

2. How can Internet Governance support a common approach across different political systems and cultural backgrounds?

3. How can young people themselves help design appropriate approaches, and how can platform providers support these approaches in the design of their services?

Connection with previous Messages: The session will advance the debate on Message 3 (Economic and Social Inclusion and Human Rights) as follows:

• Stakeholders have a joint responsibility in ensuring that digital transformation processes are diverse, inclusive, democratic and sustainable. Commitment and strong leadership from public institutions need to be complemented with accountability and responsibility on the part of private actors.

• Agile regulatory frameworks – at the national, regional and, where possible, global levels – need to be put in place to outline rules, responsibilities and boundaries for how public and private actors behave in the digital space.

• A suggestion was made for states to consider transposing the UN Committee on the Rights of the Child (UNCRC) General Comment 25 (GC25) on children's rights in the digital environment into national regulation and legislation, and to ensure compliance. Another suggestion was for the UNCRC itself to tailor recommendations to individual countries during dialogue and review processes related to GC25.

The session will further advance the debate on Message 8 (Trust, Security, and Stability) as follows:

• Women and girls are disproportionately victimised online and find it difficult to obtain support. Governments need to harmonise legislation to protect victims of non-consensual intimate image abuse, and ensure easy access to redress. Network and platform policies need to accommodate a spectrum of global cultures. Peer support networks for girls who are victims of online gender-based violence, such as Safer Internet Centers, must be strengthened, while digital literacy should be improved through school curricula and start from a young age, before children venture online.



Targets: Since the selection tool in the form above allows selecting only one SDG, we have listed all the SDG targets we will address here:

3. Good Health and Well-Being: 3.7

4. Quality Education: 4.1, 4.2, 4.3, 4.4, 4.5, 4.6, 4.7, 4.a, 4.b, 4.c

5. Gender Equality: 5.1, 5.3, 5.6, 5.c

16. Peace, Justice and Strong Institutions: 16.10

The session will address the issue of child self-generated sexualised content. It is therefore directly connected to the health of young people. Since education is key to the resilience of children and youth, various sub-targets of SDG 4 are in the focus of the session. Gender-specific sexualised violence will be addressed in the session, and we will discuss the role of justice and the respective legislation.


Research provides evidence of a growing volume of sexualised content generated by children and youth. This is seen as worrying by both educators and legislators. Law enforcement agencies are also concerned, since they often have to deal with such material instead of working on leads to combat CSAM. While General Comment No. 25 on children's rights in relation to the digital environment (Art. 118) asserts that young people should not be penalised for the consensual exchange of such content, governmental strategies and legislation vary across the world. The session will review these differences and address the issue of self-generated sexualised content from a child's rights perspective. How can children's right to grow up safely, responsibly and in a self-determined way in a digital environment be ensured, and what is the perspective of young people themselves in this debate? These questions will be the focus of the session.

Expected Outcomes

The session’s outcomes will build on the new report presented by the CoE Lanzarote Committee in Rome on 8 April 2022 (https://www.coe.int/en/web/cm/news/-/asset_publisher/hwwluK1RCEJo/conte…) and its recommendations to governments to improve their legal frameworks. These recommendations will be put in the context of legislation from other parts of the world and compared with the perceptions of young people themselves and with the approaches platforms have put in place so far. Desiderata for research on particular issues regarding young people’s right to develop their identity and gain confidence in their sexual orientation will be identified as outcomes of the session. Most importantly, the session will produce recommendations to stakeholders in Internet Governance on how to address the legislative patchwork around self-generated sexualised content at the global level.

Hybrid Format: We will facilitate interaction between onsite and online speakers and attendees by making the CoE report mentioned above and short input statements (max. 150 words) from the speakers available to participants in advance via the IGF programme website, providing a common ground to start from. To ensure the best possible experience for online and onsite participants, only short verbal statements by speakers, accompanied by at most one slide, will be allowed during the session. To engage people beyond the usual IGF attendees, we will set up local/regional hubs where people from the respective communities will gather online and bring their perspectives into the debate. This will make it possible to gain deep insight into the positions of a wide range of stakeholders from various backgrounds. The debates in these hubs will be based on the same resources (the CoE report and the speakers' statements) as the debate in the room. We expect high interest in the topic and are therefore confident that people around the world are ready to bring their own perspectives to the digital roundtable. The discussion will be facilitated by an experienced moderator, who will encourage speakers and participants to keep their input brief, adhere to strict time management, and encourage online and onsite participants to take the floor. This will be supported by at least one additional moderator handling input from the local and regional hubs and from the chat as an additional channel for participation.

Online Participation


Usage of IGF Official Tool.


Key Takeaways

Since legislation usually relies on consent to differentiate images of abuse and sexual violence from normal adolescent behaviour, a common definition of what "consensual" means is necessary, taking cultural differences into account.

General Comment 25 on the rights of children in the digital environment provides a framework for addressing the issue of sexualised content that needs to be translated into national legislation and transnational measures.

Call to Action

In order to address these issues properly, consider the wording used for self-generated sexualised content, the definition of "consensual", and the wording used for sexual abuse, sexual exploitation and sexualised violence.

Make the voices of young people heard in all matters that affect them and give the views of the child due weight in accordance with the age and maturity of the child. Take into account that sexual orientation and the formation of one's own sexual identity are developmental tasks of adolescence.

Session Report

The first step in the workshop was to define the term self-generated sexualised content. To that end, Sonia Livingstone (Professor of Social Psychology, Department of Media and Communications, London School of Economics and Political Science) differentiated three situations in which self-generated sexual content can be produced and the implications each has for youth and the law. Self-generated sexual content can be produced in:

  1. An exploitative situation with a remote adult. This includes, for example, extortion or pressuring the young person into sending sexual material of themselves. General Comment No. 25 emphasises the importance of safeguarding, protecting and rehabilitating the victim and criminalising the abuser. It also highlights the platforms' responsibility, as well as regulation for both prevention and redress.
  2. An exploitative situation in which the perpetrator is also a child. In this case, restorative and non-criminal measures for the perpetrator are encouraged where possible.
  3. A fully consensual situation between children. Here, a non-punitive approach based on evolving capacities should be taken.

In all these cases, states and businesses bear responsibility for any sharing of such images. Prompt and effective takedown is vital so that children who have been subjected to abuse are supported and, knowing the images are no longer available, spared re-traumatisation.

Children say that the digital environment is critical to their capacities to develop and explore their identities, both as individuals and as members of communities. They understand, nonetheless, that the digital environment is strongly connected to the offline environment. Thus, in addressing the risks of sexual abuse and exploitation online, children recommend not only measures within the online space but also actions that transcend digital boundaries. Many local languages are poorly represented online, which is why Hazel Bitaña (Child Rights Coalition Asia) also emphasised the need to make reporting sexual abuse easy for children to understand and possible in their local languages.

In order to address the question of which answers legislation provides, Gioia Scappucci (Executive Secretary to the Lanzarote Committee, Council of Europe) summarised the new monitoring report adopted by the Council of Europe's Lanzarote Committee in March 2022, which aims to address the challenges raised by the significant increase in, and exploitation of, child self-generated sexual images and videos. The report covers 43 European state parties to the Lanzarote Convention and highlights ways to improve their legal frameworks, prevent this particular form of sexual exploitation of children, investigate and prosecute it, and enhance victims' identification and protection. The report shows that only 11 of the 43 countries specifically address self-generated material in their legislation, and even those do not distinguish between consensual and non-consensual material. The Committee strongly recommends that children not be prosecuted for possessing or sharing self-generated sexual images and/or videos of themselves when the possession/sharing of such material is voluntary and intended only for their own private use. The report calls for measures to assist child victims of sexual exploitation and abuse, in the short and long term, in their physical and psycho-social recovery. It also calls for abandoning the terminology "child pornography" in favour of "child abuse material".

Martin Bregenzer (klicksafe Germany) explained that since last year the distribution, acquisition and possession of sexual pictures of minors has been a criminal offence in Germany, and the penalties have been increased accordingly. On the one hand, this is a major achievement in the fight against child sexual abuse. At the same time, the legislation creates significant hurdles for consensual sexting by young people, so that teenagers are in many cases committing a crime when sexting. Since the new law came into effect, policy makers have noticed that it could backfire, so there will probably be a revision of the law in the future. He also pointed out that consensual sexting between young people can be seen as a regular and healthy part of sexuality.

Tebogo Kopane (YouthxPolicyMakers Ambassador) emphasised the role of young people as active agents and participants, but noted that there is a culture of silence in much of Africa, leading to very few open conversations about the sexual abuse of children with caregivers, teachers, parents, etc. This shows that a common approach to children's protection needs to be flexible enough to be adapted to different cultural and political contexts. She said that a space for open discussion, questions and education has to be created. Many sensitive questions are asked online instead of being put to parents, so ensuring high-quality content as well as children's digital literacy is necessary.

The project Love Matters was mentioned by the audience. It runs regional sex-education websites that use young, pleasure-positive language, which attracts more young people: https://www.rnw.org/?s=love+matters

Considering further national policies and transnational strategies, Chloe Setter (Head of Policy, WeProtect Global Alliance) pointed out that more children have internet access nowadays, that they go online at a younger age and use new chat platforms, and that offenders are learning more, all of which makes the risk of abuse much higher. However, sexual abuse online is not inevitable; it is a preventable problem.

The speakers also discussed the question of how Internet Governance can support a common approach across different political systems and cultural backgrounds. The need for a common cross-cultural definition of consent that takes children from different backgrounds and situations into account was highlighted, and at the same time identified as a challenge. There is no simple solution to these complex problems. To strike the right balance between privacy and data protection on the one hand and child protection on the other, the cooperation of all stakeholders is crucial to create safe, child-appropriate and empowering spaces.

At the end, the speakers and audience members discussed how to involve young people directly, and many ideas were mentioned, such as creating a children's domain (.kids) as a safe space for children. Ensuring that children from different backgrounds and situations get a space in the decision-making process was highlighted: not all children have supportive parents who can help them with everything, so different perspectives need to be considered.

Number of participants: 69 overall; 26 onsite (12 female, 14 male) and 43 online (20 female, 8 male, 15 not specified).