IGF 2020 - Day 3 - DC Lessons learned from the Pandemic: child rights and safety

The following are the outputs of the real-time captioning taken during the virtual Fifteenth Annual Meeting of the Internet Governance Forum (IGF), from 2 to 17 November 2020. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 



>> MODERATOR: Good morning, good afternoon, good evening, wherever you are. 

     My name is Marie-Laure Lemineur.  I will be the moderator of the session.  I work for an organization called ECPAT International, based in Bangkok, Thailand, where I am right now. 

     Welcome to our session and thank you for joining.  When I checked roughly an hour ago, we had 83 registered participants.  So I was very pleased to see that we had so many participants.  Thank you once again for joining us today. 

     We'll start with a little bit of housekeeping.  We have four speakers today, and all of them will be making a presentation.  Once we are done with the presentations, we will open the floor for questions and comments.  The overall duration of the session is 90 minutes, so I'm hoping that we will have more than 30, perhaps 40 minutes for the questions. 

     And then we will discuss a little bit towards the end of the session the key takeaways from the discussion.  And we will be ready to wrap up.

     So before I give the floor to the first speaker, I would like to tell you a little bit more about the background of each one of our speakers.  We'll start with the speaking order. 

     I will start with the Baroness Beeban Kidron.  You can see all the speakers on the screen, I believe that their videos are on. 

     So Baroness Beeban Kidron is a British filmmaker and an advocate for children's rights in the digital world.  In the House of Lords, Baroness Kidron sits as a crossbench peer.  She sat on the House of Lords Democracy and Digital Technologies Committee and the Lords Communications Select Committee.

     She is also the founder and chair of the 5Rights Foundation, which is a charity that works to create policy and practical solutions to build the digital world children and young people deserve.

     Baroness Kidron is also a member of the UN Broadband Commission for Sustainable Development, the Global Council on Extended Intelligence, and the Child Rights Policy Guidance Group.  So you can see that Baroness Kidron has very wide experience, and her views and insights on the topic will be much valued. 

     Then we have Dr. Amanda Third, who is a Professorial Research Fellow in the Institute for Culture and Society at Western Sydney University.  Dr. Third is also the co-director of the Young and Resilient Research Centre.  She is also a research stream lead in the Centre for Resilience and Inclusive Societies and a faculty associate at the Berkman Klein Center for Internet and Society at Harvard University in the U.S.A. 

     And Professor Third is an international expert in child-centered and participatory research.  Her work investigates children's technology practices, focusing on marginalized groups and rights-based approaches. 

     Professor Third has led child-centered projects to understand children's experiences of the digital age in over 70 countries, working with partners across the corporate, government, and not-for-profit sectors.  And, of course, with children themselves.  She has authored many publications, the most recent being "Young People in Digital Society" in 2019.

     And she is also currently co-authoring the UN Committee on the Rights of the Child general comment on children's rights in the digital environment. 

     And we have our third speaker.  In the program and the session description, you can see the name of Nina Pirk.  Unfortunately, Nina informed us yesterday that she is sick, and we were sorry to hear that. 

     And Ms. Jutta Croll agreed to join us.  She was the Managing Director of the Digital Opportunities Foundation from 2002 to 2014.  And from 2014 until 2016 Jutta was the Managing Director of the German Center for Child Protection on the Internet. 

     Jutta has been a long-time child rights advocate, focusing on the digital rights of children.  Jutta is a member of the IGF MAG and a co-founder of the Dynamic Coalition on Child Online Safety, which is a co-organizer of this session.  And finally, Jutta is a member of several steering boards and committees at the European and national levels.  And she works very closely with the Council of Europe.

     Last, but not least, Cathal Delaney is currently working as the head of a team at the European Cybercrime Center, also known as EC3, which was set up, I believe, in 2013. 

     The center is a body of the European Union headquartered in The Hague in the Netherlands.  The center coordinates cross-border law enforcement activities against all forms of cybercrime, including online child sexual exploitation.  And, of course, the center acts as a center of technical expertise in the matter in this region. 

     So without further ado, I will give the floor to Beeban for the first introductory remarks.  Thank you very much.  Over to you, Beeban.

     >> BEEBAN KIDRON: Thank you so much.  Thank you, colleagues, and hello to everyone who is listening. 

     I will start in a curious way, with what happened to me sometime in April when I got a phone call from a teacher at a school where the children were remote learning.  And she had had an incident where a very, very young child had posted and shared some child sexual abuse material. 

     And she was beside herself.  And even though it was lockdown, we agreed, and we met in the school playground outside.  And over a cup of tea and a lot of tears, she explained that in addition to the particular incident she was beside herself because she was supposed to be teaching children remotely but she couldn't trust the technology that she was using. 

     And she said, on the one hand, these children need to learn, they need to be in touch with each other, they need to be in touch with us.  But we are putting them in an environment where almost always we feel that we are pulling them into a world in which they are not ready, and it is not fit for them. 

     I was very moved by this teacher and asked her to keep in touch.  And since then, over the last few months, my inbox has been absolutely full of messages from teachers and parents who have found out that I have been collating this information.  And some of it is very extreme, some of it is very minor. 

     But the sense of anxiety building up in the teachers, pulled between a child's right to education and their duty and wish to keep those children safe, is tearing them apart. 

     Now I want to say that very specifically because these teachers were all in state schools.  And around the same time, I was approached by one of the biggest private school networks globally in many, many countries across the world, nearly 100 schools.  And the CEO of that network rang me up and said we are very proud of ourselves.  Within three weeks, we are now teaching 95% of our lessons online and our kids are in school every day in all corners of the world. 

     And I said, this is an absolute triumph, but what did you do about safeguarding?  And he went pale.  Nothing.  He said, what should I do? 

     So those are the two things that have been with me through this pandemic.  And I know that during this session we're going to talk about lots of very different views on this subject.

     But for me, I think that the pandemic brought into clear focus this terrible normal where we are asking children to be in places that are simply not fit for them.  And we are asking people who care for children to make decisions for which there is no good outcome. 

     Now 1.7 million children do not have access to a laptop of their own for education.  So we're seeing a divide there as well.  They are sharing laptops, with material and settings that are simply not fit for purpose.

     And teachers lost contact with children.  Of course we're going into lockdown again tomorrow, but this time we're keeping children in school.  Before we came out of the first lockdown, teachers had lost contact with nearly 40% of children. 

     The government has brought in an emergency law making remote learning compulsory in UK schools, so any child not in school during the next wave of lockdown must be given access to remote learning by the school.  But they have done nothing, nothing, to make that access safe.  That is the picture for a child in the UK. 

     At the same time, I looked at the statistics around a particular product here which many of you will know.  It is called OnlyFans.  And I was looking at it because, as many of you know, we are bringing in legislation in the UK, the online harms legislation.  And I was looking at it because the government keeps on telling me that small companies must be exempt from this legislation. 

     And OnlyFans has fewer than 50 employees, but it has 50 million subscribers.  And it has about 700,000 content creators.  And for all of those people who don't really know what it is, it sells access to naked photographs and sexual content. 

     And in April 2020, earlier this year, it was found that on a single day a third of Twitter profiles globally advertising nudes for sale, or similar tags, appeared to belong to underage users.

     And many of those were using the British firm OnlyFans, which would, I believe, be out of scope of the proposed legislation.

     And I find myself thinking that what we are creating, and what the pandemic has really brought into sharp relief, is perhaps the thing that we who do this every day already know, but that the general public, families, and even policymakers have not understood: that we are putting children in the center of a very toxic environment. 

     That actually we can no longer look at this issue of child sexual abuse and grooming and the most extreme forms of sexual violence against children as being a thing that bad actors do to individual children.  But we have to start looking at what normal we create and look for the levers to make it a less toxic environment. 

     And in a world in which it is normal for a child to make some pocket money by selling a nude picture of themselves, that actually creates an environment in which it is also normal for a child to share it on the school's remote learning platform, and it's also then normal to pick up admiration and friends through that process. 

     So really in my work here, I wanted to sort of raise our eyes for a moment and say the levers of change that we must look for have to be systemic.  And they have to deal with some of the normals that children are finding themselves living in and the pressure of actually joining in with those new normals as well as some of the more extreme cases that we are hearing. 

     Now I wanted to also touch on a couple of other issues.  Because in the questions for the session, I was glad to see it asked what kinds of harm children are suffering, extending even the range of harms we consider. 

     And I think that there are two or three things that I have been witnessing in this time from a UK context, which I imagine are relevant elsewhere, and which I have found particularly concerning. 

     The first one is about misinformation.  And I think we are all aware specifically of the huge effect of health misinformation and what it actually means.  And I still have in my head a newspaper headline about a young man in America whose dying words were, I guess it wasn't a hoax. 

     And I think that the depth of the misinformation around health has been very profound for us all. 

     But what caught my eye was a report from earlier in the summer, from the Centers for Disease Control, that said those people with most access to conspiracy theories are the least likely to wash their hands and socially distance.  Well, that makes a lot of sense. 

     And that came out the same week as a report on children's media access, which found, of course, that children get most of their information online. 

     And I put these two things together and felt a wave of upset that the information diet we were giving children was so poor, when access to information is something that is theirs by right.  And I felt that their part in the misinformation story had not been properly told. 

     And I think that you then have to extrapolate from that misinformation to its very close relation, anti-vax misinformation, which has resulted in many, many millions of children getting measles unnecessarily.  You know, they are the victims of this misinformation. 

     And also what we are seeing here as an increase in radicalization.  And through the summer, again, a huge increase in hate speech and specifically racially-based hate speech in the UK. 

     And so I felt that, amongst the harms that children are experiencing now, in this world where they don't have the checkpoint of a broader society and where they spend a great deal of time inside and online, this world of misinformation that is being created should be central to our concerns, and we must not see it as something that is happening solely to the adult population.

     And then finally, I just wanted to mention the question of domestic violence.  And this has been a huge problem throughout the pandemic here in the UK.

     Not only are the strategies that people traditionally use to get away from their oppressor unavailable, but also, with the tension of the perpetrator being indoors, or losing their job, or being anxious, those rates go up.  So it's a sort of double whammy of abuse and inability to get out. 

     And we have seen huge numbers of children calling help lines for the first time about violence to themselves or, indeed, in the family context.

     And I think that this is something where there are some very good remote services, there are help lines that have been absolutely fundamental to their well-being and to either getting them out or supporting them through those situations.

     And I think that is possibly where I would like to end: to say that, for all of my criticism of the environment created by the technology that children are using, we could so easily flip it to create a technology that would bring children their rights, help them flourish, and keep them safe.  And in the end the battle is really about responsibility, about the business model, about legislation, and about mandating a lowest bar of behavior that would actually protect children through these sorts of issues.  Thank you.

     >> MODERATOR: Thank you very much, Beeban.  Very interesting. 

     You touched upon a wide range of aspects that I think we will go back to when we open the floor.  But your remarks, I think, align very well with what I believe Amanda is planning to say. 

     So over to you, Amanda.  You can build on and expand the remarks we just heard.  Thank you very much.

     >> AMANDA THIRD: Thank you so much.  And thank you very much for organizing this panel. And thanks for the wonderful opening comments. 

     Very early in the pandemic, it was clear that in comparison to other populations such as the elderly, children were not particularly vulnerable to contracting COVID-19. 

     Even so, this crisis has profoundly impacted children around the world.  In just one sign that the impacts on children are significant, our children's help lines in high income countries have reported increased -- (microphone muted).

     >> MODERATOR: Amanda, you were muted suddenly.  I don't know why.

     >> AMANDA THIRD: I just got a message that the host muted me.

     >> MODERATOR: Can you go backwards like a couple of seconds.

     >> AMANDA THIRD:  So I was just saying that the crisis has profoundly impacted children. 

     And one sign of this is that help lines are experiencing a 25% increase in the volume of children accessing their services.  So internationally the pandemic has registered as a crisis.

     So I just want to begin with two notes on how we might think about crisis in relation to children, their rights, and technology.

     Firstly, the word crisis comes from the Greek.  But in the original meaning the emphasis is not so much on the cataclysmic event but on the process of taking the necessary decisions to deal with the cataclysmic event. 

     Crisis demands that we envision the change that we want to see and go after it aggressively.  I'm shortly going to suggest that the time is now to center children’s rights as we take decisions about the important role that technology has to play in navigating the crises. 

     Secondly, COVID-19 will be a formative experience for this generation of children.  But it is unlikely to be the only major crisis that shapes their sense of the world and their sense of safety.  For Australian children, the pandemic was immediately preceded by the national emergency of the devastating bush fires. 

     For children in the Pacific, the pandemic was followed by vicious cyclones.  And in Cambodia and Viet Nam, catastrophic floods are currently compounding the effects of the pandemic. 

     In short, as we make decisions, we must be very careful not to constrain our thinking to individual crises, though they might grab our attention.

     We must use this moment to think expansively about how to address the intersecting crises that shape children's lives both now and in the future, be it those that confront us starkly, or those that are burning more slowly such as the climate crisis.  Because if children’s rights are challenged now, this is not a temporary situation. 

     Throughout the pandemic there has been a strong emphasis on managing the medical dimensions of the pandemic and its response.  However, for children, as for adults, the pandemic has been more than a health crisis.  It has been and continues to be a crisis in children's social worlds. 

     Emerging research shows that for many children, the pandemic constituted an unprecedented disruption to their everyday routines, their relationships, their sense of security, certainty, and fundamental safety. 

     The confidence that comes from predictable routines or the reliable comfort of strong and stable parents and relatives was suddenly thrown into disarray.

     So my team's research to develop child-centered indicators for violence prevention online and offline in the Philippines shows that children's concept of safety depends very much on the stability of their immediate relationships.  Or we might say that safety is social and relational, so children's safety might already be said to have been compromised by the mere fact of the scale of disruption in their social networks. 

     Compounding this, and among other factors, children have also been exposed to streams of information of varying quality via social media and video sharing platforms, as Beeban was just indicating, but also via the fragments of news or nervous conversations between parents and carers that they overheard.

     It is no wonder then that children are reporting heightened distress.  But recent research such as that on the screen at the moment, shows that the possibility that children or their family members might contract the virus is very low down on children's list of immediate concerns.

     Rather, they report a heightened sense of worry, stress, feeling trapped, frustration, anger, sadness, loss and grief.  And of not knowing how to deal with these often new and powerful emotions.  They feel disconnected from their peer networks, which for many is a vital source for their well-being and their resilience. 

     They struggled to adapt to online education.  And much of the transition to online learning took place on the fly and as such wasn't always underpinned by good pedagogy, again as Beeban has just talked about.

     And family tensions were caused by being in close quarters with the same people day in and day out, often exacerbated by the financial pressures on families that faced job losses, unstable work, business closures, and so on. 

     Through this period of momentous disruption, technology has played a more and more prominent role in the lives of many children.  And, of course, we have to qualify this statement with the very careful acknowledgement that not all children have regular and reliable access to technology.

     And this raises a very important set of issues about how such children can enact their rights in the context of a pandemic that's unfolding in the digital age. 

     There is a very real possibility that crises such as the present one will further marginalize those that experience digital exclusion.  But for those children with access, while there are concerns that rapid increases in children's screen time during the pandemic have exposed them to increased risks of harm -- and I will say more about this a moment -- technology has also provided a crucial point of continuity and connection for children. 

     It has enabled them to maintain connection with peers, to stay engaged at some level with their schooling and it provides a very important means of relaxation and entertainment.

     But many of the technology-based strategies that we have put in place to sustain life under the pandemic have been assembled quickly without attention to the risks of harm or how to maximize the full potential of technology to support a sense of safety and well-being in this time of disruption. 

     Now, more than ever, despite the urgent issues that we confront, it is important that we pause and step carefully.  What then of children's rights? How should we think about children's rights in the context of this pandemic and other crises? 

     While children's right to physical and mental health is obviously most affected negatively and positively by technology during a pandemic, we must remember that there is much more at stake than their right to health.

     The pandemic unfolding in a digital age has significant implications for children's rights to information, education, privacy, identity, rest, leisure, play, and adequate standard of living, protection from forms of physical and mental abuse and exploitation and much more.

     It also differentially impacts the rights of children living with disability, adoptees, refugees, and those in institutional care or situations of vulnerability. 

     So as the global community responds, we have to remember the indivisibility of children's rights.  As the Convention stipulates, and as the team led by the 5Rights Foundation has insisted while drafting the UNCRC general comment on children's rights in the digital environment, a document that we hope will guide our thinking in relation to child rights, technology, and the pandemic, there is no hierarchy of children's rights. 

     They must be progressively realized in combination.  So we must resist the temptation to think only about children's right to health.

     With this in mind, I want to briefly reflect on some of the ways that children's rights are impacted by technology in this time of crisis and what this means for their safety. 

     So we know that children don't distinguish the online and offline in the ways that adults often do, but rather they move flexibly across online and offline spaces and they often interact, learn, and participate both in offline and online platforms simultaneously.

     Nonetheless, the majority of children primarily use the internet and digital technology to support and sustain face-to-face interactions.

     Children have emphasized that during the pandemic, while technologically mediated relationships have been important to them, they can't replace their face-to-face interactions.  And I think it's useful here to remember that under the conditions of the pandemic our communication has had to be very intentional rather than incidental.  There is no bumping into people and having a casual chat.  You have to reach out and communicate.

     So children are saying that when their engagement with other people in their world moves solely online, it is not the case that the technological simply substitutes for face-to-face interaction when physical contact becomes impossible. 

     Being unable to interact with their peers challenges children's well-being.  Indeed, to the extent that the pandemic impacts children's right to health, it is their mental health, a form of safety whose importance is often underestimated, that is perhaps of most concern.

     A major challenge for us is: how do we augment existing platforms and technology-based services to support children's mental health, well-being, and their right to safety? 

     We know that the risks of harm to children online are likely to have increased during the pandemic.  Research shows that increased time online does, indeed, augment the likelihood that children will encounter risks of harm.  And while we also know that more time online enables children to develop their digital literacies and their capacities to manage these risks, arguably they have not had the opportunities to do so with the right structures of support and guidance around them given that they have been largely reliant on family members who themselves are in crisis. 

     Purely by the fact of spending more time online, the possibility that children are exposed to forms of digitally mediated harm has increased.  We know that for those children who were already vulnerable to forms of violence, the risks are accentuated.  For the vast majority of children, the consequence is that they spend increased amounts of time at home, and not all children's homes are safe spaces that support and sustain them.

     The domestic violence community, for example, is deeply concerned about rising rates of intimate partner violence and the effects that this is having on children.  Furthermore, the usual structures of visibility and face-to-face accountability that regulate the behaviors of perpetrators of violence are currently compromised.  We need to think creatively, in earnest and across sectors, about how we use technology to identify and counter the violence that is taking place behind closed doors. 

     And we need to think in even more creative ways about how to leverage technology to support child victims of violence, whether it is offline, online or both.  If there is a glimmer of hope, it is that these issues which the child protection community has known about, are finally more visible in public and policy debate. 

     There is cause, I think, for cautious optimism that this pandemic provides an opportunity to advance children's right to protection from very serious forms of violence.

     One effect I think of this pandemic has been to challenge most people's sense of privacy.  Living in close quarters is very challenging.  But, of course, again, by virtue of intensified time online, children are evermore exposed to the not always explicit or well explained data collection practices of technology companies with potentially significant implications for their right to privacy. 

     These incursions on privacy have only been compounded by contact tracing and other surveillance technologies that are implemented to protect populations.  But if surveillance of the general population has increased, the privacy rights of those children who live in abusive families are severely compromised.

     Again, as Beeban alluded to, technology in that context potentially amplifies the mechanisms of control that are available to perpetrators of child abuse and other forms of exploitation and violence.  While the way forward is anything but clear, we can't shy away from these difficult questions.

     Crises, and this is especially true of a crisis such as the COVID-19 pandemic, tend to throw us into the immediacy of the present and obscure the past and the future.

     Across cultures, children are seen to represent the future.  To center the needs of children as we address crises then would be to shift us beyond the sole focus on the present moment of disruption and to bring the future into play.

     So thinking about children, centering children is a vital tool in counteracting the tendency for short-termism.

     If children are seen to represent the future, they also represent our collective hopes for a better world.  Indeed, alongside their fears, children are also articulating a range of positive experiences as a consequence of the pandemic. 

     So far the consultations have shown that amidst the challenges children have appreciated the opportunity to spend more time with family outside the usual routine. They found strength in their connection with those family members, and they have learned to appreciate the small things.  And many also have enjoyed, even relished being online.

     In short, if technology can undermine children's rights, it is also clear that it has the ability to sustain children's rights in a time of crisis.  And yet we have still to imagine all the ways that technology can be harnessed to do so. 

     As we proceed, it is important that we don't lose sight of the ways that this moment might enable us to invent new purposes for technology which enable children and their communities to navigate crises more effectively, but also technologies that can support us to work towards a world in which children and the adults in their lives can lead more equitable and fulfilling lives. 

     There is no time like now for thinking about how to leverage technology for children's rights.  The pandemic has shown us unequivocally that the task is both urgent and necessary.  Thank you.

     >> MODERATOR: Thank you so much, Amanda.

     It is interesting, and I think it's a good complement to the remarks that Beeban made at the very beginning.  And I took note of some key comments that you made, and I would like to go back to them when we open the floor.

     Before I pass on the baton to Jutta, I would like to remind all of the participants that there is a Q&A section where they can write down their questions for the speakers when we are done with the presentation.  So far we have four questions. 

     So please feel free to write down the questions and we will go back to them.

     So, Jutta, over to you and then we will finish with Cathal.  Thank you very much.  Unmute myself.

     >> JUTTA CROLL: Thank you very much.  I have shared my screen now.  Can you see my slides?  I see someone nodding so that is fine.

     >> MODERATOR: Sorry, can you make it full screen?  So that we can see the detail.  There you are.  Thank you.

     >> JUTTA CROLL: Wonderful.  So my name is Jutta Croll.  I'm heading the project Children's Rights and Child Protection in the Digital World for four years now.  And I'm happy to take over for my colleague Nina Pirk from the German help line.

     We have been working within the project with help lines from Germany in regard to their compliance with the GDPR, while at the same time respecting the rights of the child. 

     And the German help line (speaking in non-English). 

     There are several channels: children can reach out to counseling given by young people, and they can reach out to counseling given by adults. 

     So looking at the figures, you can see that in all three parts of the work they are doing, they had a huge increase during the pandemic.  These are statistics up until April, covering the time that children and families have been in lockdown.  There was especially an increase in the online counseling for children, but also in the parents' telephone hotline, where contacts increased by 54% from March to April. 

     Having a deeper look into the details, the helpline remained open the whole time.  And they even extended the hours when children can reach out: during April, plus 10 hours per week for the child help line and the chat, and plus 30 hours per week for the parents' helpline, to address all of the calls and messages they received.

     The details of the reports they had to address showed somewhat different data from what we heard from Beeban before.  They had expected more contacts about domestic violence, about grooming and potentially harmful content, but that increase was not as large as expected.

     But they did have an increase in reports about violence -- physical, emotional, and also sexual violence.  Children also had a lot of reports or just questions about love and relationships online and about cyber bullying.  So it was not only harm; there were also positive experiences they had questions about.

     Questions about gaming, data protection, and excessive use were less relevant for the children, but they came in from the parents.

     I have here some examples from Nina of questions that parents put to the help line: Where can I find suitable content for children?  What might be the possible risks if my children are online for more and more hours?  And how can I talk to my children about these risks?

     Many parents felt overwhelmed with the situation and they were frustrated and had feelings like: Am I a bad parent?  What shall I do?  But there were also very serious issues, like: My son sent some nude pictures to someone he met online and now he is bullied; how can I help him to be protected?

     And there were issues that are not only related to the pandemic but that also came up during this time of crisis, like: My 13-year-old daughter is having an online relationship with an adult; how can I cope with that situation?

     Things that children reported were somewhat different.  They felt they were behind with their tasks for school and their parents weren't able to help them, so they were afraid of the future.  In other cases parents took away the phone, so they were not able to stay in contact with their peers and felt angry but also disappointed about the situation.

     And then there were children whose parents were still working, not in lockdown, so the children felt alone at home, were scared and sad, and couldn't do anything about it.

     And there were also children who lived in families where the parents had problems with each other and there were fights, so they were afraid of the situation.  So it's a broad picture of different issues the help line had to deal with.  It was not in all ways what they had expected to happen during lockdown and the pandemic. 

     And we even got reports from another help line who said that they got more reports about physical and sexual abuse that had happened before the pandemic.  So during the pandemic, when these children were not in physical contact with the offender, they felt more confident about reporting to a help line to seek counseling.

     So they somehow opened up in that situation, when they were not in direct contact -- for example, in the sports club or even at school or kindergarten -- with the offender, often someone they had felt cared about them. 

     So we have different developments in that situation.  And I do think we are still in a situation where we have insufficient means to deal with it.  Help lines are overwhelmed; they have a lot of reports, and parents are overwhelmed.  We still need to develop strategies to ensure that children can benefit from the rights that belong to them, that they can exercise their rights while at the same time being safe.

     Thank you so much for listening.  And I'm ready to take your questions.  Thank you.

     >> MODERATOR: Thank you very much, Jutta.  Very interesting remarks. 

     So we started with Beeban and Amanda, who painted the broad picture across a wide range of children's rights.  Now we have the perspective of a frontline service provider, a help line that is attending to children's calls.  And you shared some interesting data that we can comment on in a few minutes.

     And now we will also have the perspective of another frontline service provider, which is a law enforcement agency.  And then we can have very interesting discussions around some of your comments and some of the data you are sharing.  Over to you, Cathal.  Thank you very much.

     >> CATHAL DELANEY: Thank you very much.  Let me share my screen. 

     Okay.  So my name is Cathal Delaney, as Marie-Laure introduced me at the beginning.  I'm Team Director at Europol, dealing with child exploitation and abuse, online and offline related crimes.

     So what I wanted to pick up on from the other speakers and from the presentations made so far is that law enforcement is not always viewed in terms of what we do in relation to children's rights.  But in this area in particular, we are very much focused on the child and their rights and their status as a victim when crimes are committed against them, particularly in relation to sexual exploitation and abuse. 

     We see our role as being inherently protective of those rights and trying to enforce them -- their right to protection and their right to privacy.  These are very much central to what we are trying to do in law enforcement when we are pursuing those who sexually exploit and abuse children online and offline, and to try to protect those children from that abuse and from it continuing into the future as well.

     The way that we look at this at Europol -- Europol is an agency which is intended to support law enforcement throughout Europe in their work in different crime areas where two or more Member States are affected by the crime, the victims, or the matters that need to be investigated. 

     And the way that we look at doing this is through these four different steps -- four different areas.  I won't call them steps because they don't necessarily follow one another.

     You have, most importantly, collaboration.  So the collaboration that we have with our law enforcement partners, with the public and the private sector, with nongovernmental organizations, and with the different stakeholders who are involved in the different crime areas -- in our area in particular, in relation to children and their sexual exploitation.

     And following from that we have the operational response, which is obviously a huge part of the law enforcement response: to actually do something about what we are being told is happening and take action in relation to it.  And our role at Europol is to support the Member States' law enforcement agencies in taking those actions and trying to ensure with them that the best possible result comes from those actions. 

     Then we have the strategic response, which is looking at the picture of what is going on and how we can use that picture to better protect children, and also to contribute to the decisions that are being made by policy makers and decision makers, both at the Member State and the EU level, to inform them about how things can improve and what steps need to be taken to do that.

     And then prevention and education, which is one of the very few areas where Europol reaches out directly to the public.  That involves providing different materials, advice, and ways for the public to protect themselves, protect their children, and become more educated about what is happening in the area.

     So in terms of operational response: we, as I said, are very focused on the children, which means that we focus first on the identification of the victim.  In any investigation, the first question we ask is: where is the victim and how can we help them?

     And we sometimes ascertain that through different conventional methods of investigation with the partners, with the agencies that have asked for our support.  And we have a victim identification function at Europol that we have built over the last number of years.  It is part of the team that drives the victim identification task forces which take place every year; one is taking place at the moment.

     And we use the information we gather from that victim identification process and our investigation process -- and when I say we, I mean all of the investigators involved in this -- to try to determine where a child is, so that we can enable them to be reached by the national law enforcement agency responsible in that jurisdiction.

     Doing that also involves intelligence analysis and technical support: intelligence analysis of the information that we store in our databases about the offenders and their activities and so on, and technical support on the ground, on the spot, or at Europol headquarters, with the technical expertise and knowledge that our specialists have in this particular area. 

     And the result of that during the lockdown period was that we were able to support remotely several operations.  One example here was a case in Italy where our colleagues in Australia discovered material online that they believed was recent abuse of a child.  And through the examination of that material and the cooperation of several different jurisdictions that we coordinated, several different law enforcement agencies in Europe, it was possible to discover the particular region in Italy where we believe that the abuse was taking place.

     And then the colleagues in Italy were able to inform their local investigators there about the circumstances of the case and the information that they had related to the suspect.  And ultimately that led to the identification of the suspect and the victim -- a young girl, a child who had been abused by him -- within 10 days of those videos being posted online, which is quite a significant achievement in those particularly difficult circumstances.

     And another example was in Spain, where similarly the national agency was able to conduct a search remotely, based on information that they had previously received and information that we had added to, again to rescue a child -- in that case a boy -- and to discover the identity of his abuser as well and take him before the courts to be dealt with according to justice.

     And then we also had a case in France where we actually supported on the ground with technical and intelligence support, leading to the arrest of a suspect there and the safety of two children that he had been abusing himself. 

     So each of those is just an illustration of the extent of the efforts that we will go to in order to ensure that, for children who are being abused and whose abuse has been posted online, we do everything we possibly can to support the law enforcement agencies in Europe and beyond in ensuring that the information gets to where it needs to be acted upon and that children are made safe as a result.  And this is just some additional information in relation to the Italian case that I spoke about already.

     The information that comes from those cases is added to the information that we already have at Europol.  And we have the video analysis system which is for victim identification.  And in the system at the moment we have more than 50 million images and videos that are unique.  And these are not multiple copies.  These are unique images and videos of child sexual exploitation abuse. 

     And despite the efforts that we have made over the last number of years with our partners, through vehicles like the victim identification task force which we hold at least once a year, sometimes twice, we have managed to examine just 20% of that material. 

     And that is to give you an idea of the scale of the problem that we face in relation to the amount of material that is out there.  And I know that networks have previously given statistics suggesting that as many images and videos, and more, appear online every year as well.

     And the difficulty, of course, for any child whose abuse has been recorded and distributed is that there is every possibility that they will at some stage be fearful of being recognized on the street by somebody who has viewed material in which they were exploited and abused.

     And survivor surveys carried out by the Canadian Centre for Child Protection and other collaborators have established that this is a very real fear for victims.  And you can see how damaging and difficult that would be for them as they go through their lives.

     And what we do at Europol is that we have a very strict privacy and data protection regime, and within our own team that regime is very strong.  Because we recognize the importance of the information that we receive -- not only as criminal intelligence, but as material in which children have been abused -- we want to do our very best to ensure that the abuse material is not exposed any further than it absolutely needs to be and is only seen by those who need to see it, and that there is no possibility for those children to be re-victimized in the context of the investigations that we support and the ways that we support them.

     So, looking then at the strategic response in relation to what we saw during the pandemic, we produced three public reports during that time.  The first one was Catching the Virus, at the beginning of April, just within four weeks of the lockdown beginning.  Then we had Beyond the Pandemic at the end of April.  And then we had a dedicated report in relation to child sexual exploitation and abuse, Exploiting Isolation, which was published on the 19th of June this year. 

     In each of those reports, what we wanted to do as a team was to emphasize what impact the crisis seemed to be having in relation to children, and how both the public and decision makers could be properly educated and steps identified for them to try to mitigate the damage being done.

     So in the first report, we looked at what the situation appeared to be, with information that we got from our law enforcement partners and from various other sources, and we saw that there did seem to be a problem.

     I won't reiterate what Baroness Kidron and Dr. Third have already said about the harms that seemed likely to emerge during the pandemic in relation to the children themselves -- and which, as Dr. Third said, have become basically impossible to verify.

     In the second report, we looked at what the future situation was likely to be.  And in the dedicated report, we looked at a whole range of data sources to try to establish what the full picture was and how it could be mitigated through the actions of law enforcement, policy and decision makers, and the public themselves.

     And what we did with that information was produce prevention and education advice for the public -- obviously for the different levels of the public: the carers, parents, and the children themselves.  We worked in collaboration with the eSafety Commissioner's office in Australia, producing global online safety advice for parents and carers in the European context and adapting what they had already produced. 

     We produced videos and information for parents and children at the different levels, to inform them of what could be done to help them not be vulnerable in this situation.

     And going back to what has been said already about the difficulty of potentially damaging children's privacy during this time: the right to privacy for those children who have access to the internet very much depends on having access to this type of information, where parents can be educated about what they need to do in engaging with their children -- conversing with them and discussing what it is that a parent is trying to do in protecting them.  That they are not trying to take away their rights but rather to supplement and strengthen them, by telling the child what they are trying to do and including them in the conversation. 

     And by doing that, to hopefully motivate the child to see this as something that they also have a part in, a role in, and that they need to understand and take seriously in order to protect their own rights in the future.

     At the end of the day, it is very much the case that, whatever way we look at this, there is no way that law enforcement can solve this problem on its own.  And we don't want to solve it on our own.  We want to solve it in collaboration with partners, whether those partners are in the public sector, in the private sector, non-governmental organizations, or bodies like the Internet Governance Forum.

     We want to be working with you to solve this problem, because it is only by having all of us working on it together that we can be effective and produce a result which, at the end of the day, will protect the rights of children -- whether to health, to protection, or to privacy.  And that is something that we as a law enforcement agency at Europol are very focused on, and that we as a community of investigators focused on protecting children and their rights want to achieve as well.

     Thank you very much for the invitation and for your attention as well.

     >> MODERATOR: Thank you very much, Cathal.  Once again, very interesting.  And thank you for bringing in the perspective of the victims. 

     And perhaps for those who may not be familiar with this field of work: starting your presentation by mentioning that the work you do as an agency is victim-focused and victim-centered is something that couldn't be taken for granted for all law enforcement agencies, especially 15-20 years ago. 

     This has been very positive progress made by major law enforcement agencies at the national, regional, and international level that we have witnessed, and we thank you for that.  

     And also thank you for ending on a positive note, mentioning the fact that this issue is really about collaboration.  In thinking of responses and solutions, we need to think collectively, and I think this panel is actually a good example of that. 

     Because we are bringing in expertise across sectors -- from academia, from NGOs, from law enforcement -- so that we can all put our brains together to look at the problem and the solutions.

     So I would like to go back to some questions that were spelled out in the Q&A section that are actually linked to those points.  They are -- I mean they are all very interesting. 

     I will start perhaps with the one on PhotoDNA, because it is linked to the topic of volume that Cathal was referring to just now.

     And John Carr is asking if you are aware of what is happening at the European Parliament in relation to the use of a proactive tool called PhotoDNA, and potentially its ban and the ban of other proactive tools to detect grooming and other illegal content.

     And John is saying that the pandemic has reminded us that those tools are very important.  Can we have your thoughts on that?  I don't know, Cathal, or some of the other speakers -- I don't want to put you on the spot.  If not Cathal, other speakers.  Yes, Jutta?

     >> JUTTA CROLL: Thank you for giving me the floor. 

     I do think this is also related to the other question that we got in the question and answer section from Maimuna Jeng -- I hope I pronounced that correctly -- who is asking whether practices like using PhotoDNA to detect child sexual abuse material might also be a kind of censoring of the sexual expression of people on the internet, especially feminists, he or she writes. 

     And I do think it is very important to explain from the start that we are not talking about censorship in any way.  We are talking about fighting child sexual abuse material with the means of tools like PhotoDNA.

     So that is not about suppressing freedom of expression.  On the contrary, it is about ensuring children's rights -- not only to safety; they also have the right to freedom of expression.  And we need to ensure that children can exercise that right in a safe environment.  So I don't think we can achieve anything if we are only debating whether it is censorship or not.  We need technical tools, and I'm pretty sure that Cathal can go into more detail on how helpful these tools are. 

     As far as I know -- and I'm not a technician -- PhotoDNA has nothing to do with mass surveillance.  It is not surveilling or monitoring the interaction of people.  It is discovering known files and images.  But you can explain it better, I think.
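     A minimal sketch of the hash-matching idea Jutta describes -- not PhotoDNA itself, which uses a proprietary perceptual hash robust to resizing and re-encoding, whereas this sketch uses an exact cryptographic hash.  The hash list and file contents here are hypothetical:

```python
import hashlib

# Hypothetical list of hash values of known, already-verified abuse
# images (in reality such lists are maintained by bodies like NCMEC).
KNOWN_HASHES = {
    # SHA-256 of the bytes b"test", standing in for a known file.
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def matches_known_material(data: bytes) -> bool:
    """Return True if this exact file already appears in the hash list.

    Only the file's hash is compared against the list; the content of
    unknown files is never read by a human, which is why hash matching
    is different from monitoring people's communications.
    """
    digest = hashlib.sha256(data).hexdigest()
    return digest in KNOWN_HASHES

print(matches_known_material(b"test"))         # True: hash is on the list
print(matches_known_material(b"holiday.jpg"))  # False: unknown file, ignored
```

Because a cryptographic hash changes completely if a single byte changes, real systems use perceptual hashes so that near-duplicates of a known image still match; the privacy property is the same either way.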

     >> CATHAL DELANEY: I will take that as well.

     I have to be a little careful because at the end of the day Europol is an EU agency, so we're not going to comment on any political developments or anything like that.  And John knows that as well.

     And so let me talk a little bit about the technical issue that Jutta brought up.

     So the purpose of these types of tools is that they detect material relating to child sexual exploitation and abuse on platforms where it is being distributed.  As Jutta said, they are not being used to suppress freedom of speech but rather for a very particular, defined purpose: to detect this material and to inform law enforcement about it.  That information reaches the Member States' law enforcement -- 18 of them through Europol -- via referrals made under the mandatory reporting regime in the U.S., through the National Center for Missing and Exploited Children there.  And those numbers have been increasing over the last couple of years to a significant extent.

     And in relation to the legislation that was proposed by the Commission earlier this year -- Europol obviously applauds that this legislation was produced, and respects the role of the European Parliament and Council institutions in deciding how that legislation is to be implemented. 

     So, yeah -- I am not defending the legislation.  What I understand is that what it seeks to do is to maintain the status quo and not go beyond the existing situation: to close a gap between legislation which comes into force on the 20th of December this year and other legislation that they hope to introduce next year, in order to close that gap in the law that would otherwise exist. 

     That is our position on it, and the technical aspect that I think needs to be understood about it and about the other technologies that are being used to do the same thing.

     >> MODERATOR: Thank you very much.  Beeban, you raised your hand.  You wanted to add something about this topic?

     >> BEEBAN KIDRON: I think -- first of all, before I speak, I want to thank Cathal and all of the colleagues at Europol for what they do on the frontline and we really recognize that that is different from those of us who talk about policy, and thank you very much.

     But I think I can make a political point, just like you can't.  And I would like to make a slightly broader political point. 

     I'm disappointed that policy makers have sort of been backed into a corner, framing this as freedom of speech versus protection from child sexual abuse material and the other protections that we see.

     I just want to make this analogy.  If a little bit of poison got into the milk supply and one person in every 10,000 got sick over breakfast, we wouldn't be sitting there saying: on the one hand, freedom, and on the other hand, protection.  We simply don't have the right framing in this virtual world.

     And the framing that I would like to see policy makers take -- because I accept some of the arguments about the tool itself -- is to do it this way and say: if you cannot run your very profitable business without carrying, you know, an industrial level of child sexual abuse material, then you are not fit to trade.

     So I see it as a business practice issue.  Then the solutions will get a lot more attention and will have to be much more effective.  Unless you can do that, it is not fit for purpose.

     And again, I think I said it at the end of my opening remarks: this technology is marvelous.  Look at what we are doing -- we are in different parts of the world all speaking together.  But the truth of the matter is we have got to have a much more zero-tolerance kind of attitude to this and say it is not fit for purpose.  So what are you going to do about it?

     And maybe new tools, and maybe these tools, and some restrictions on their own services.  But it is not about freedom of speech.  That is not an unqualified freedom that trumps all other freedoms, and these are private companies. 

     They may be the public highway -- that is something we allowed -- but they are private companies, and they are not fit to trade if this is the trade that they enable.

     >> MODERATOR: I would like to jump in and link what you just said, Beeban, to a question that Sonia Livingstone is asking about solutions.  And Sonia is saying that the panel has raised a very troubling set of problems facing -- sorry, I missed -- oh, God, I lost the question.  Where is it?

     My apologies.  A set of problems facing children during the pandemic.  So to your point, Beeban, what immediate actions can you call for from government or from industry?  Can you think of something very specific?  You were talking about the industry, perhaps.

     >> BEEBAN KIDRON: I think we have to be careful, because the short-term things we call for are just not the whole solution, and that is always a very difficult place to be.  I think, you know, I would like to see rigorous application of their own age restrictions.

     Right now, you know, that doesn't solve the whole problem, but it does mean that 8-year-olds aren't using services that they just don't have the maturity to deal with.  And I think that there are certain parts of their design they should disable immediately.  And I have talked to some of them about this.

     And maybe I could just stop on this exact point, which is: why is it routine and normalized that strange adults are introduced to children as friend requests?  In what world does that make sense, when at the front line, you know, colleagues are trying to deal with grooming and child sexual abuse?

     And actually I had a recent conversation with a senior safeguarding police officer in one of the nations, and he went back and looked at past cases.  And he actually started seeing how kids were being introduced, you know, through these apparently benign things.  This is just a commercial thing for their own network effect.

     So we have got to, you know -- it should not be that a child is offered friend requests from adult strangers, period.  They don't need it to make the system work.

     So I think my long-term thing is that we have to start doing impact assessments on services.  We have to see where the risks are and take out the risk, because the risk is a harm that hasn't happened yet.  And that would mean our colleagues who are dealing with harm at the other end would have a whole lot less to do if we made it a little bit more difficult and less sticky in the first place. 

     I actually think we need wholesale regulation; I think we need algorithmic oversight.  And I think the other big distinction we have to draw here is between individual pieces of content and enabling them to spread.

     These are two different processes.  One is a bad actor, and one is a business model.  Allowing and enabling that spread is also criminal in my view.  So there are things -- and I don't want to take all of the time -- but there are literally a whole host of things. 

     And in terms of 5Rights, the organization I am speaking about, we have a very good publication on the website setting out priorities for the Online Harms Bill, which doesn't cover all of this but covers a few pieces of it.

     >> MODERATOR: Thank you.  So we're talking here about algorithmic amplification, right?

     >> BEEBAN KIDRON: Yeah.  I mean, if you think of me sitting in this room alone, whatever I put up, only those who try to come and find it will see it.  If you deliver it and spread it around the world, therein lies the problem. 

     However bad it is, if it is isolated in one place, that is different.  And this is something that I don't think policy makers are quite yet attuned to.  A lot of the problem we have here, whether it is misinformation or child sexual exploitation, is about spread, is about network effect, and actually is about automating the bad instead of automating the good.

     >> MODERATOR: Very well.  Automating the bad. 

     So to move on because we have another two questions which I think are very relevant.  One is about the dark web.  And again, perhaps that should go to Cathal. 

     And another one is from Abhilash about self-generated images and regulation.  The one from Sivasubramanian Muthusamy is asking: what is the proportion of these crimes on the dark web compared to relatively normal internet activity in the internet space? 

     Or how do those from the dark web connect to and interact with children?  I think we can apply this question to the COVID-19 pandemic. 

     >> CATHAL DELANEY: We gave an example of one of the forums within the dark net.  Those forums -- a couple of them, actually -- were particularly concerned with what they called capping, which is capturing video streams that are produced by children for various different reasons. 

     And so it can be that they are produced because the child is being sexually extorted or coerced.  It can be that they are exchanging material with a peer, or that they believe they are exchanging with a peer but it is actually an adult.  Or it can be that, unfortunately, they are broadcasting their own abuse online in order to get likes and to gain popularity and status and so on, which to me is rather tragic, but it is the reality.

     And what we saw was that there was an increase in material being posted in relation to capping.  And there was an additional forum created in relation to this in order to enable it.  Both of them had a significant following in terms of numbers, and both were on the dark net. 

     And so to us it was an indication of where their interests lay at that particular time, and that they saw this as a fruitful and useful way to gather, exchange, and distribute data among themselves and make it available to one another, in order to be able to have access to it and to essentially reabuse these children. 

     The material comes from different sources in the various ways that I described, but most comes from the clear net, because the interaction between offenders and victims on the dark net is very low -- that is not the way the dark net is built. 

     It is built to ensure that people are anonymized and that their identity is not available.  And making connections takes more effort there than it does on the clear net. 

     What they will do is they will source this material from the clear net and then post it on the dark net.  Or they will begin the relationships that they want to in an open forum, and then they will move it to a private forum, which is something that we are all very well aware of as a modus operandi.  And then within that private forum they will then abuse the children themselves. 

     Occasionally it has been the case that we have seen where groups have formed on the dark net in order to do this type of abuse, particularly the capping and so on.  So the proportion of offending directly against children which is taking place on the dark net is very low.  But what the dark net does is it amplifies, by an order of magnitude, the actual abuse and re-victimization of those children through the exchange of that material among those who are frequenting that space. 

     And it also enables them to amplify the idea among themselves that this type of abuse is okay, that it's an acceptable thing for them within their society to do and that this is the right place to do it because it protects them from being detected.

     >> MODERATOR: Thank you very much, Cathal.  We'll move on.  We have six minutes left, so we will take one last question before we wrap up. 

     It is from Abhilash Nair, who is asking an interesting question about how we regulate self-generated materials, and the tension with laws criminalizing child sexual abuse material that were drafted with the objective of protecting children from predatory adults. 

     But now the data is suggesting that there is a rise in self-generated images taken by young children themselves.  And if we apply those laws, then we will criminalize the children's behavior.  But at the same time we need to deter the behaviors. 

     So Abhilash is wondering what the panel thinks, how do we find a balance?  It is a tough question.  Jutta, you want to answer that?

     >> JUTTA CROLL: Yes, only a quick answer.  I do believe we need to differentiate between children who are forced or even blackmailed to take these images and children who do it deliberately. 

     I would say it is somehow at a certain age exploring your own sexual orientation and your personal development, it might also be part of that process to take images.  But then we come to the question that Beeban raised earlier.  It's the difference between taking an image and spreading the image. 

     And we need to address the issue that of course children have to develop their own personality; the right to have their own identity and sexual orientation is part of that process.  But what about the images being spread all over the internet, where they can also be misused by other people? 

     And so we need also to talk about how we can stop the spreading of these self-generated images.  I would not talk about criminalizing children by law for doing -- for taking images, but it might be a criminal act to spread that around the internet.  Thank you.

     >> MODERATOR: Thank you very much.  I think it is a very clear answer.  Do you want to add to that on the particular point, anyone?  Yes, Beeban, please go ahead.

     >> BEEBAN KIDRON: Just to underline the fact that where the law has been developed without this concept in mind, that children might self-generate these images, I think we have to decriminalize children, so we have to take action. 

     So I would agree 100% with Jutta on this point that there may be some times -- some places where we actually need to make an intervention in existing law because it didn't imagine this scenario.

     And very, very few children should be criminalized for this sort of activity.  I mean only in very, very extreme cases -- you know, not the social sharing.  And there is a problem, and I know there is a problem even in our primary schools, which means kids under the age of 11 are being put on the sex offenders register.  That just shouldn't happen.

     >> MODERATOR: I think it is a topic that deserves a session on its own. 

     And I will take the opportunity to refer the participants to a very nice campaign, a video that is on the Europol website that, off the top of my head, is called Say No.  If I'm not wrong, it is in all of the European languages. 

     For those not from Europe, you will find English, Spanish, French and other languages that you might need.  But it is very well done and speaks for itself. 

     And I do use it personally to trigger discussions around the risky behavior of self-generating images, and solutions.  And, of course, there are other materials out there.  You can find them on the Australian eSafety Commissioner website and those of other organizations.

     Yes, Jutta?  Sorry, I thought you were raising your hand.  We are wrapping up the session, and we have one minute to go. 

     And I would like to thank the speakers for their very valuable presentations, ideas, insights and there are certainly key takeaways that we can, you know, take with us.

     And I'm tasked with doing a report, so I have plenty of notes, and we recorded the session.  You have said so many interesting things, very applicable to our respective fields of work, that I would like to thank you from the bottom of my heart.  And I also acknowledge that some of you had to wake up very early this morning to connect, but it is all for the cause, as Cathal said at the very beginning. 

     Before I end, I would like to make a call to the participants, because the IGF this year is initiating a call for voluntary commitments.  It is asking all of the participants and the speakers at this year's virtual IGF to make a pledge to move forward the goals of the Internet Governance Forum and the recommendations from the Digital Cooperation Roadmap. 

     I invite you, from whatever sector you belong to, to go to the website and look for the web form where you can make a commitment, so that within the next year, until IGF 2021, you commit to take forward one of the actions and the goals of the Internet Governance Forum. 

     With that, we can end the session today, because I suppose we will be cut off if I don't stop now.  I thank all of you, and I wish you a very good day if you are starting the day, and a good evening if you are ending the day.  Thank you very much to all of the participants.  I hope you enjoyed the session.  Bye-bye.  Thank you.

     >> JUTTA CROLL: Bye-bye.  Thank you for your excellent moderation.  Bye.

     >> MODERATOR: Thank you.