IGF 2021 – Day 1 – DCCOS: Regulate or Prevent to Protect Children – A False Dichotomy?

The following are the outputs of the captioning taken during an IGF virtual intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

      (Video playing)

   >> We all live in a digital world. We all want to trust.

   >> And be trusted.

   >> We all despise control.

   >> GROUP: We are all united.

      (End of video)

   >> JUTTA CROLL: Hello, everybody.

    Thank you so much for the introductory film.

    I hand it over to Amy Crocker to welcome all our speakers and participants.

   >> AMY CROCKER: Thank you, Jutta.

    Good afternoon, good morning, and good evening to the participants joining from other time zones.

    My name is Amy Crocker, and on behalf of the Dynamic Coalition on Children's Rights in the Digital Environment and the organization I represent, ECPAT International, I'm happy to welcome you to this Internet Governance Forum 2021 session.

    We would all like to be together in Katowice, but we've experienced effective online meetings.

    The theme is Internet United. So the Internet is bringing us together today, even with difficult circumstances around the world.

    So welcome.

    A quick note on the run of show, this is a roundtable discussion. We have several speakers lined up ready to frame this discussion, give their views. We'll also be posing questions to them and also inviting you to engage. And we really welcome that.

    So please feel free to use the chat to ask questions or make comments and raise your hand if you would like to speak.

    I believe the session is being recorded. There's a transcript, and that will be available afterwards.

    After the session, we'll submit some key outcomes and a call to action to the IGF. Later in the month, we'll submit the full report of the session.

    We'll also have a few polling questions. Let's hope the technology works with us. I will share some instructions in the chat.

    I will introduce the session and then hand it back to the moderator, Jutta Croll.

    Why we're here today: DCCOS – Regulate or Prevent to Protect Children – A False Dichotomy?

    An overarching question for this session is: How do we balance or align the principles of regulation of digital environments and prevention strategies to protect children?

    And in this discussion, we seek to explore the key IGF 2021 theme of Emerging Regulation with subthemes of content moderation, human rights compliance, data governance and trust locally and globally.

    Why is the theme so important in general and so important for us as a Dynamic Coalition? It's estimated that one in three Internet users worldwide is a child. This makes children a key stakeholder in all issues of Internet governance. However, their voices are too often not heard, and their rights are often not prioritized.

    In this session, we're focusing particularly on online violence and exploitation, including the victims of violence and exploitation and their right to privacy.

    We have experts on this call. General Comment No. 25 on children's rights in relation to the digital environment, adopted in March 2021, affirmed that the rights of children online are equal to those offline. It's important to point out that the relationship between what happens to children online and offline is an important part of the discussion we'll be having today.

    For the Dynamic Coalition, the primary importance of children's rights is also grounded in key points of legislation that are guiding us. One is Article 3 of the UN Convention on the Rights of the Child.

    And the second, at the EU level, is (?), and it is a primary consideration in all actions taken by public authorities and private institutions.

    You know, while few people oppose the fundamental principle of children's rights or protection, including the right to protection from violence, there are different perspectives being brought to the topic of how best to approach this challenge. This is driving, and being driven by, a broad set of questions currently being raised about how to manage our digital world.

    At the core of the digital transformation being seen in many countries and regions is the question of how much to regulate digital online services and platform providers. We see this in pieces of legislation being tabled around the world to address harmful and illegal content online -- think of the Digital Services Act in the EU, the Online Harms Bill in the UK and the Online Safety Act in Australia, just to mention three -- marking a shift away from self-regulated environments. It's complicated. It's clear there's no single thing that can protect children from violence online. I think it's widely understood and accepted that we need a broad range of conditions, including but not limited to strong and clear legal frameworks addressing harmful and illegal activity, the development and deployment of innovative technology for prevention, and support for children, online workers and people that are (?) of violence themselves.

    Moving beyond binary or exclusive positions: this is the nuance and compromise needed to identify sustainable cross-sector solutions to dangers to children in real-life and online environments.

    Advocates of regulation may argue that technology companies have failed to self-regulate effectively at scale and globally. They point out that technologies to address online violence are proven to work but are not being taken up and deployed sufficiently widely or with appropriate levels of transparency and accountability.

    Proponents of a public health approach may argue that too much focus has been placed on regulation and technological interventions at the cost of investment in prevention strategies that tackle these social problems at the root.

    When considering these two perspectives and the many others that surround them and fall between them, we also need to be asking ourselves some key questions. You know, what is our tolerance of risk affecting children online, and what is our threshold for harm?

    What is our attitude to online activities, and what is needed for the safety of children and other vulnerable groups?

    I wanted to point to research carried out in partnership across eight UN member states. This was a survey on public attitudes towards online privacy and child protection. We found some really interesting findings that were consistent across the eight countries where we polled: 73% of adults believe children cannot go online without running into adults who want to harm them. Seven out of 10 believe there's not much privacy online anyway.

    76% were willing to give up some of their personal privacy to allow for tools to detect images of child sexual abuse. There's support for the EU to launch new legislation (?) by private companies.

      (Reading document)

   >> AMY CROCKER: Without compromising children's rights either online or offline, this Dynamic Coalition seeks to listen to different perspectives, identify some common ground, and try to consider a broad, sustainable, nuanced approach for children in digital environments.

    So on that note, I would like to hand back to our moderator, Jutta Croll. She'll introduce the speakers, and then we'll have questions.

    Thank you.

   >> JUTTA CROLL: Thank you, Amy, for your introduction and preparing the ground for the debate that we'll have now in the next 80 minutes or so.

    I would like to introduce today's speakers to you, and I would like to start with Sonia Livingstone. She's a professor at the London School of Economics and, I believe, also a founding member of the Dynamic Coalition. We were given reason to rename the coalition and make it more appropriate to what we are facing now, more than 30 years since the UN Convention on the Rights of the Child came into effect.

    I need to say that Sonia has to head to another session 40 minutes from now, so we will give her the floor first.

    Let me name the other speakers in this session.

    We have Patrick Burton. He is executive director of the Centre for Justice and Crime Prevention in South Africa and has experience at both research and policy level on (?). Welcome, Patrick, to our panel.

    Then we have Thiago Tavares. He was there when we started the coalition. He has a bachelor's in law and administration and a degree in management and social development. He's with SaferNet Brazil and told us before the session started that it's also a risky position to hold these days in Brazil. Maybe you can go into that further on.

    Then we have Michael Tunks of the Internet Watch Foundation in London, which has long been working, I think since the early days of the Internet, to make the Internet a safer place and to prevent child sexual abuse.

    We have Andreas Hautz, my colleague from Germany, from Jugendschutz.net, the German child protection organization. He is part of the unit for international work there, coordinating the international project (?).

    Thanks to all of you for joining this session. We have set times for your statements.

    We want to have enough time left to discuss further with participants in the room.

    Sonia, please, you have the floor.

   >> SONIA LIVINGSTONE: Thank you. You sent questions to prompt me. They did, indeed, prompt me.

    I will focus on this. I do think they're interesting, and they do, perhaps, put a finger on some tricky issues for us.

    One you asked was: Do we need to separate the issues of addressing illegal content and addressing harmful content and conduct?

    This is currently a contentious mess in Britain and probably everywhere else with regulation in the works.

    It prompted me to think that in our field, perhaps uniquely, we have some truly problematic contestants, or combatants. It seems very peculiar to many of us, I know, to advocate for child safety and child rights and find ourselves up against people who advocate for democracy and free expression. In various forms, this kind of debate has proved incredibly problematic. I see it every day on Twitter, in the latest rows, where those advocating for child safety and child rights are taken to have a kind of maliciousness.

    I can only think that one tactic is to keep a separation of what is legal and what is harmful. It's problematic for us because, as John Carr and I discussed many years ago, legal and harmful are hard to discuss in the abstract. What is illegal is illegal for everyone, of course, but perfectly legal content can certainly cause harm to children. So we can go around in a tongue-twister, but it's not beneficial to do that.

    And the next one you sent me, Jutta, was about the role of technology companies and prevention. This has been a year not only of the General Comment but of an almighty row over encryption, where we've found ourselves bizarrely advocating for child safety and child rights against those who advocate for privacy.

    We find ourselves in some very unholy wars. And I think it's time for this ‑‑ I really hope this group can come up with some solutions that reposition what it is we're trying to advocate.

    So should we have seen the encryption debate coming and got skilled up in advance of it? That's an example of the kind of question I might ask us. On that question of anticipating and preventing, I, like many here, have thrown myself behind the by-design movement, in the hope that we can successfully encourage companies to implement safety by design, security by design, ethics by design -- I would say child rights by design.

    The question is: What can that bring? Is that the kind of movement that could be effected not necessarily by heavy-handed regulation, against which companies will lobby, but through other means: through standards bodies, through the training of young digital professionals, through trade associations and so forth?

    I don't know the answer, but I would love to hear people's thoughts here.

    And I will stop.

    Thank you.

   >> JUTTA CROLL: Thank you so much, Sonia, for your short intervention and for sticking to the time.

    I would like to hand over to Amy before we go to the other speakers because she has prepared two questions, also, for participants in this session, and we would like to get an impression of what they think about the questions we have posed.

    Amy, over to you.

   >> AMY CROCKER: Yes. I will just host in the chat one moment. Let's see if this works. It's always fun.

    Okay. So you will see the URL and the code to enter. If you would like to navigate there and enter that code, you should see a question.

    Are people getting it?

   >> JUTTA CROLL: Does it work?

   >> JOHN CARR: The questions appear to require us to think and type at the same time.

   >> AMY CROCKER: So this will take the remainder of the session.

    Great. So we have some results live in beautiful rainbow colors.

   >> JUTTA CROLL: We can't see it so far, Amy. You need to share.

   >> AMY CROCKER: I need to actually show you. Yeah, let's work out how to do that. Perhaps we'll go on to the ‑‑ okay. Sorry. I was trying to ‑‑ I think what I will do is ‑‑ if you would, refresh your screens. This is just a very simple question that you can type in words. It will create a cloud, just to get some ideas of what people ‑‑ you know, why people came to the session but, also, what are your thoughts on, you know, the protection of children. Do you have a particular thought that comes to mind? Then I will ‑‑

   >> JUTTA CROLL: It doesn't refresh, Amy.

   >> AMY CROCKER: It doesn't refresh? Interesting. I think it's refreshing now. One moment.

    Okay. That's okay. Let's focus on the first slide.

    So I think I will try and share my screen.

   >> JUTTA CROLL: Jennifer advised that probably you need to go to the next slide, and then it will refresh for us as well.

   >> AMY CROCKER: Yeah, I have actually done it. I think it's just not refreshing, is the problem. Thank you for your support.

    Okay. Let's not spend so much time with this.

   >> JUTTA CROLL: I would suggest we go on to the speakers. Oh, now we can see it.

   >> AMY CROCKER: This is a screen share. The second one is not working. We'll just do this one.

    What we have is a pretty equal mix. One option is information about the topic, which is great. I think it's really important that the more people get involved in thinking about this issue, the better.

    Agreement on some common ground and a roadmap for the future.

    I really think that ‑‑ you know, Sonia was just saying that there's actually a lot more common ground and consensus than the different sort of interest groups admit or make time to find. So I think that's really important that that's come out.

    And prioritization of prevention, that's interesting.

    So from the get‑go, we have a clear lead towards prevention as a crucial part of this conversation.

    So, great. I will hand it back to you, Jutta. If we can get the other one to work later, we'll try to do so.

   >> JUTTA CROLL: Okay. Thank you.

    With only 18% saying they expect a clear answer to the question we have put for this session -- whether regulation versus prevention is a false dichotomy -- I think that leaves great room for the speakers. I think we go in alphabetical order because I don't want to prioritize at that point.

    Then Patrick Burton is the first person to speak after we already heard Sonia.

    Patrick, please.

   >> PATRICK BURTON: Great. Thank you very much. I'm in panic mode because I had it in my head that we had six or seven minutes rather than two. So I will try to focus on one or two -- or eight, nine, 10 -- points in those two minutes.

    So, first, I never know whether to take these things as interventions or provocations. If it was a provocation, I would start off by saying that I'm very firmly of the view that regulation is simply politically expedient -- at the risk of angering all my colleagues. I see John shaking his head already.

    I don't think that's the case. I think it's an oversimplification. I think it's something we need to be very careful of.

    I do think we see an overemphasis on regulation, but I see regulation as one arrow in the quiver, one tool in the toolbox. If you're looking at bullying and other violence against children, (?) should lie.

    I started by saying that regulation might be considered politically expedient. It is because it shows that there are steps being taken by government, by those in positions of power and authority and responsibility, to protect children.

    These are steps that are often short-term or medium-term, and they address the incredibly visible, after-the-fact violence that occurs: the child sexual abuse material that gets circulated, the sexual content that is everywhere on the Internet and that children have constant access to.

    However, it isn't changing what is behind that. That is why I say it's essential, but it is also just one aspect of what needs to be done. I do really think it is easy to focus on regulation, because prevention is about behavior change, design change. It's about the way that children engage with technology and the Internet, and the way parents support them, and teachers in schools and everybody in the community encourage children as they live online.

    Focus on early years. Focus on early childhood development. Focus on parenting support. We know that's evidenced in preventing violence, including child sexual abuse and child sexual violence.

    What can we learn? Amy mentioned, and it's clear, that (?) have very clearly come up with this conceptual model and the intersection of violence models. Many people in this forum are exploring that. We know that evidence exists for different types of violence.

    So we need to be looking at how we learn from other sectors, and at what the role of technology and industry is in supporting those kinds of interventions, as much as it is in regulating platforms.

    Sonia mentioned safety by design and privacy by design. We create safe streets by having streetlights and cutting down grass. We make schools safe by making sure the girls' bathrooms are away from the boys' bathrooms, by how we design playgrounds, different aspects.

    But how do we support parents and caregivers in the early years? How do we start building these conversations into ECD, Early Childhood Development, programs?

    We have places where that's starting to work in Asia and the East Pacific.

    How do we not skew one at the expense of the other?

    Also, bear in mind that much of the narrative and messaging around regulation tends to generate a sense of disempowerment among children and caregivers, and that tends to lead to more restrictive practices that don't work in children's interests either. How do we get wide-ranging support for prevention interventions rather than just focusing on one or the other?

    I hope that made sense. That was my five pages of input in hopefully one page.

   >> JUTTA CROLL: Thank you so much.

    I know Sonia has to leave early. So I would like her to prepare a short answer after we've heard the other two speakers.

    Just know that Patrick has already referred to safety by design and privacy by design, but he has not referred to child rights by design, which I think is a wonderful idea.

    Now we go to Andreas Hautz, Jugendschutz.net, Europe‑EU, Civil Society.

   >> ANDREAS HAUTZ: I hope you can hear me well. I got a new webcam earlier today. I wasn't sure it would work.

    Hello, everyone. My name is Andreas Hautz, Jugendschutz.net, Europe‑EU, Civil Society. Thank you for the invitation. I'm happy to speak here.

    Let me start with brief notes on our organization. We are the joint federal and state center for the protection of minors on the Internet. Our mission is to ensure that children can safely participate and be protected in the digital world, which means we look at risks in services specifically directed at young users.

    Our starting point is the protection of minors in the media. There's an enormous risk to children in the online world: minors can be confronted with bullying, sexual harassment, and many other risks that violate their personal integrity. So what do we do about that?

    In my opinion, regulation and prevention go hand in hand. We pursue a take-down strategy by communicating with providers, hotlines, and law enforcement, which leads to quick removal in (?) percent of cases. But it's not enough to advocate removal. Providers have to act, because the point is not only to remove content but to create safe digital environments for children.

    So, on the other hand, we also urge providers to design the services differently. Providers have to consider safe settings in their safety‑by‑design concepts in the first place.

    Guidelines and terms of the service need to be understood by young users.

    In this context, we must also consider what needs to follow if a child has already been harmed. That means that reporting mechanisms must be easy to understand and use.

    Many of these claims are taken up in the new Youth Protection Act that came into force in Germany this year.

    One could conclude regulation is needed to engage the providers to act, but you could also conclude that it is a process of analyzing and learning.

    One thing is for sure: regulation can help to lead to prevention. It doesn't have to, but it should. (Chuckling)

    And that said, of course, you cannot regulate everything. Media literacy is important for children, and parents should be educated in this field. We have to find ways together to improve media literacy.

    There's something else I would like to mention: the tools for automatic detection.

    Hash matching, using tools such as PhotoDNA, can stop the cycle of abuse. In our opinion, providers have the responsibility to do so, but they must always think about the consequences for other human rights and the rights of the child, such as privacy.
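
     [Editor's illustration, not part of the spoken session: the hash matching Andreas describes compares a fingerprint of each uploaded image against a database of fingerprints of known abuse material. A minimal sketch follows, assuming a hypothetical hash set; real deployments use perceptual hashes such as Microsoft's PhotoDNA so that resized or re-encoded copies still match, not the exact cryptographic hash shown here.]

```python
import hashlib

# Hypothetical set of hex digests of known abuse images, standing in for
# the hash lists that hotlines distribute to providers.
KNOWN_HASHES = {
    hashlib.sha256(b"placeholder-known-image").hexdigest(),
}

def is_known_image(image_bytes: bytes) -> bool:
    # SHA-256 only catches byte-identical copies; production systems use
    # perceptual hashing so re-encoded copies still match. The lookup
    # logic, however, is the same: hash the upload, then test membership.
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

def handle_upload(image_bytes: bytes) -> str:
    # Matching items are blocked before publication and routed to a
    # report queue instead of ever appearing on the platform.
    if is_known_image(image_bytes):
        return "blocked_and_reported"
    return "accepted"
```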

    So, of course, we need to watch technologies closely. Again, it's always a process of learning.

    What is crucial in this process is transparency, I guess. We need to know exactly what can be done, what will be done, and what actually is done to prevent online harm.

    Of course, the balance between privacy and protection can be an issue here. So we must address that. By talking about it, we help people to understand tools that seemingly may violate their right to privacy but, in fact, don't.

    We also need to listen closely to concerns, even when they're not raised by those affected themselves. Dialogue is the key to everything here, regardless of whether we're talking about prevention or regulation.

    Regulation and prevention always have to go hand in hand.

    As regulation can never be perfect from the beginning, it has to be developed continuously. Prevention might be a result of that but can also come along with it.

    That's why organizations like ours evaluate prevention strategies and push providers to rethink them if they don't work properly.

    The job of civil society is to engage providers to improve their measures, and to get institutions and governments to evaluate and reflect on their own efforts.

    Thank you very much.

   >> JUTTA CROLL: Thank you, Andreas. You're a very quick speaker, but it's (?) to follow.

   >> ANDREAS HAUTZ: I tried to get it in time.

   >> JUTTA CROLL: Now we have Thiago Tavares.

   >> THIAGO TAVARES: Can you hear me?

   >> JUTTA CROLL: Yes.

   >> THIAGO TAVARES: Thank you, Jutta.

    Thank you, Amy.

    Thank you, distinguished panelists, for having me on this session.

    It's a great honor to have this dialogue with my long‑standing child safety friends and fellow colleagues.

    Latin America is one of the most dangerous places in the world. Social inequality, violence, and a lack of public policies to offer quality education for children are just some of our big and historical challenges.

    My country, Brazil, has more than 50,000 murders a year. Children are out of school. Children are not safe there. They never were, even before the Internet existed.

    I'm recalling that because sometimes we are missing those social and economic disparities when we build regulation proposals for the Internet, which are, in the end, (?) of our societies.

    Different societies propose different approaches and regulation proposals based on their values, priorities, and political interests at that particular moment. In Brazil nowadays, the federal government is proposing a terrible and asymmetric regulation to stop -- I will repeat -- to stop content moderation of hate speech, cyberbullying, harassment, neo-Nazi, and (?) content.

    Several child rights organizations in Brazil released a statement opposing such a bad regulation proposal and offering an approach based on transparency, accountability, content moderation at scale, quality assurance, and KPIs in local languages, such as audits, instead of the English-only training datasets that are available today for detecting harmful content and behavior at scale.

    This is going on. There's already a lot of discussion in our national Congress. Another issue is the need for safeguards, checks and balances on the policy options we have, and assessments of children's rights and human rights as a whole.

    We, as child safety experts, should not endorse policies that do not comport with the rule of law or fundamental rights such as privacy, data protection, and freedom of speech. There are operations targeting high-profile (?) and human rights defenders.

    SaferNet Brazil would like to commend President Biden. We also welcome the UN committee work on countering the use of information and communications technologies for criminal purposes. There are no easy answers, but working together in multistakeholder fashion, we can better advance these goals.

      (Audio is distorted)

   >> THIAGO TAVARES: I will finish here. These are my initial remarks. Thanks so much for having me. I look forward to positive debates.

   >> JUTTA CROLL: Thank you, Thiago. I think we'll come back to the question you raised earlier of whether regulation could have a counterproductive effect, stopping moderation by law.

    Michael, the floor is now yours, speaking for an organization with long-standing experience in dealing with child sexual abuse imagery and its removal from the Internet.

   >> MICHAEL TUNKS: I'm going to base my remarks largely around the IWF's experience, as we've been heavily involved in the development of the Online Safety Bill. Obviously, the bill places a duty of care on platforms that have user-to-user services, or on search providers, to ensure the safety of their users on their services.

    Principally, the bill tries to align the online world with the offline world, as we've said previously. What is illegal offline should be illegal online.

    It's a regulatory approach, in the same way that we have seat belts in cars these days and playgrounds have to take account of safety regulations.

    It's complex and takes time. We've seen the EU and Australia, just to name a few, as Amy outlined at the start, taking particular time and care over the proposals they're bringing forward.

    The UK has taken about five years to get to the introduction of an Online Safety Bill, since the pledge by the Conservative Party in 2015, and we're still at draft stage with that piece of legislation. So it's complex.

    In the context of regulation versus prevention, I would agree with the other speakers that this is about both regulation and prevention. We must prevent as well as regulate to improve the online world for children and young people.

    So what must regulation do?

    Well, at the IWF, we believe that regulation should build on best practice. For the last 25 years, the IWF has been removing large amounts of child sexual abuse material from the Internet. We have some of the fastest removal times anywhere in the world; as Andreas outlined earlier, hotlines remove content incredibly quickly. We want to see regulation built on that.

    The inquiry into child sexual abuse in the UK reflected that the IWF was a large part of the success story and why (?) hosted in the UK, and that's down to the partnership approach: working with industry, law enforcement, and the government.

    We think it should be built on best practice.

    Secondly, there's collaboration as well. The fact that electronic service providers have to report to the National Center for Missing and Exploited Children is critically important.

    What we may see through the EU proposals for a new center needs to complement that approach, not duplicate the effort and add more reports to the system. We must ensure that we are safeguarding children as a result of it.

    Thirdly, we think regulation should be principles-based rather than based on specific technology. Technology often outpaces regulation, and regulation needs to be responsive to changes.

    Finally, we think it's important -- and this is something other panelists and Sonia mentioned at the start as well -- to respect the rights of children and their right not to be harmed. We need to note the lessons.

    Finally, I just want to touch on the separation between illegal, harmful, and other gray areas as well. It's really, really important that we have clear legal standards and definitions in the legislation we create. As touched on earlier, this is going to be pretty crucial to the UK's Online Safety Bill.

    What we have seen in the area of child sexual abuse is that clear definitions and standards are very important to achieving consensus. Many companies we work with deploy our services on an international basis. Having a clear standard of what is or is not illegal helps them across the world in deploying some of the solutions.

    However, that does not mean to say that companies should not go further in terms of doing more. And we can all do more in order to protect children online.

    Just one example of what the IWF is putting into action is the world-first Report Remove tool, which allows children to report and remove images of themselves. We're also looking at what we can do for images that don't meet the threshold.

    Finally, just another tool that we're using: we've developed a taxonomy and tool that allows us to match images against UK, NCMEC, and Interpol (?) about what's behind an image so it can meet different classification standards. This will be really important, and an important tool for close collaboration across the EU next year as well.
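
     [Editor's illustration, not part of the spoken session: a taxonomy tool of the kind Michael mentions records one granular assessment per image and derives each jurisdiction's label from it. The sketch below uses hypothetical field names and deliberately simplified criteria; the real UK A/B/C categories and the Interpol baseline are defined in far more detail.]

```python
from dataclasses import dataclass

@dataclass
class Assessment:
    # Hypothetical granular record of what an analyst assessed in an image.
    penetrative: bool
    sexual_activity: bool
    age_band: str  # e.g. "under_10", "10_13", "14_17"

def uk_category(a: Assessment) -> str:
    # Simplified echo of the UK's A/B/C severity scale.
    if a.penetrative:
        return "A"
    if a.sexual_activity:
        return "B"
    return "C"

def meets_interpol_baseline(a: Assessment) -> bool:
    # Illustrative stand-in for the Interpol baseline criteria, which
    # cover only the most severe material.
    return a.penetrative or (a.sexual_activity and a.age_band == "under_10")

def classify(a: Assessment) -> dict:
    # One assessment, several jurisdictional labels derived consistently.
    return {"uk": uk_category(a), "interpol_baseline": meets_interpol_baseline(a)}
```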

    Thanks very much.

   >> JUTTA CROLL: Thank you, Michael. That was very helpful, indeed. I have some ideas about how we can take the debate further.

    First of all, I would like to invite Sonia to react to what Patrick Burton said at the beginning with regard to an overemphasis on regulation. What do you think with regard to making the best interests of the child a primary consideration?

    Do we have an option to say yes, and can we achieve that even if we overregulate? Or do you think we need a balance?

   >> SONIA LIVINGSTONE: That's a good question. I've been thinking about best interests recently. It's sort of taken off; it's the phrase that's everywhere. There's a very interesting general comment on what the best interests of the child are, which sets out some really clear and helpful procedures as well as clarity on outcomes. I think sometimes people think the best interests might be a thing, a box that can be ticked.

    Yes, we've attended to the child's best interest.

    It really does signal a careful process of weighing how any particular decision or intervention might affect any or all of children's rights. It needs to be either an individual decision for a particular child or, more difficult, a collective decision, and often platforms and digital providers are making collective decisions. That's incredibly difficult. I've heard talk of the average child; meeting the best interests of the average child is not going to meet the best interests of the children I think we're concerned about here. So I think we need to be very careful in supporting any notions of averages.

    But, undoubtedly, when we make any intervention, whether for empowerment or education or privacy or protection, we must recognize all the possible consequences for children's rights. And the point the general comment (?) made as well, through many drafts, was to be careful. We need to promote the particular right in relation to the digital environment while always being aware of those other consequences.

    But taking that in the direction of the question of regulation and the role of regulation, it seems to me the clearer case to make is that we should advocate for hygiene factors: businesses and governments must be required to ensure children's safety, privacy, security, and equity.

    If you like, those are the things that we want sorted because, otherwise, they will be problematic. Whether they're done by top-down policy or by design is, I think, something we could debate, and it might differ with the circumstances.

    But all those other children's rights, the right to education, the right to play, the right to enjoy family life, the right to grow into their fullest development, the right to voice -- I mean, governments have those obligations. It's hard to translate that into, if you like, platform regulation. That's where the strength of the by-design solution comes in.

    If we can influence or, as it were, spread the word about the importance of children's rights -- I don't know if we go knock on the door of platforms, but I do think there's a lot to be done with intermediaries, funders, trainers, those who manage procurement processes, those who audit the private sector.

    You know, there are lots of intermediaries, standards, bodies, and trade associations, as I mentioned before. There's lots of organizations that are already addressing ethics, already addressing discrimination, already addressing hate, and so forth.

    We just need to get child rights onto that agenda, because I don't think we're going to get platform regulation that meets the full range of children's rights, though it would be interesting to argue for that. So let's focus regulation on the hygiene factors and think of other ways to get the bigger picture.

    That would be my suggestion.

   >> JUTTA CROLL: Thank you so much. That is very helpful to further our debate.

    I think, Amy, it's time now to have your second question on the Menti. Is that possible? Or shall we go ahead with the discussion?

   >> AMY CROCKER: We can certainly try.

      (Laughter)

   >> AMY CROCKER: I mean, let's see if it loads.

   >> JUTTA CROLL: While you set it up, I would just like to encourage participants in the session to put their questions in the chat for all of our speakers, to give us reason to discuss further. We are keeping an eye on the chat while Amy is setting up the Menti, where we'll get an impression of the atmosphere in the digital room.

   >> AMY CROCKER: Yeah. I think if you refresh the link I sent you earlier in the chat, it should work fine. I can resend it, if anyone missed it.

   >> JUTTA CROLL: I think we still have the first question in the Menti. We don't have a fresh one.

   >> JOHN CARR: I see the original question.

   >> AMY CROCKER: Well, it's refreshing but not working.

    Is everyone refreshing their own screens?

   >> JUTTA CROLL: Yes.

   >> AMY CROCKER: Okay.

   >> JUTTA CROLL: Okay. Maybe we use the time just to go back to our speakers.

    If there are people on site in the room that don't have the ability to go to Zoom, Jennifer, if you could help us gather questions from the room and put them in the chat, would that be possible?

    Wonderful.

    In the time between, I remember that Andreas, you spoke about automatic detection of child sexual abuse material.

    I was wondering, Patrick, what your position is on this automatic detection. That is a kind of prevention, but I think it's another type of prevention than the one you were talking about when you spoke about early childhood intervention.

    How do we balance these different types of prevention?

   >> PATRICK BURTON: Thanks for that question. Certainly, I do not see them as mutually exclusive. I firmly support the idea that automatic detection of child sexual abuse material is necessary. It's mandatory. It should not be optional. I think the use of tools like that is really effective.

    I would want to go one further, though. If we're going to look at the use of those tools, I would want to see complementary enforcement of how platforms respond to things like reporting -- what sort of referral processes and victim support processes are in place.

    For me, those tools focus on the thing that scares us the most, the worst forms of violence that children experience. But there's a lot that sometimes escalates to that. There are other forms of violence that children experience before they are actually engaged in or abused to produce child sexual abuse material. And there's a lot of stuff that goes unreported or doesn't warrant being reported to the IWF hotline, which is a critical tool.

    I would want to see how pressure is put on platforms to respond timeously, to act. It's not just child sexual abuse. That, for me, would be very useful regulation to go along with it: how we hold platforms and industry accountable for handling reports. That's just one example.

    I hope that makes sense.

    It was clearer in my mind.

   >> AMY CROCKER: Uh‑huh.

   >> JUTTA CROLL: Thank you. I think it served the debate.

    Andreas, would you like to respond?

   >> ANDREAS HAUTZ: I don't think I have a really different position. It's very close, I guess.

    What I would like to add, regarding the whole discussion about hash matching, is that we lived through this last year, when there was the issue and the whole discussion around the temporary derogation. I would like to emphasize that we need to be fast and up to date: the tech companies address this issue pretty well and seriously, but we failed to develop a satisfying and accurate framework for them.

    So most of the social media platforms use hash matching, either to prevent child sexual abuse material from being uploaded or to make it disappear from the platform.

    I would like to stress this because I worked as a hotline analyst for nearly four years. The most important issue, of course, is the protection of survivors in this field. But it's not only that; it's also the protection of the analysts who have to rewatch this stuff again and again. It's really harmful, also, to the people who have to assess the content.

    So I think -- Patrick also mentioned it before -- hash matching shouldn't be in question.

   >> JUTTA CROLL: Thank you, Andreas.

    Amy, did I hear you?

    Okay. No.

    So thank you, Andreas.

    It was said that in many cases we need to react very fast, and at the same time Michael said before that regulating is complex and takes time. We all know that. It took five years since the Online Safety Bill pledge. It also took us more or less five years to come out with the amended Youth Protection Act in Germany. We know regulation needs time. But, at the same time, we know that we need a fast reaction to certain developments.

    I want to invite the speakers to answer the question of whether we should just rely on the platform providers to take action in due time, as fast as possible, or whether we should wait for regulation to be developed and take effect -- which, of course, might in some cases come too late, because the Internet develops so fast, with new technologies, that it's really very difficult to keep pace.

    Thiago, what are your thoughts on that?

   >> THIAGO TAVARES: Thank you.

    John was before. Do you want to go before?

   >> JOHN CARR: No. No.

    Brains before beauty. You go first, Thiago.

   >> THIAGO TAVARES: What a gentleman, my friend.

      (Laughter)

   >> THIAGO TAVARES: Thank you very much.

    First of all, we need regulation.

    I would like to add a comment. We do need tools. Given the scale of the problem, we can't solve it without such tools, without AI, without other technologies that can proactively detect abuse at scale, including preventing the uploading of sexual abuse images.

    However, those solutions should be implemented in a way that doesn't create negative side effects.

    We also need to consider, in my view, the size of the company and the scale and the volume of incidents.

    I will give you an example. Apple has only 100 people on its safety team. This is worldwide. That number came out in a public hearing at the UK Parliament.

    That's 100 people to carry out all the trust and safety work that affects users worldwide, including children.

    Then they took the cheapest solution on the market when they announced their plans to engage with child safety and child protection issues. This is something that we, as child safety experts, had been asking Apple to do for more than two decades, and which they decided to do now. Which is great. It's wonderful. But it's not enough.

    There's a lot of content that is harmful for children that will not be detected, because they try to avoid creating a real full-service structure able to detect harassment, cyberbullying, non-consensual sharing of intimate images, sex trafficking, and all the other things that violate children's rights.

    So this is my view and contribution to this debate. I suspect that my dear friend John has a different view, and I wanted to hear his thoughts on that.

   >> JUTTA CROLL: Thank you, Thiago.

    John, would you like to take the floor and tell us? I've seen you're nervous to speak.

   >> JOHN CARR: Me, nervous to speak? What?

      (Laughter)

   >> JOHN CARR: No. I don't want to upset Patrick by saying I agreed with nearly everything he said. I apologize if that is upsetting, Patrick.

    I think of myself and most of the organizations, or many of the organizations, I work with as being involved in the world of campaigning and lobbying. Therefore, we're involved in the world of politics. Politics is absolutely about expediency at one level, about trying to achieve what is possible at a given moment in time.

    Obviously, we need ‑‑ this is not meant to sound patronizing. We need the best possible research. We need the best possible intellectual input. We need the best possible set of independent experts to help guide us.

    But at the end of the day, imperfect as it is likely to be, we're struggling to achieve the best we can.

    By the way, I lose patience with the argument that the Internet is a global system and must be run in a global way. That's bullshit. North Korea is not the same as Sweden. Countries are not the same. I think that argument has been deliberately used as an obstacle by people who don't agree with the perspectives many of us hold, to delay change or keep things as they are for as long as possible.

    Just one very quick point in response to what Sonia was saying earlier. If you look at a lot of what we now think of as the main corpus of human rights legislation, and indeed the UNCRC: human rights law emerged in the aftermath of the Second World War. It was addressing some of the worst abominations of human rights and dignity that the world had ever seen, certainly in modern times.

    And the UNCRC is pre-Internet. The idea of the best interests of the child came from a world and a time when everybody that was important in a child's life at least had the possibility of knowing the child, meeting the child: the doctor, the teacher, the nurse, the shopkeeper. That world is gone.

    Some of the most important players and actors in a child's life now are on the other side of the planet. And the question is: How do we devise systems for that?

    I get Sonia's point that the average child is not the individual child, but we're going to have to find some way of coming up with a framework that recognizes that. We don't want to give Internet companies all that information about an individual child. We need systems that can somehow find the middle.

    My last word is this.

    (?) Davis, the head of trust and safety, or whatever her title is, for Facebook -- she appeared before the House of Commons in London. She said: We have no interest in providing a place that's unsafe for children. There's no profit for us, as a company, in having an environment that doesn't work in a way that's safe for children.

    Of course that's true. That's never been the issue. The issue is what priorities companies attach to solving these problems.

    The most valuable asset that any Internet company has is engineering time, engineering time.

    And the question that every CEO of every tech company asks pretty much every day is: How do I deploy that asset, engineering time, to the best benefit of my company today?

    Unless they're feeling threatened in terms of revenue or reputation, the issue slips down the list. That's why regulation is essential. If we leave it to companies to make their own decisions about things, then, like Thiago was saying, companies employ too few people because they can get away with it. And that's no longer acceptable.

   >> JUTTA CROLL: Thank you, John Carr. Strong intervention.

    I think we'll get back to the question of whether legal harmonization is possible. Like you said, we have differences between North Korea and Sweden, but I would put that at the end of the session, because we now have two interventions from the floor in Katowice.

    I assume that Javri (?) from the Technical University of Munich has a question about children.

    We're turning now for some five minutes to children as users themselves.

    And we take first Javri (phonetic) and Jennifer.

    Can you speak?

   >> JAVRI: Yes, I can. I actually wanted to ask about the kinds of children that we see on social media. I think there are two kinds of these influencers.

    One of them are kids that are maybe developing a project or even a business with their parents, and their parents are using them for advertising, or as models to sell these services.

    And the other kind of influencers could be ones that are activists. For example, I have seen that in Colombia there's a really famous child who is a climate activist, and he's always on Twitter and YouTube and all the social media.

    So I wanted to know, in both cases, how we could regulate this, or how the rights of these children can be protected. And, yes, I wanted to learn about the limitations on these representations in social media.

   >> JUTTA CROLL: Thank you for your question.

    I know that there's research done in this regard. I don't think it's the same department that you're working in, Andreas. Are you ready to answer the question, or anyone else from the panel?

   >> ANDREAS HAUTZ: I'm not really the right person, I guess, because it wasn't my department. I know the research, but it focused more on the risks of influencers. So I don't think that would be quite the point.

   >> JUTTA CROLL: Is anyone else on the panel or from the room who has an answer to the question of Javri?

    Patrick?

   >> PATRICK BURTON: I don't have an answer, but I guess my comment would be that it's good that you've made the distinction, because I think there's a very real distinction to be made between those influencers.

    Child influencers that are used to market products or lifestyles that are harmful, that we know are harmful for children -- clearly, I think, there is a role for regulation there. Maybe now I'm picking and choosing what we regulate and what we don't regulate.

    But companies are subject to laws and regulations that govern the use of child labor, for example. For companies or businesses, whether they're microenterprises or start-ups, if they're engaged in a relationship with child influencers to market products, I think there's a role for regulation there. Whether it is regulation within the industry or regulation that is covered through child labor laws at a national level, that's going to be a conversation that needs to be had.

    You know, in parts of East Asia, for example, we also see cases where child influencers are being used to market or promote lifestyles or products that are ultimately harmful to children as well. That situation needs to be managed and that activity regulated.

    So I don't have an answer, but that's my personal view on it. I'm not sure if it's adding anything.

   >> JUTTA CROLL: Thank you, Patrick.

    I would like to turn, then, to Jennifer.

    Jennifer, would you like to take the floor with your question?

   >> JENNIFER CHUNG: Thank you, Jutta.

    My name is Jennifer Chung. I actually wear a couple of hats in the IG space, but, really, I'm asking this question in terms of the DotKids initiative. I think we've worked quite closely with a lot of you who have spoken, John Carr with ECPAT and others. You've all worked in the child protection space, where it's not just one solution; there are multiple levels of things that we need to keep in mind, both in terms of regulation and, more so, in the way that we are proactively looking at keeping an Internet namespace that is friendly and beneficial for children.

    I think I want to hear more from the panel on things we should definitely keep in mind. I heard from Patrick earlier that it's not just regulation, and there's other things that we need to do.

    And from Thiago, that we need to have these training materials in languages other than English, because in Latin America and the Asia-Pacific there are many different communities and peoples, and vulnerable communities, especially children, who don't have English as a first language.

    I really want to get a sense from you, because we are looking to you, as the experts, to guide us in a way that we can create this namespace.

    Thank you.

   >> JUTTA CROLL: Thank you, Jennifer, for this question. I forgot to mention that, of course, the DotKids Foundation has been working for a long time with the Dynamic Coalition and became a member some time ago.

    Thank you, everybody.

    I see Thiago nodding very heavily.

      (Laughter)

   >> JUTTA CROLL: Please go ahead.

   >> THIAGO TAVARES: Thank you, Jutta.

    Thank you, Jennifer. It's a great pleasure to see you on the screen.

    I'm going to share some thoughts on that. I think the DotKids case is very unique. It's a unique experience, especially given your long-standing commitment to media literacy and to the domain namespace. Everybody should look at that particular case to better understand how to improve this discussion on the domain namespace as well.

    I would like to give you an example of how we're cooperating (?). We were looking at detecting the keywords in Brazil that were most prevalent on websites that were circulating abuse material but also other kinds of harmful content involving children.

    That exercise made it possible to identify hundreds, hundreds of new keywords: 2,099 words in Portuguese were discovered that were not known before this exercise. That's evidence; it's something that materialized. We need more language diversity and non-English training datasets to better apply those policies, not only in social media and applications but also in the domain namespace. Thank you so much for your questions.
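
     [Editor's illustration, not part of the spoken session: the keyword exercise Thiago describes can be pictured as counting, across a corpus of already-flagged pages, how many pages each term appears on, so that analysts can review the most prevalent candidates. A minimal sketch follows, assuming a hypothetical corpus of page texts; real pipelines add normalization, multilingual tokenization and human review.]

```python
import re
from collections import Counter

def prevalent_keywords(page_texts: list[str], min_pages: int = 3) -> list[tuple[str, int]]:
    # Count the number of flagged pages each word occurs on (document
    # frequency), not raw occurrences, so one spammy page can't dominate.
    page_counts: Counter[str] = Counter()
    for text in page_texts:
        words = set(re.findall(r"\w+", text.lower()))  # \w is Unicode-aware, so accented Portuguese words are kept
        page_counts.update(words)
    # Words seen on at least `min_pages` flagged pages become candidate
    # keywords for analysts to vet before any enforcement use.
    return [(w, n) for w, n in page_counts.most_common() if n >= min_pages]
```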

    I give the floor back to Jutta, or perhaps to my friend, John Carr.

   >> JUTTA CROLL: Thank you, Thiago.

    Since I don't see any further hands from the panelists, nor comments or questions in the chat, and I'm reminded that we only have 10 minutes left, I'm really glad that Jennifer turned us to the issue of creating safe spaces for children without restricting their rights to access information, to peaceful assembly and association, to play, and so on. How do we solve this issue? On the one hand, we want these safe spaces for children. We have heard from Michael Tunks that in the UK Online Safety Bill, there is a provision for platform providers.

    We have the same in the German Youth Protection Act: there's an open-ended list of measures that can be taken by platform providers to fulfill their duty of care. It's important that it's an open-ended list, because then there is room for platform providers to develop their own measures that would help to create these safe spaces for children.

    At the same time, we see that there are parts of regulation that have a counterproductive effect. Last year, in December, nearly 12 months ago, the safe space for children was in a way reduced, due to the fact that most platform providers stopped moderating content.

    Of course, we are seeing that we need harmonized regulation across Europe. We have seen that with the GDPR, which also had effects at the international level, in other countries where the same regulation was copied or adapted in another way. So, of course, there's a need for harmonization. But how do we avoid a situation where a good national regulation that helps create a safe space is overruled by harmonized legislation across a certain region of the world, like Europe, or other regions as well?

    I would like to invite the panelists and, of course, the people in the room to respond to this.

    Is that the dichotomy: that we ask for national regulation on the one hand and legal harmonization on the other, while there are different levels of safety for children in different parts of the world?

    Would you mind focusing your final statements on this? And maybe we take the order the other way around this time.

    Michael, maybe you would like to start?

   >> MICHAEL TUNKS: Thank you. I think this has been a useful discussion today. I think, as we outlined at the start, this is not just about regulation or prevention. It needs to be both.

    I think there are certainly steps that companies and platforms can take now to ensure their services are free from child sexual abuse.

    There were 8.8 million attempts in April to access known child sexual abuse material. That's an enormous amount of attempted access to child sexual abuse material. I think there are always ways we can improve. There's always more we can do.

    As we look at the rise of regulation around the globe, I think there's a need for the Dynamic Coalition groups to coalesce around children's rights online -- children's right to be free of sexual abuse, their right to play -- and around how we align those with the competing priorities we're seeing around encryption. It has to be possible to keep children safe online while protecting privacy as well. I don't think we should talk about it as a dichotomy.

    I think, similarly, it shouldn't be a dichotomy between regulation and prevention either. We should all work together to improve online safety for children.

    My final remark would be that context is really, really important, especially around transparency reports. We need to know how much content is actually illegal. We need to understand, when companies are making large numbers of reports to the National Center for Missing and Exploited Children, whether it's because they are really good at finding it or because they have problems on their platform.

    I think regulation throws up some very interesting questions. We need to build on best practice.

    Thank you.

   >> JUTTA CROLL: Thank you, Michael.

    I would now like to turn to Thiago. One sentence only, please, because we have only five minutes left, and I think Amy will also wrap up what has been said in this session at the end.

    Thiago, please.

   >> THIAGO TAVARES: In one sentence, Jutta: thank you so much for having me. It was a great pleasure to take part in this very, very qualified debate and to contribute and share my vision

    from a perspective that is not usually considered, which is the perspective of the Global South, especially the countries of Latin America, which have some particularities that are important to consider as well.

    Thank you so much.

   >> JUTTA CROLL: Thank you. We appreciate that you gave, also, the view and the perspective from Latin America, which is really important to get the broader view on these issues.

    Andreas, your turn.

   >> ANDREAS HAUTZ: I would like to second what Thiago and Michael said before. I would just come back to the point that I also don't see a dichotomy between regulation and prevention; they have to come together. But I would like to focus again on the problems with the temporary derogation. I don't want to make a big thing out of it now, but I just want to point to it again to show that it's really crucial that regulation and harmonization work together and work properly, because we cannot afford to lose 60% of reports on child sexual abuse.

    I think it's not only the big thing we have to focus on but also the details.

    And thank you very much that you invited me to speak here.

   >> JUTTA CROLL: Thank you, Andreas. I will take that pledge to workshop number 170, Child Protection Online: How to Legislate, which will be held in 30 minutes' time in Room 6. We'll continue the debate there in relation to the Digital Services Act.

    Going in alphabetical order, I would like to take John before we go to Patrick again.

    John, your thoughts on harmonization.

   >> JOHN CARR: Well, I'm against sin. I'm in favor of everything being nice. So harmonization is a wonderful idea, but I'm afraid too many people who I think of as enemies use it as an obstacle or a delaying tactic.

    We, in our own jurisdictions, in our own ways, have to do the best that we can to make it work for the children in our own countries. I absolutely refuse, anymore, to delay that while we wait for this harmonization thing or globalization thing to come walking down the street.

   >> JUTTA CROLL: Okay. Thank you, John.

    Patrick and then Amy for a quick wrap‑up of this session.

   >> PATRICK BURTON: Thank you. So I think harmonization, in the end, is a whole other conversation. I actually agree with John there. I work in 12 or 13 countries at the moment that cannot harmonize their own legislation, their own definitions, even of CSAM.

    It's a real major challenge. That's one of the reasons -- not to undermine the role that regulation has -- why I think focusing on different aspects of prevention to a large degree allows the local adaptation, the contextual adaptation, that you need.

    So, again, that balance, they're not mutually exclusive, but I would like to see more emphasis there.

    Thank you for the invitation.

   >> JUTTA CROLL: Thank you.

    Amy? One minute.

   >> AMY CROCKER: Yes. Hi, everyone. I will be very quick. So much has been said. Just to sum up, the overall message is a general consensus that we need both, which is an outcome that I suppose many of us were expecting. What is interesting, and complex, is the balance you find. I think what I've also been hearing is the importance of context. You cannot have only universal rules. There are some basic universal rules, but you need to understand the context in each country or jurisdiction, and we need to be working much more proactively on how to bring different perspectives on this topic together in order to find sustainable solutions.

    I think it's been a fascinating conversation. I will be writing up the report in a few weeks.

    Thank you so much for joining.

    Thank you, Jutta.

   >> JUTTA CROLL: Thanks to all of you. I hope to see you in Room 6 in 30 minutes for the session on Child Protection Online, where we can continue the debate.

    Thank you. Have a good IGF. And to any of you who have been traveling, a safe trip back home.

    See you. Bye‑bye.