This is now a legacy site and may not be up to date. Please move to the new IGF Website at


IGF 2020 - Day 5 - OF50 Global Partnership to End Violence Against Children

The following are the outputs of the real-time captioning taken during the virtual Fifteenth Annual Meeting of the Internet Governance Forum (IGF), from 2 to 17 November 2020. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 




>> MODERATOR:  Welcome, everyone, to the End Violence Open Forum.  We are excited that so many people are interested in this extremely important topic, and I hope more will get on board as we start the session.  Today we will be speaking about technology and governance, and how they shape our ability to keep children safe online.

So why are we speaking about this issue?  As you know, every half a second, a child makes their first click online.  In fact, one in three Internet users worldwide is a child.  And this number is only going to get higher as we connect the unconnected parts of the world.  Now couple those figures with the fact that at any given moment there are an estimated 750,000 predators looking to sexually exploit and abuse children online, and the fact that technology is rarely developed and designed with children's safety in mind, and you will begin to get a picture of the risks and threats children face.

One of the unforeseen consequences of the rise of the Internet and digital technologies has been an exponential growth in online child sexual exploitation and abuse.  But not only that: the way digital platforms are designed, and the way in which they incentivize user interactions, are creating more opportunities and risks for children, threats to their safety that go beyond online abuse and create multiple vulnerabilities that can be exploited.

You will get to hear about these aspects throughout the session today.  My name is Marija Manojlovic, and I lead the Safe Online team.  For those of you who don't know us yet, the global partnership was launched by the UN Secretary-General in 2016 with the goal of catalyzing and supporting efforts to end violence against children.  We are also the only global entity making targeted investments to tackle online sexual exploitation and abuse.  This year our work reached a milestone: we have invested $50 million in programmes and cutting-edge technology tools in over 70 countries, focused on tackling online child sexual exploitation and abuse.

What makes the job exciting is that we get to support and collaborate with organisations doing ground-breaking work to keep children safe, some of whom are here with us today.  So without further delay, let me introduce our panelists.

I will not attempt to summarize bios and achievements, because that would take us to the end of the event.  With us today we have Signy Arnason, Associate Executive Director of the Canadian Centre for Child Protection; Emilio Puccio, Coordinator of the European Parliament Intergroup on Children's Rights; Beeban Kidron, founder and Chair of the 5Rights Foundation; and John Tanagho, Director of the Center to End Online Sexual Exploitation of Children at International Justice Mission in the Philippines.

A few words of housekeeping.  As you know, the session will last 60 minutes.  We will have one round of questions to the panelists, who will take four to five minutes to respond, and then we will open the floor for questions.  Please leave your questions in the Q and A window, not in the chat.  We will monitor the Q and A throughout the session; please indicate who you want your question directed to, or whether you want us to ask the same question of all the panelists.  My colleague will be supporting us in taking notes and monitoring the Q and A box.  Feel free to tweet at us and use social media.  Our Twitter handles are GP to End Violence and Safe Online, and you will see them written in the chat box.

So I will now open the session and start asking questions to the panelists, and I am looking forward to the questions that will come from you as soon as we are done.  Let's start with Signy Arnason.  Signy, you have been with the Canadian Centre for Child Protection for almost two decades, and you played a critical role in starting the tip line for reporting the sexual exploitation of children.

You have become one of the most significant global actors fighting for the rights of victims of online sexual abuse.  Tell us a little bit more about your work with Project Arachnid, which is a groundbreaking technology: what were the enabling and disabling factors throughout this work, and what have you learned?  Over to you, Signy Arnason.  We need to unmute you.

>> SIGNY ARNASON:  Okay.  Perfect.  Thank you, Marija.  Thank you for coordinating this.  It's a pleasure to be on a panel with such distinguished professionals.  We are pleased to share information about what we are doing at the Canadian Centre for Child Protection, and specifically with Project Arachnid.  It has transformed into a platform where companies intersect with the tool as well.  It goes out and seeks known images of child sexual abuse material, and the purpose is to detect this material and send notices to providers for removal.  The initiative has been around for about three and a half years, and we have issued 6.3 million notices to providers.  We have detected 24 million items of suspect media: media that would have a PhotoDNA match in some capacity, but wouldn't be close enough for us to say for sure whether it was the exact image.

So that has to be assessed by analysts.  Essentially we have crawled 125 billion images and videos.  We are looking for the needle in the haystack, trying to detect where this material is.  What is important to note is that Arachnid came in conjunction with our International Survivors' Survey, which we did in 2016.

This was the first time we have really comprehensively heard from this population: a population who have been seen, tremendously seen online and exposed, but who have not been heard.  And they have not been heard from for a variety of reasons, because their safety is constantly at risk.

At the end of the day, while we certainly believe it's important to be detecting new victims who are popping up online on a daily basis, we also have residual material that has been left over.  We have a population of children, now adults, who have basically been abandoned.  The offenders may have been arrested and brought to justice, but little has been done to address the fact that their series and their material become commonly traded online, and the ramifications that has for them in their everyday living.

One of the things we quickly learned through Project Arachnid is that treating child sexual abuse material at a jurisdictional level really has not worked in children's favor; we have done a disservice to this population.  What we have done is ask ourselves: is this illegal?  Is what I see within the four corners of this image illegal in my jurisdiction?

Well, I can tell you for certain that a Canadian, or someone in the U.K., doesn't care what server the information is sitting on.  All they care about is that they have access to the information.  And as a result, some material comes down and some stays up.  So we saw immediately the issue even just on the illegal side.  Even if we go with an Interpol definition, the worst of the worst: prepubescent children, the genital area or a sexual act.  You would think we could all agree that is coming down, yet we were seeing debates about whether it should be removed or not.

We know this is not how offenders operate online.  They possess images that are part of a continuum in relation to an entire series.  A common tactic is to take an image and pull it from, say, a video that in its entirety would be deemed illegal; a still image of the 3-year-old in a sundress, standing beside the bad guy before she is about to be assaulted, is pulled from the video and posted online.  That is used as a gateway into other material and other information.

What we have learned, and the stance we are taking, is that we need to stop approaching the removal of this material from a criminal law perspective.  We have to approach it from what is in the best interest of children and survivors, and that means removal.  Companies have terms of service.  They are able to remove individuals' information according to those terms of service.

So we have been aggressively pushing as an organisation the message that what has been happening online is not good enough, and it's no wonder we are facing an epidemic.  The other point I would make, before I probably have to wrap up my four or five minutes, is that it's really essential that this issue be drawn out into the public light.  We have had copious conversations, even with industry, and these have happened behind closed doors.  That does us absolutely no favors.

We need to be exposing what is happening.  There is a reason why the Cuties movie resonated with people, and it doesn't even touch what we see online in terms of the sexual abuse and harm occurring for children day in and day out.  I think what that demonstrates is that the public doesn't understand what is happening to children, and when you introduce a world where we decide we are putting children with adults, commingling them in that environment with no rules and regulations, is it any wonder that we are where we are today?  Project Arachnid is an important part of the solution in reducing trauma and revictimization, and in providing survivors with hope that there is a tool dedicated to detecting their material and holding industry accountable for removal.

>> MODERATOR:  Thank you so much, and thank you for the amazing work that you and the organisation are doing.  You have flagged three really important things that will be a good introduction to the intervention of Emilio Puccio, but I want to flag them first.  The first is the lack of understanding of the profound impacts that circulating CSAM, not only CSAM as legally framed but the continuum of images and videos around it, has on humans, on children, and how long those effects last.  The second is that removal cannot be driven only by legal definitions, by the criminal justice definitions of what illegal material is, because that does not put the victim in the centre; that is not a victim-centred approach.  And finally, the public does not understand and is not aware of these issues, and we need to really bring them to light.

Emilio Puccio, can I come to you, and flag a few things before I hand it over to you.  Recently in the European Union, there have been issues around the interplay between the components I just flagged, and how they interact; sometimes they can endanger or support the efforts to prevent online sexual exploitation and abuse.  In Europe, a piece of legislation is about to enter into force on the 20th of December, and it will effectively render illegal the tools currently used by tech companies to detect child sexual abuse material online.

You recently organized, under the auspices of the European Parliament Intergroup on Children's Rights, a meeting of experts to discuss these developments and to shed more light on the role technology and the private sector play in the fight against child sexual abuse online.  Can you tell us what came out of the meeting and what the experts are saying?  And can we do something to ensure that we are not crippling our ability to help children stay safe online?  What can we do to shed light on this issue and make sure that the public and parliamentarians in Europe are aware of what is currently happening, and the dangers of it?  Over to you, Emilio.

>> EMILIO PUCCIO:  Thank you so much, and thanks for having us here today with a lot of experts on this panel.  Let me start by saying that I couldn't agree more with what Signy said before me, and with what you said before handing it over to me.

As regards this expert meeting we convened, what we wanted to do was basically to recreate the crime scene and give the politicians a glimpse of what is happening out there in the real world when we deal with child sexual abuse.  As has been said by Signy and by you, unfortunately there is not enough knowledge.  We in the European Parliament, as legislators, are more focused on the legal answer, but it's important, as Signy said, that it's not only the legal responses we need to work on.

I have to say this expert meeting gathered representatives of Microsoft as well as law enforcement authorities, on top of key lawmakers and legislators in the European Union, and we also heard directly from a child survivor.  I think we should stress more and more that the reason we are all doing this is, of course, because we want to prevent this phenomenon from happening, and we want to give some sort of justice to the people who have fallen victim to these heinous crimes.

And unfortunately, I have to say that in the European Parliament, and in the European Union in general, we don't give a lot of space to victims, so it was important for us at our event to give the victim the opportunity to speak about what she has been going through, because, as has been highlighted, the voice of the victims often goes unheard.  So there is always room for improvement, for sure.  One of the main takeaways from this event is that the battle to fight sexual abuse online cannot be won by a single actor.  It has to be a joint effort: a whole-of-society, multistakeholder approach.

I will not go into the figures.  You have shared with us the terrible and staggering figures about the consistent increase in child sexual abuse online that we have been witnessing over the last year.  What was also important in this expert meeting was to give a voice to the front-line professionals who are working hands-on out there, especially the authorities who are confronted with an enormous amount of images online.  It was very important to listen to them and hear that, again, the cooperation they have been enjoying with the private sector, with companies, has proven essential for them.

Paradoxically, I also have to say that, in a way, the global pandemic we are all now living through, and all of the lockdown measures that have ensued to contain the further spread of the virus, have somehow helped us understand the risks the online world poses.  And since the situation has been further exacerbated, it has also been a sort of wake-up call for the politicians in the European Parliament to actually do more and try to have a global and legislative response to what is happening now.

To be honest, it has always been clear to me, given that I am a lawyer and I have been working with legislators in the European Parliament for over five years, that legislation alone will not be the answer.  Legislation is just one part of the solution.  I have also been a firm believer in business for good, and I do hold that businesses are much better placed to invest in technological solutions to help the fight against child sexual abuse.

And I feel that with great power always comes great responsibility.  Online service providers, together with every actor in society, have to do their part in this fight.

Online service providers have to make sure that their networks are not used by sexual predators.  Now, at a more political level, the European Parliament is working very much on the European Commission proposal, a very important piece of legislation, to allow online service providers to continue to make use of those technological tools to detect child sexual abuse material even after the entry into force of the European Electronic Communications Code in December 2020, which would otherwise have the unintended consequence of preventing these online service providers from continuing to use these technological tools.

It is important to mention in this political debate the eternal debate of privacy versus security, or, in purely legal terms when it comes to criminal law, the debate between repressive measures and preventive measures, because the two things need to go hand in hand.  I do think that repressive measures alone, as Signy said before me, cannot be the only answer, because of course they always intervene after the crime has been committed.

So this is why I want to very much commend the efforts of the European Union, and especially the European Commission, which recently published an EU strategy for a more effective fight against child sexual abuse that includes clear provisions for the European Union Member States to invest more in preventive measures.

So now the European Parliament is called on to find a legally sound solution to this unintended consequence of the European Electronic Communications Code, and I'm very confident, because I have witnessed firsthand how much importance the politicians have attached to this, that an agreement within the Parliament is within reach.  So, over to you now.

>> MODERATOR:  Thank you so much, Emilio, and thank you for giving us more background on the central issue being discussed in Europe now, especially given that the predators don't care where the material is hosted.  They don't care where it's sitting; they just want access to it.  We know that a lot of the material being processed is actually hosted in Europe, and if this legislation disables us from using the existing tools, both from industry and from the NGO side, I think it's going to have detrimental impacts on children and on our ability to protect them online.

I think the entire discussion keeps coming back, which I'm really happy about, to centering on the children and the survivors.  Beeban, I want to go to you and ask a question that is really at the centre of how you do your work.  At the 5Rights Foundation you are centering on the child and how different factors in digital environments and the policy space create either a risk environment or a safety environment around children.  You are looking into design features, the business models of digital platforms, policy and regulatory frameworks, and so on.

Most recently you have been generating attention around your work on risk by design, as well as efforts to develop global guidance and tools for Child Online Protection policy frameworks.  Can you tell us more about what you are seeing, how design features promote a culture either of safety or of risk, and how we can find a way to work with key stakeholders to create frameworks where we could actually address these issues?  Over to you, Beeban.

>> BEEBAN KIDRON:  Thank you very much.  I want to say what an incredible privilege it is to come after two speakers with whom there is such an outbreak of agreement; they put the case very well.  I thought I might briefly touch on two narratives that I find very difficult, because part of the subtext, or indeed even the text, of what the previous speakers said is about changing the narrative, about how we talk about this, and then perhaps talk about a couple of things we are doing at 5Rights.

So the first piece is really the idea that it is okay to suggest that child sexual abuse is an acceptable part of a company balance sheet.  I'm forever being told that if we deal with it, it will be restrictive to growth.  As one tech executive said to me recently, "on the one hand, on the other hand," and I asked him to put both hands on the table, because "on the one hand, on the other hand" is not an acceptable framing.

It's not about a network effect versus children.  It's not about content creators' need for income on the one hand and children's needs on the other, et cetera.  I think we have to stop treating it like a balance sheet and ask the more fundamental question, which is: if you can't run a company without spreading industrial levels of CSAM, are you fit to trade?

And I think if we answered that question honestly, it would drive investment into tackling CSAM out of the system much, much quicker than any of the conversations we are currently having, including the current binary between privacy and protection, which, I agree with Emilio, is unhelpful, untrue, and simply not the right place to be.  And yet we are constantly forced into a corner that puts friends as foes in the policy arena.

I consider myself a child protection activist and a privacy activist.  That's where I sit, and that's where we all need to sit, because the technological systems being built for the future right now, right now, the ones that regulators haven't even got their eyes on yet, are being designed without the explicit needs of children in mind.

So, I think we have to remember that a lot of this is happening in the arena of private companies, and, as Signy said about terms and conditions, we actually need more than two parties in this debate.  It's not just about companies and end users; it's about nation states, the international community, regulators, all of those people, as one part of a tripartite relationship between users, companies, and what we might call society in a broad sense.

So that's one really important narrative that all of us, sometimes myself included, have gotten wrong, and we have to be much, much more disciplined about the way we talk about this.  I think the other thing, and this is very particularly a function of the 5Rights Foundation and my colleagues on the team, is that there is a tendency, driven by our own sense of horror, the media, the politicians, to concentrate on the extremes.  But we fail to look carefully enough at the culture that is feeding them, or at least we excuse that culture very quickly.

So, for example, we know that pornography going straight into the hands of primary-age children desensitizes them, meaning they need ever more extreme sexual content to be excited.  We know, in the context of consent, that young girls present in their hundreds and thousands with eye infections and anal tears that are a direct result of acting out pornographic scenarios, often unwanted and nonconsensual.  Or the normalization of sharing sexual content: for example, the U.K. company OnlyFans, which makes self-generated sexual abuse material the fastest growing area of content.

So we have these drivers that we appear to accept as absolutely normal in the commercial world, and we are not really looking at them properly.  They create a very toxic environment for children and fuel a very distorted sexual and relationship environment which, among other things, creates a growing market for CSAM.

And what I find astonishing is that it's done in plain sight of all of us.  While we will always have a battle against a group of bad actors, we really cannot afford for private companies to profit from sharing, spreading and selling sexual content to, and of, our kids.  And I just put this in people's minds: imagine if there was a local cafe somewhere on a high street with eight, ten, 12 or even 16-year-olds in the window selling images of themselves, videos of themselves.  In reality.  This narrative that it is somehow normalized in the digital world is really, really problematic for dealing with what is truly atrocious at the sharp end, because it drives a culture.

So those are my two concerns around the narrative, and about how people like us talk, and what really resounded in my imagination was both of the previous speakers saying we must put the light on it.  We must also start talking about it in a realistic way.  Do you know this is happening?  Would you accept it in real life?  Can you see this comparison?  We should try to talk about it in language that is perhaps not so expert but is evocative of something people would find completely unacceptable in so-called real life, although we all know this is indeed real life.

So let me turn briefly to two things we are doing at 5Rights.  First, my colleague Victoria has been working on Risky by Design, and in the course of this session I will put a link in the chat for everyone.  What it does is highlight the everyday risks and common features of the digital world: for example, introducing strange adults to children, exposing children's real-time location to strangers, or the problems of public streaming from bedrooms combined with private messaging in that environment.

And what it really looks at is the cumulative nature of the risk.  If you make a child public to 500 million viewers on TikTok and then have direct messaging, which thankfully they just got rid of, but there are other services that do that, you are in effect putting them in the window I alluded to earlier.  And so on.  What Risky by Design does is show people who are not thinking about the system, who are only thinking about the content, that there is actually so much we can do through child risk assessment, through upholding a really basic set of design principles that have more to do with health-and-safety kind of legislation than any of the legislation we are currently talking about, myself included.

And then finally, I just want to do a bit of a shout-out, because I think everyone on the call may in fact be a friend to this project: with the End Violence community around the world, we are creating a global Child Online Protection handbook.  This came out of previous work where we wrote the Child Online Protection policy for the Government of Rwanda, after which so many countries reached out to us and said, we would really like your help with this.  The idea is that many of the frameworks we all work to, the Convention and so on, are not quite practical enough if you are the person in a country who is actually given the job of writing the roadmap.

So it is very holistic; it has broad pillars, but it actually seeks to identify steps, pathways and best practice.  Those are just two of our flagship projects, but fundamentally what I would like to say is that we must insist that society does not duck the big questions, the biggest of which is: has the digital world delivered for children?  The resounding answer is no, and we have to start from that perspective, not deal with the harm after the event.  Thank you.

>> MODERATOR:  Thank you so much for so graphically and passionately explaining the key issues surrounding children in this complex environment we are grappling with, not to mention how they feel in that environment.  I will now move on from that level, because you have set the stage perfectly for John, I think.  John, through the work of IJM in the Philippines you are doing impressive work to support the justice system.  But you are not only working with the justice system; you are seeing firsthand how the interplay between local capacities, normative frameworks, companies' business models and access to adequate technologies can either foster or hamper response and prevention efforts to address child abuse.  Can you tell us more about what you have learned over the years and how we can draw from those lessons more broadly, beyond the Philippines?  What have been the main learnings, and how do they relate to the things the other panelists have brought up today?  Over to you, John.

>> JOHN TANAGHO:  Thanks so much.  Thank you for hosting this.

I'm so happy to be part of this panel and to hear from all of the experts.  I just want to say I agree with everything that's been said so far.  It's great that we have different perspectives from the different kinds of work we have been doing.  IJM has been primarily working in the Global South, so a lot of what I'm going to share is really from that perspective, where victims are routinely abused to create child sexual abuse material, videos, photos, live streaming, and yet the justice system is just not at the capacity that we often take for granted in the more developed world.

IJM has been operating in 13 countries, and over the last 23 years we have learned a lot.  What we have been doing is working alongside Governments in the developing world to strengthen the basic infrastructure and capacity of justice systems to protect the most vulnerable people from violence.  We have been doing that in the Philippines since 2001, and since 2011 we have focused on the online sexual exploitation of children.

But what I will say is we see the same thing in country after country.  When you strengthen the capacity of front-line Government agencies in a sustainable way, they can better implement and enforce the law to identify victims, rescue them from ongoing harm, and hold offenders accountable.  I think that's really important not only for the justice system but for the larger society and the community.  Because when you have a state of impunity where children are being sold online, where they are being trafficked, where they are being abused and it's sort of an open secret, a way that people make money, then what that communicates is: this is okay.  Our society, our Government, thinks this is acceptable.

So you have to replace that environment of impunity with a measure of accountability, where people are actually being arrested and prosecuted, and where victims are actually being identified and rescued.

What IJM has learned is that it's really critical to enter into partnerships with specialized police, prosecutors and social service providers on the ground, and actually take cases through the justice system pipeline to identify the gaps in the justice system and some potential solutions: things like victim identification, case buildup, and police applying for and getting search warrants.  These are things we take for granted in other countries, but in the Global South these are simple things that often don't work.

And so we have learned that when you work with Governments to implement those solutions, you can actually see amazing results in terms of victim identification, victim safeguarding, offenders held accountable, crime prevalence reduction, and overall improved justice system performance.  I just want to highlight four basic things IJM has learned in country after country, which we are now seeking to scale into new countries where we are not working, really through local NGOs and local partners.  The first is that you need a clear Government mandate to investigate and combat the crime, whatever it is.

It could be live streaming of sexual abuse.  It could be the trafficking of children.  But you need a Government mandate that tells specific specialized law enforcement units and prosecutors that, hey, it is within your remit, it is within your job to investigate these cases, to prosecute these cases, and to provide social services and aftercare to these survivors.

And oftentimes, that mandate just doesn't exist, or if it exists it's just sort of cascaded to all law enforcement.  But it doesn't work that way, because all law enforcement are dealing with all crimes.

You really need a specialized law enforcement unit with expertise, training, and time to work on these cases.  The other thing we have learned is units do need sufficient Government resources.  So you could introduce excellent technology that's working in the developed world, but what you need first is for those units who are going to use the technology to have staffing.  Enough staff to use it, right, with recurrent annual funding for their budget, and basic equipment, laptops, good Internet, office space.

Then when you introduce, you know, new technology, it's actually going to be used.  The trick is if you just, you know, bring the technology, it's just going to sit on the shelf and collect dust if the people aren't trained, if they don't have the resources, and the mandate to use it.

And then finally, I think what you have to do is always contextualize the response.  So what we have learned at IJM is, for example, law enforcement and prosecutors don't need a one-size-fits-all seminar training.  This is what works in, you know, the U.S. or in Australia, so do it in the Philippines.  That doesn't work.  Because the training has to be contextualized to the local legal framework: what actually are the police allowed to do here?  What are they not allowed to do?  And what technology is allowed under the law, and what is not allowed?

And so contextualized training is really critical to strengthen the justice system.  So I think when you talk about all of these things, when you are talking about technology or training, when you are talking about balancing, you know, privacy and child protection, what's critical is that Governments really have a clear-eyed understanding of what is happening on the ground, in their country, in their region, globally.  I mean, we have all sort of said this in different ways, but sexual exploitation of children online is a global crime.

So it doesn't work to just have a jurisdiction by jurisdiction approach.  What children need is for the global community to come together and say, this is the standard across the board.  This is the material that has to get removed.  This is, you know, the data sharing that needs to happen between technology companies, and law enforcement.

For example, law enforcement in the Philippines can't get the kind of evidence or data that law enforcement in the U.S. can get from U.S. technology companies.  They can't serve a search warrant on them.  They have to go through a drawn out, you know, MLAT process and other things like that, and it just doesn't work.  So I think what is really important is that Governments really ensure that their legal framework is actually empowering the justice system and private industry to do what's necessary to combat these crimes, and then, of course, at the same time, you know, as Signy mentioned, we do need private industry to step up their game and say, okay, we are not going to do the bare minimum that's required under the law, but we are actually going to do the most that we can that's allowed under the law.

And then as Emilio pointed out, even that's getting tricky with the EU, but I think we have to be speaking as one voice to Governments and to industry that, look, this is a pandemic.  The sexual abuse of children, online and in person, to create, you know, online materials is a pandemic, and we should be responding to it the same way we have with this global pandemic.  I mean, look, Governments are shutting down countries, right?  I mean, nothing is getting shut down to protect children from sexual abuse and exploitation.  What that says is we haven't taken it seriously enough.

Governments have not taken it seriously enough, and also industry to some extent hasn't taken it seriously enough either.  So I think that's what IJM has learned from being on the ground.  All of these things are important, but in the Global South, don't forget: step 1, step 2, step 3, strengthen the core infrastructure of the justice system first.

>> MODERATOR:  Thank you so much, John.  And I think most of what you have said strongly resonates with the work we are doing at the 5Rights Foundation to really enable more proactive and action-oriented frameworks for Governments to create, not only focusing on a specific illegal aspect of CSAM, but asking what a child online nurturing framework looks like to empower the sectors to do what they need to do to prevent the risks and address the harms.  That's really important to emphasize.

I don't see any questions in the Q and A, so I will just continue talking to my panelists, because I love doing that.  I'm going to go back to Signy for a moment.  The message has been sobering for most of the people on the call, and I want us to be more optimistic as well.  Signy, you have been telling us that over the last few years you have started working more proactively with the ISPs, Internet Service Providers, and they have become more open to hear you and learn from you and with you over the years, to become more proactive in removing material.  What was the trick, if you will?

How did you get them to become more positive players in this ecosystem?

>> SIGNY ARNASON:  That's a loaded question.  If I can be completely frank, The New York Times article.  You start to expose the rot, then companies start to become a little bit nervous about how the public feels about what companies are choosing to leave up and what they are taking down, and often it was just the worst of the worst.  So we have certainly been on a journey with those companies who are prepared to be talking to us, and there are many of them, certainly, to educate them on what this issue means.

So when you are getting a notice related to, you know, a CSAM-adjacent image, where the four corners may not meet the threshold for something illegal, but it's incredibly destructive and harmful to the victim because they are a known survivor, that needs to be removed and there should be no question about keeping that up.

The arguments that come back are sometimes really odd: does the survivor really want this material removed?  It's just so out of touch with reality, and certainly not in step with what is happening for survivors.  So we have absolutely seen progress, and we are more than prepared to give credit where credit is due, but at the same time I do not feel sorry for industry.

They are making billions of dollars.  I feel sorry for children and we will always land on the side of children and how we are not doing enough for them.  And just so that I can actually talk about a little bit about prevention quickly, I think it's really interesting, we bought into this narrative that it's parents' responsibility to keep children safe online.

And companies and industry have fed into that notion, which is quite frankly insane.  The idea that parents can manage and know everything their kids are doing online is absolutely impossible.  So while we do need to educate children about being safe online and being responsible digital citizens, we need to be creating platforms that actually are established with children in mind.  Otherwise, don't have kids on your platform.

If you have adult pornography on your platform and you have children intersecting with it, you are going to end up with problems.  Beeban mentioned OnlyFans; you have got companies like Twitter as an example, where they are the gateway.  They are the advertiser, leading teens over to that environment, where then they are deciding that, oh, I will monetize my sexual material.

And it is creating an absolute disaster online.  So while prevention is, of course, essential, we are focusing on the wrong side of the coin.  Quite frankly, we have got all of this harm occurring online, and then Governments are actually in a position where taxpayers' dollars are going towards ramping up law enforcement efforts and prosecutors.  We are stacking it on the wrong side.

We need a framework that, of course, has civil society in mind, but a framework that holds companies accountable for the systems that they create and they are profiting from.  And they are profiting to the tune of billions of dollars.

So we are seeing a shift, and it's important, and we will continue on that journey, but I really, to be quite frank and honest, the reason companies are starting to care what we think about it is because they are determining that we have actually got big mouths and we are prepared to say where there are problems.

>> MODERATOR:  Thank you.  I think Emilio wants to build on what you just said.  I think it resonates with European Union's current movements.

>> EMILIO PUCCIO:  Thank you so much.  I wanted to say something precisely picking up on what Signy just left us with: prevention.  In my experience, having worked for five years in the European Parliament with politicians, I must say that it is difficult for politicians to understand that it's more important to invest in prevention.

Because, you know, with prevention, you don't see the fruit of your investment immediately.  You see it afterward.  So it is also something that is very difficult for politicians to sell to the electors.  It gives more strength to say we invested so much, we increased the number of police officers out there.  Even visually, it gives more of a sense of security to put more police officers out there.

So unfortunately, also from the legislative perspective, it's much more difficult to make people understand that it's more important to invest in prevention.  I have to say, and with this I conclude, that recently, in the European Commission strategy for a more effective fight against child sexual abuse, after multiple calls from the European Parliament, this has been understood and widely shared at that level, so more is going to come and happen in prevention, in preventive measures, and the European Commission will soon set up a prevention network.  So I just wanted to end on this positive note.

>> MODERATOR:  Thank you, Emilio.  We now have a question, and I will just read it from one of our colleagues: how can design for safety become part of industry culture?  I think that's a billion-dollar question, but does anybody want to address it in particular?  Maybe you, Beeban, and maybe you can also take the question asked around the culture of balancing different aspects of the sexuality of young people below 18 and above 18.

>> BEEBAN KIDRON:  I will start with the first question, because I have been asked this question about on the one hand, on the other hand, and I have been told that I am against even 18-plus pornography, and I just want to put on the record that I am not against 18-plus pornography as a political stance.  I personally don't use 18-plus pornography, and I have some issues around misogyny and consent, but in the context of this conversation, I want to be clear that I am talking about the routine delivery of 18-plus pornography into the hands of children, and in the U.K., at least, the average age at which a child first sees hard core pornography is now 8.

So I'm actually talking about a child whose sexuality may be very different and blossoming at any age, but I am absolutely sure that it is not a great idea that your first idea of your own sexuality, or indeed other people's sexuality, is a gang bang delivered into the school playground by your phone, unasked for.

So I want to make that point very strongly.  On the other point, I would absolutely say slightly guilty as charged, because I do believe in balancing rights.  I think what I was trying to say, and maybe I should have been more careful in saying it, is that I do not understand this balancing between child sexual abuse and the commercial benefits of companies, rather than balancing overall.  So the balance sheet of benefit from a corporate perspective is what I was talking about, and thank you for the comment, because I will say it more carefully next time.

I do also want to just pick up on one thing, which actually speaks to both of the questions I was just asked, which is around recommendation loops, around automated systems.  If I actually say something ghastly, terrible, even the worst thing, in my own room here in London, and it is accessible only to those people that I can tell about it in a personal way, that is very much more limited than the way things are spread in ways that are deliberate and automated.

I think that we have to look, and I'm talking about rings of pedophiles here, I'm talking about the sorts of spread of material that I was talking about in the broader part of the conversation, at the fact that this stuff is spread in overdrive, and I think that that is one of the places where regulation has to now start looking.  Because it is one thing to say that anybody can say and do what they like within the law, and quite another to say that you have an absolute right to spread that all over the world irrespective of the societal or individual damage, or indeed the age of the person who receives it.

And then finally, if I might say, actually educating children to deal with a toxic world that is built on the commercial interests of a handful of people is not an adequate response.  Absolutely we must educate kids, and actually one of the things that I do with my spare time, not that I have much, is data literacy for children.  That's a very different thing from E‑safety, that's a very different thing from digital literacy, but data literacy really helps them understand how they are being nudged and pushed by the information around them, including their own actions, and makes them feel actually both angry and strong, I would say.  But we cannot, we cannot let the final answer be that we educate children for a bad environment.  We have got to clean up the environment.

>> MODERATOR:  Thank you so much, and I don't think that anybody could agree more with what you have just said.  John, do you want to chip in?

>> JOHN TANAGHO:  I wanted to add one thing to the prevention conversation.  I don't want us to miss the fact that there are different forms of prevention.  There is going upstream and understanding what are the causes or factors that lead to offending, but I think also what IJM has learned in our experience is if you take a situation where you have complete impunity and no one is ever held accountable for abusing children, and you replace that with some measure of accountability, then you actually do create a deterrent, and that deterrent is itself a form of prevention.

So, for example, in several of our projects in the Philippines and in Cambodia, we have seen significant reductions in the prevalence of child sex trafficking, in-person child sex trafficking in bars and brothels and in street red light districts, because you have justice systems that were stood up.  And all of a sudden, in a situation where traffickers were never arrested or prosecuted, now they are being arrested and prosecuted by the dozens or hundreds.

So I think obviously the justice system's importance to deterrence is not the entirety of prevention.  There are so many other ways that you guys are discussing, and I couldn't agree more, but I think we don't want to lose sight of that, especially in the Global South, where you do have pockets where there is just complete impunity.  And so if you are not afraid that you are ever going to get caught for producing and selling child sexual abuse material, I will tell you what, that is the number one reason that people will do it: because they are not afraid of getting caught.

And so we have to just never forget that strengthening the justice system to create a deterrent is actually part of prevention.  I think we just take it for granted, you know, around the world, because we have, you know, justice systems that more or less are functioning at some, you know, effective level.

I think the other thing I want to comment on in terms of prevention is this idea of child sexual abuse materials getting online in the first instance, which I think is just actually ridiculous.  I think every tech platform should choose to use PhotoDNA and other tools to detect and block CSAM at the upload.  Because if you think about it, it is contraband.  It's illegal.  It's not a matter of freedom of expression.  Nobody has a right to possess the material in the first place, let alone introduce it into a platform where there are children and other people.

I mean, just imagine with other contraband, like guns: if you are not allowed to bring a gun into an airport or into a mall, where do they check for that gun?  When you are walking in.  There is a metal detector, and so I think PhotoDNA and other AI tools should be like metal detectors.  They should be set at the entrance: when you are stepping into the platform, if you are carrying contraband, you are blocked.  You can't bring that in, because it's illegal, and so nobody has a privacy interest in possessing child sexual abuse materials, let alone uploading them.  So I think we have to have a higher standard for this idea of preventing CSAM: not just taking it down once it's on, but it shouldn't get up there in the first place.

>> MODERATOR:  Thank you so much, John.  I think we have a couple more minutes left, and I do want to summarize the key points you made, but I also want to give you a 30-second chance to make one final point for the audience, given that there have been some questions around safety by design, questions around the role of users in prevention, and thinking about all of the issues you have mentioned around survivors, impunity and normalization.  So, 30 seconds if you have a message for the audience.

Let's start maybe with Signy and then go to Emilio, Beeban Kidron and John.

>> SIGNY ARNASON:  The message is that Arachnid has changed the lens on what is happening with children online.  It is the evidence‑catcher.  So back to John's point about companies blocking: we are seeing companies that say they are doing it, and yet Arachnid is finding images, so we are seeing that they mustn't be doing it in this area.  There is an inconsistent approach.

So at the end of the day, our stance is we have an epidemic on our hands.  We shouldn't be running around patting ourselves on the back thinking we are doing exceptional and extraordinary work.  Who has time for that?  We really need to be talking about the failures, and then about how we shift this dialogue and change what's happening for children.  And while it would be nice to believe that, you know, there wouldn't be any need for regulation, we believe there is a need for regulation.  We need Governments to step in, to lead, to protect children as they do in the offline world.  We would never accept in the offline world the things that are happening to children online, but somehow we have turned this environment into all about privacy.

And we have completely forgotten the ones that actually have no way to step forward and say their privacy is being violated every day.  So this has to change.

>> MODERATOR:  Thank you, over to Emilio.

>> EMILIO PUCCIO:  Thank you.

I couldn't agree more with what was just said, and I actually wanted to add something, picking up on what John said.  I do agree that punishment functioning as a deterrent is very important, but in Europe that is not even an issue, because we have a directive that has made it a crime across all the European Member States, and yet unfortunately we are confronted with a huge number of cases of child sexual abuse.  So as I said before, I think we don't just have to focus on the punitive aspect.  And once again, in the case of prevention, I think something that hasn't been mentioned in this conversation, and it's very important for legislators to take into consideration, is that unfortunately, as uncomfortable as it sounds, we do know that some part of our world population has a problem, has a sexual attraction to kids.

So it's also important to look into this, into the health perspective, the mental health perspective.  So I think this has to be tackled as a public health issue.  And I think a lot more needs to be done focusing on the potential offender.

>> MODERATOR:  Thank you for raising that important issue of potential offenders and offenders.

>> BEEBAN KIDRON:  Maybe I will use my 30 seconds to try and answer Hester's question about parents and so on.  Just to say I think we underuse consumer law in this area, and that sort of relates to something Signy said at the beginning: we don't use our power as consumers, first of all, to stop using the companies that don't do the right thing.

Secondly, and similarly, I think as citizens, parents, school governors and so on, we need to fight to put this on the agenda in a sophisticated way, in the way that these panelists are talking about it, with all of the tensions that there are in the conversation, but to actually say we need a world in which this is sorted, and to put it right at the top.

And I do agree with what was said earlier.  We don't take it seriously enough.  We brought the world to a standstill for the virus, obviously, but this is also a public health thing.  And then really just finally, I spend quite a lot of time with young people.  They are outraged, they are upset, they are anxious, and one of the things they hate the most is standing by and seeing what is happening to their community and to other kids in their environment.  And I just kind of go, we owe it to them, you know, we owe it to them.  And so I think we have to take a citizen approach and a personal responsibility approach: until you have done everything in your own world, you haven't done enough.  Even if you are not an expert.  Thank you.

>> MODERATOR:  Thank you so much.  John, over to you.

>> JOHN TANAGHO:  Yes, I think my last 30 seconds would be everything we do has to be survivor‑centred.  Everything we do has to look at it from the perspective of, you know, those individuals who are being harmed right now, and it could be that they are being harmed because there is still a trafficker or offender sexually abusing them in person to produce new materials or it could be that they are being harmed because they are not in that position anymore but their abuse materials are circulating online and that's an ongoing harm.  We talked about prioritizing things.

And I can't think of anything that is more important than protecting those children from further harm.  And so whether it's strengthening a justice system that can, you know, conduct good investigations, find those victims and safeguard them, or whether it's, you know, ensuring that those materials are removed from the Internet as quickly as possible or not uploaded in the first instance, we really have to focus from the survivor perspective, because nobody is experiencing greater harm than survivors.  We talk about things like, you know, freedom of expression, and privacy, and resources.  But the reality is, you know, all of those things can coexist with doing what's in the best interest of survivors.  And we do that, we do it all of the time.  We always balance things.  You have freedom of expression, but you can't walk into a theater and yell fire.  You have freedom to drive your car, but in most countries, you can't drive at 120 miles an hour, because you might kill people.

So I think we always balance our freedom with protecting people.  And there is no reason why we shouldn't do that when it comes to sexual exploitation of children.

>> MODERATOR:  Thank you so much, John.  We have unfortunately lost Beeban, she had to run, but she sends regards to all, and thank you to all of the attendees as well.  I want to really thank you.  I have enjoyed this conversation a lot, and I think that we started by talking about governance and technology, but I love the way we have really kept it centred on the survivors, the children, the victims, and the work that needs to happen around them.  And I just want to put emphasis on what I think the key theme of the panel has been: we need to stop normalizing and accepting that society and companies and users treat child sexual abuse as an acceptable level of risk, or acceptable for their business model, or acceptable as, you know, a balance between issues around privacy and safety, because that is not helpful.

And going back to speaking about what can happen, I love the way that Signy was saying what helped them advance a little bit in their work was The New York Times articles.  Let's speak more about the issues and find the ways for the people, societies, users and communities to use their power and pressure to make the changes within societies, companies and Governments.  So I think that would be the call that we at End Violence have to our partners, but also to everybody on this panel and beyond.  I want to thank everybody for being here today.

I want to say that this event was part of the End Violence Summit series, which is aimed at inspiring and catalyzing the political and financial commitments needed to end violence against children.  So I think your calls are really aligned to that.  As we enter a decade of action to deliver the Sustainable Development Goals, the global campaign we are launching will support the acceleration of action, especially given the context of COVID‑19 and beyond.

So thank you, everybody, once more.  And I hope that the attendees have had a chance to learn something and engage.  Please be in touch with us, and we look forward to hearing from you.

Thank you.

>> Thank you, bye‑bye.

>> Thank you, Marija.  Bye‑bye.
