IGF 2021 – Day 1 – Lightning Talk #85 Rapid notice & takedown - the key to getting child sexual abuse off the internet fast

The following are the outputs of the captioning taken during an IGF virtual intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> DENTON HOWARD: Good morning, everybody, I want to check that my audio is all working and confirmed.  Can somebody put a message in the chat, please.

 

>> We all live in a digital world.

We all need it to be open and safe.

We all want to trust

>> And to be trusted.

>> We all despise control.

>> And desire freedom.

>> We are all united.

>> DENTON HOWARD: Good morning, everybody, I don't know who you are, Claudia, good morning.

>>  CLAUDIA:  Good morning, now I'm here.

>> DENTON HOWARD: And you're from?

>>  CLAUDIA:  Berlin.  At the Federal Ministry of Families.

>> DENTON HOWARD: We have a number of participants here this morning, I think we have a total of 15 at the moment ‑‑ 14 at the moment.  So my presentation will start in just a second, give people one or two minutes to jump in, and then we'll get this party started.  Just while we are waiting, rather than repeat, you'll hear a lot of stuff from me, from today's presentation, what are you looking for?  What information would you like to know?

>>  CLAUDIA:  Wasn't expecting that question.

>> DENTON HOWARD: I won't put you on the spot.  Hold that thought.  When we get to the end of the presentation, we'll come back to it.

We are just a minute past the hour.  Claudia, could you disable your video, and we'll just jump in.  Otherwise I'm not sure when the screen share will happen.

So we'll just give it another minute to allow one or two more people in.  I'm just looking to see if there is anybody in the audience that I know.  There isn't, so I can tell you loads of lies and nobody will know any different.  Just bear with me two seconds and we'll do a little countdown.

Just get my presentation set up.

If the computer will do its magic.

Claudia, can you see my presentation with the word INHOPE on the screen?

>>  CLAUDIA:  Yes, I can see everything.

>> DENTON HOWARD: Okay, perfect, great.

So what we'll do, I just want to check one thing on the chat.  Great.

So, guys, what I'm going to do is get started with this morning's presentation and really, I'll do the presentation and then we will do any Q&A afterwards and that kind of stuff.

With that said, my name is Denton Howard, and I am the Executive Director of INHOPE, which is the International Association of Internet Hotlines.  I had planned to be in Katowice, as many people sitting here this morning also intended to be, to meet colleagues in person, to share a coffee and hopefully a beer, but sadly, for most of us, due to the COVID situation, travel plans had to be curtailed at relatively short notice.  My plans only changed last week.

Now that we have moved this lightning talk online, hopefully the technology will allow me to convey the information and ideas that I wish to share with you today, and hopefully you can take that knowledge and apply it to your own environment, your own legislative area or your own policy area.

So the title of my lightning presentation is rapid notice and takedown, the key to getting Child Sexual Abuse Material off the internet fast.

I chose this title because, one, it sounds snappy, and two, I had a publication deadline.  But just like lightning, my presentation will be fast, hopefully a little scary, and it will also make a small impression, or maybe even a major one.  Joking and lightness aside, I want to articulate the scale of Child Sexual Abuse Material online, and how, by working together with the relevant stakeholders efficiently and in a coordinated way, we can basically make the internet a better and safer place, getting the material offline as fast as possible.

You'll notice I use the word rapid a lot; you'll understand why toward the end of my presentation.  Before we dive into the detail: I don't know what everybody's level of knowledge is, so for the purposes of this, I'm going to assume you have no knowledge or background in this subject area.

With that said, I want to address the question of what Child Sexual Abuse Material is, and you may think you know, but sometimes it needs to be made very clear.  So first and foremost, I'm sorry to be very blunt, and where I am at the moment it's 8:00 in the morning, I don't know what time it is where you are, but Child Sexual Abuse Material, or CSAM, which is the acronym we use continuously, is the recorded sexual abuse of a child.  CSAM can be a video, it can be an image, it can sometimes be in an audio format, and what is illegal depends on where you sit.  Certain countries have different legislative approaches.  But it is really, really important, and I cannot stress this enough, that behind every image and statistic you might hear, or number you might hear, or icon on a screen you might see, there is a child who is a victim, and they should be at the center of everything that we talk about and everything that we do.

Also, we all see the lovely picture of the kid, the victim.  We have to remember, there are also perpetrators.  The perpetrators are the bad people who do these things to the child.  I know that sounds simplistic, but it's important to drive that home.  I want to repeat what I just said, just to be sure it's clear: the recorded sexual abuse of a child.  Again, it takes a victim, and it takes a perpetrator or perpetrators doing it to that child.

Each instance of CSAM can be copied and shared an infinite number of times.  We can create a million copies of a file with literally four clicks of a button.  So the point is, we can then also share that to an infinite number of locations, in an infinite number of countries, not infinite, but hundreds of countries, again in a very short period of time.  Our objective is to stop this as quickly as possible when we become aware of it, via notice and takedown, and I'll explain that in detail in a minute.

So we also want to get to the root file so that it can be removed, and again, I'll come to the why in a moment.

So you may hear certain people from privacy rights groups, or people who have questionable interests, saying that viewing child sexual abuse material is a victimless crime.  It is not.  A video of the abuse of a child is a crime scene, and it's really important you get that idea, because each time that video or image is shared or viewed, that person is, in effect, abused again, and the person viewing it is abusing that child again.  So we use the term revictimization.

CSAM that is shared online results in a continuous process of re‑abuse, and again, the cycle continues each time it is copied and re‑shared.  And that is just unbelievably traumatic: many victims may be able to come to terms with the physical abuse that happened to them, but the fact that the recording is constantly being re‑shared, and the abuse is in effect re‑happening, is beyond what a lot of people can handle.

With that in mind, for the victim, we can't undo the really unspeakable things that have happened to that child, but we can try to stop the recording from being re‑shared and redistributed.  Also, and we always hold to this objective, we hope that a child can be rescued from the perpetrator, or from a difficult environment or a situation where they are being taken advantage of.

Now, the iconography I've used on screen is kind of generic, because normally when I give presentations on child sexual abuse material, and I say this in the loudest possible terms, we never, ever show images of CSAM; we could not do that.  In my job, while I do have to go to environments where child sexual abuse material is analyzed, thankfully I support and enable the people who have to do the analysis of that content, but I'm lucky that I don't have to deal with it myself.  Personally, I wouldn't be emotionally equipped to do the really tough job analysts have to do.  I'll come back to that point in a little while.

We never, ever show images.  That applies across the board, with one exception, and I want to talk to you about that.  The one exception is this.  The face of the child that you see there is a child called Thea Pumbroek.

Now, I can show you this because it's in the public record, it's in the public domain, although you may have a problem finding it on the internet.  She died in 1984, when she was 6 years old, in the Netherlands, of a drug overdose administered to her during the recording of her being sexually abused.  Now, I know you're all intelligent people: that was 37 years ago, but images of her being abused are still shared online today.  That's 37 years.  If you can imagine, images that aren't even great quality are still being shared; people share everything they can get their hands on.

I'm telling you this to draw your attention to the fact that, rather than this being a statistic or an image or an icon on the screen, there are people behind it.  Now, I won't go on about this, but I want to draw your attention to an article written by a very good colleague of mine, Nick Moran, on what happened to Thea.

So my point really is to draw your attention to the key issue: the material.  With this in mind, while we can't change what happened to Thea, we can try to get the content of her, and of the hundreds of thousands of other victims out there, taken down, and that is why notice and takedown is so important.  That is the process of getting the content taken offline.  I'm not sure what everybody's level of technical knowledge is, so I'm keeping it to basic principles; we can get to technical questions at the end if anyone wants to deep dive.

So with that said, hopefully I've scared you a little bit, and hopefully you all took a mouthful of coffee and thought, oh, my God, that's terrible.  If I've achieved that, that's a success.

But as I said earlier, now that I've told you the issue and the problem, I want to give you a little bit of background and address how we deal with it.  As I said, I'm the Executive Director of INHOPE.  INHOPE is a global network of hotlines, or, in some countries like the United States and Canada, what is known as a tip line.  Currently INHOPE has 46 member hotlines in 41 countries.

I'm very glad to say that next week, on the 15th, that number will increase to 50.  We have four new hotlines joining the network, and it is momentous to break the half century mark, because we originally started in 1999 with six.  So it's a great week to announce that information.

As a network, we provide a collective service to hotlines around the world and connect them together.  That includes technology platforms, which you'll hear about, capacity building, training, best practices, representation, coordination and many other aspects and elements to support hotlines in their mission, which is our mission, which is to detect and remove Child Sexual Abuse Material from the digital environment.  I said digital environment, I didn't say internet.  The reason we say that is that the world, the technologies and the platforms are changing all of the time; the internet is just a piece of hardware, a network of networks, and it evolved into what we commonly know as the worldwide web, and newer technologies are constantly coming forward.  So it's a terminology change that I'm using, and I would recommend others start changing the way things are worded, because if we say internet specifically, something that isn't on the internet might seem like somebody else's concern.  Anything you can see on a screen should be what we are all concerned about.

So rather than rabbit on: if you want more information, go to INHOPE.org and familiarize yourself with what we do and where your nearest hotline is; you can select whatever country you're coming from and learn as much as possible.

With that said, I wanted to give you a little bit of a run-through