IGF 2019 – Day 2 – Raum I – WS #170 Children's Privacy and data protection in digital contexts

The following are the outputs of the real-time captioning taken during the Fourteenth Annual Meeting of the Internet Governance Forum (IGF) in Berlin, Germany, from 25 to 29 November 2019. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 

***

 

>> WILLIAM BIRD:  Good morning.  Everyone, we're starting late because there are so many of you.  Hello, everybody.  Can you take a seat in a circle, please?  Preferably, since there aren't that many millions of us, you can generally face towards me for the moment.

Good morning, everyone.  Welcome to workshop 170, which is undoubtedly going to be one of the cooler workshops, and you will be able to go home and tell your friends how fantastically good it was, because we really do have a ridiculously, fantastically cool panel.  Maybe people weren't aware of who was on the panel, which is why there aren't the thousands of you that we were expecting.

Be that as it may, I'm William Bird, and I'm thrilled and delighted.  For those of you who are German, thank you very much for hosting us in your country.  It's very lovely to be here, despite the cloudy weather.  Daniela and Frederik and ‑‑ sorry.  Okay.  We do have an official photographer with us, who is standing at the back.

Given the subject of this particular workshop around privacy, et cetera, is there anyone who would object to having their picture taken during the session?  Obviously not outside of this.  Can't tell what you are going to do in the privacy of your own time, but if there is anybody uncomfortable having their picture taken during this session, let us know, raise your hand, and our photographer won't take any pictures of you.

Are you happy?  Were you just waving in wild excitement?  I'm talking to you.  Yes.  You're happy?  You were just waving your hands because this is going to be such a good session, right?  What I'm going to do now is talk about what we're going to do in the session, because we have very little time.

We really do have a spectacularly cool panel with us this morning.  I'm going to introduce them each with a one‑liner, and they will have three minutes each, and we have an evil timer that is going to hold them to their three minutes.  We are going to stick to the three minutes.  Each of you will have found a finely crafted piece of paper with a list of notes, and you are going to answer the three questions on it.  You'll have a discussion and go into all of these.

To begin with, in regard to our ‑‑ was that my time?  Oh, wow.  We have Sonia Livingstone.  Is there anyone in the room that doesn't know who Sonia Livingstone is?  Sonia?  Everyone knows who Sonia Livingstone is.  That's cool.  If this conversation is the sort of thing that jingles your bells and you don't know who she is, you can go and Google her, and you will find it thrilling from there.  She's a legend in this field, certainly to most of us.

Then we have Rebekka Weil, head of Trust & Security for Bitkom.  Some of you may be familiar with her here in Germany.  Then we have Gehad Madi.  Then we have our own legend ‑‑ he just stood up, all the way down in the corner over there.  He heads up one of our programs on child rights, digital literacy, and digital identity.  And we also have Mr. Steffen Eisentraut, head of International Affairs.  Wait until I get this wrong.  Even worse.  You're going to shoot ‑‑ that was okay.

So we've got an amazingly cool panel.  Thankfully, we have two of our young people with us.  By young, I mean they're children.  We have Joy in the back, who comes all the way from Macaulay House, which is a school in Johannesburg, and we have ‑‑ if you think his first name is a little difficult to pronounce, you can just call him Race Cap.  He has been eyeing all the Mercedes‑Benzes he has been seeing on the streets here.  They have got a cool role.  They'll be floating around between your different groups and sharing their views and perspectives, and they'll have remarks at the end of it.  They may move around now.  Tom is going to start.  You get to the pedestal or this podium.  Should we start with you, Steffen?  Please.  Will you join us here?

Do you want to do it from there?  Okay.

>> STEFFEN EISENTRAUT:  So, good morning, everyone.  First of all, excuse my rough voice; I have a bit of a cold.  I would first like to say a few words about our work for the protection of minors on the internet, which we carry out on behalf of the media authorities, the youth authorities, and the Federal Ministry for Family Affairs, and about our central activities in this context.  We deal mainly with the rights to participation and protection, and we come across content that is problematic in terms of youth protection and even criminal offenses.

However, when it comes to children's privacy online, we enter very complex territory.  Although we see more and more debates about personality rights and data protection, about sensitive pictures of children on social media, or about kid influencers on video platforms, we are far away from a consensus in society on how to deal with such phenomena.  Also, unlike many other problems we are dealing with, be it child sexual abuse material or propaganda, with infringements of the right to privacy we are often in a gray area, and there is often no regulation.  There are not many options for action for us besides raising awareness.  We published two reports recently concerning the issue of privacy.  One is about kids' photos on Instagram, and when we started to scroll through the photos in our research, it immediately became clear that only a few users made sufficient effort to hide the children's identities.

The other report is about child influencers on YouTube, where we looked at some of the most popular kids' YouTube channels in Germany.

Since privacy is a personal issue, which only the person affected can determine, we developed some guiding questions to find evidence, such as: does the video reveal personal data?  Almost 94 percent of children were filmed in private or intimate situations: in bed wearing pyjamas, during their body care routine, or sick with swollen eyes and a runny nose.  In order to give you an impression of what we are talking about, I brought these screenshots.  47 percent of all cases showed children in embarrassing situations.  Embarrassing pranks, funny challenges, or angry outbursts might be amusing for the viewers, but this can expose children to mockery, harassment, or bullying.  How can we tackle this problem?

Many factors are necessary, including sensitizing parents.  We can also hold the IT companies responsible.  Most of the services have an age restriction, of 16 years for YouTube or 13 years for Instagram, but the services are used by younger children nonetheless.

We have to note there is a huge gap between what community guidelines say and what the reality is.  In the discussion I would, therefore, like to emphasize the need for risk management on social media platforms that takes into account that the needs and demands of young users evolve as they grow older.  IT companies have to find a better balance between young users' need for protection and their need for participation according to their age.  They have to acknowledge both the right to participation and the right to protection.  Thank you.

(Applause)

>> It's probably easier for people to see.

>> REBEKKA WEIL:  Oh, yeah.  I didn't bring slides because three minutes is ‑‑ yeah, it's easier.  Okay.  I'll wait.  It's not ‑‑ okay.  Thank you, first of all, for inviting me today.  I'm Rebekka.  We are the federal association for the digital economy, so I feel you kind of addressed us a minute ago, because child protection and the rights of children and minors are a very important issue and topic for the tech companies, of course.

Everything is becoming digital, and we have to address the fact that children are on the internet.  They are using digital services, and tech companies are already doing quite a lot to protect and safeguard children's rights.  It starts with digital ID mechanisms and, of course, with terms and conditions that say who is supposed to use the service.  Some apps are already designed in a very child‑friendly way to guide children into using the service responsibly, but to my mind, protecting children's rights is not something only the tech companies can achieve, because we have to work together.

We already heard some very strong examples just a minute ago showing that the content that is available online is often not put up by the children themselves.  It's put up by the parents, by other people, and I guess the key to addressing content we really don't want to see on the internet is educating every single user.  When parents put up pictures of their children, they really should take a step back before they post content online and ask themselves whether they would want their children to see this specific picture or video they filmed at home in five or ten years.  I think that's the first step we have to take.

If we do that, I think we can achieve something important, addressing all the people who are already concerned with their child's safety and with protecting their rights online.  We also have to address all the content that is really illegal, and doing that, I think, needs to be a global debate.  I don't know whether you all remember the huge dispute that was going on about the famous picture from the Vietnam War.  It was filtered from the internet, and that is something that is not supposed to happen, because we want to use the internet for educational purposes as well.  It can be a blurry and very fine line between what we are supposed to see on the internet, which is really educational content (the picture is famous for specific reasons, and it needs to be shown), and what we don't want to see on the internet.

I think if we address the educational issues first, we can then together with the tech companies develop a framework that will really protect children in the end.  Thanks.

(Applause)

>> PHAKAMILE KHUMALO:  Good morning.  Don't steal my three minutes, Sonia.  Good morning.  My name is Phakamile.  I'm from South Africa.  I run the Web Rangers program in South Africa, a digital and media literacy program that empowers young people with digital literacy skills.  As more and more young people begin to engage in the digital world, I'm afraid that parental interventions, government legislation, tech support, and tech policies are just not enough anymore to ensure that children's privacy and protection is something that is on the world stage and not only something for the powerful.  I think without children's participation, without them voicing their opinions, without them having the power to lead discussions like these, then we really aren't reaching the kind of impact we want to have, with children as leaders in the issues that affect them the most.  How do we get them to become active digital citizens and use the internet responsibly and safely, and also take advantage of the opportunities the internet has to offer whilst being safe?

The first is online safety, not just teaching them how to be safe online.  The second is equipping them with critical thinking skills and helping them become critical consumers of the news and information they receive, knowing what is credible and what is not.  The third one is knowing how to protect their data, and knowing what different organizations and digital platforms use that data for.

The fourth one is being in the position to use the internet for good.  I think we all agree that there is a huge possibility for our young people to use the internet for good, but it's a long process to get them there, to get them to understand the possibilities and opportunities that the internet has to offer for young people and how to use them for good.

The last one is being able to actively participate in policy‑making processes on issues that affect them and that are related to children.  We brought two beautiful examples with us: Web Rangers who are active digital citizens and can participate in forums like this because they've gone through a digital literacy program that has empowered them to talk on important issues that affect them.  How do they participate meaningfully and fully in discussions like these so we can drive solutions based on their experiences?  If we are going to talk about these issues without having them in those discussions, then we aren't going to get anywhere.  Please join my section.  It's going to be really cool.  Thank you.

(Applause)

>> WILLIAM BIRD:  Gehad, join us please.  Are you going to speak from there?  Okay.  He will be speaking from there.  Please.  Your three minutes begin.

>> GEHAD MADI:  Excuse my voice, as I'm suffering from a bad cold like my neighbor here.  Within the context of the digital environment, this is first and foremost addressed to states, because it is the responsibility of states to enact measures providing for the protection of the rights of children.

Also, the most important basic provision is Article 16 of the Convention on the Rights of the Child, which expresses the right to privacy.  Privacy is also essential for the exercise of other rights: freedom of expression, freedom of assembly, and, above all, protection.  You cannot separate the right to privacy from the protection of children once they are using the internet.  Threats to their privacy arise from the acts of others, of course, whether parents, peers, or strangers, as well as from the data processing activities of public institutions, whether educational, public services, or others.

Children's privacy can also be undermined by their own actions.  They don't understand the specific functioning of the commercial practices of the services they use, and they do not have the knowledge to protect their privacy.

We have found that in some cases parents monitor their children, which of course shapes how children are able to be online.  At the end of the day, any measures which may restrict children's right to privacy must be carried out in accordance with the law.  Of course, it is the responsibility of states to enact such measures and to monitor, which is very important, the activities of the providers of internet services.

It seems that nothing remains a secret now when it comes to information.  I was told once that your information is in the cloud.  Everything, everything is in the cloud.  It is not only Big Brother.  I want to end here by raising a question: is privacy absolute?  We have to discuss this issue while we still have time.  Thank you very much.

>> WILLIAM BIRD:  For the final three minutes, can you bring up the slides, please?  Sonia, thank you.

>> SONIA LIVINGSTONE:  Thank you so much.  My name is Sonia.  I'm from the London School of Economics in the U.K., and I'm a researcher, and I'm going to talk to you about the research.  William, I need a clicker, please.

So I begin with some questions which are raised by children.  Thank you.  These questions come from workshops that I recently ran with children around the U.K., but I think children in many countries would ask exactly these questions.  I want you to think about how creative and how diverse and how thoughtful these questions are.  They've come from children aged 11 to 16, and they immediately take us, I think, beyond the question of personal privacy from their parents or from their friends, which is important, and which others have talked about.

They also take us to the question of their privacy in relation to the state and in relation to business and commerce.  These are the areas where we're not engaging children enough, but they are aware, and they have, I think, some fantastic questions which we are not yet giving them answers to.  This is just a selection.  There could be many more.

In the research we went around schools and tried to encourage children to think about their privacy and their data in the digital environment, including the interpersonal relations which are now mediated through the digital world, and also the ways in which the platforms and services in which they conduct their lives, including their education, are now also collecting their data in crucial ways.  Perhaps the best game we played was to give them a choice: here are all the different kinds of information.

Not just your name and your address.  Those are important.  But also what you like, who you know, how well you're doing at school, whether you have special educational needs, what your religion is, what your sexuality is, where you go.  And we said, who do you want to share this with?  Your school, your doctor, your family, your friends, the platforms, the government, nobody?  When you provide the option "nobody", it's a popular answer.

The project was done within a child rights frame, so we also wanted to give the answers back to the young people insofar as the answers are available.  And so we built a toolkit, which is at my privacy.UK.  I made it dot‑UK because I don't want to overclaim for other places; these may be different issues in different parts of the world, and that is for you to discuss.  We wanted to provide answers.  What's the issue with online privacy?  Who has my data?  Who is tracking me?  What are my rights?  What can go wrong?  This is an interesting question.  It's not obvious to children.  They know what can go wrong when their parents share something embarrassing.  They know what can go wrong when a stranger discovers their GPS and knows where they are, but they're not sure what goes wrong when Instagram collects their data or their government collects their data, and they want to know.  We also wanted to put in what children want, what they ask, and some advice.

How to protect my privacy, where to get help, and some games.  We have crowdsourced the answers from many organizations around the world who have produced some brilliant resources.  In my workshop, I'm going to talk about parents, and we also made a brief for parents, just two pages, and a tab on the toolkit for parents, because parents don't know the answers to these questions either.  The really crucial thing then is to help parents learn with their children, and teachers too; teachers don't know the answers.  My question in my group is: how can we empower parents to know some answers, and how do we promote regulation so that parents don't have to be responsible for all of these complex issues?  Thank you very much.

(Applause)

>> WILLIAM BIRD:  Please disperse into your sections.  You don't need to panic, because our superb experts are going to rotate.  For those of you who are sitting down, this is a chance for a bit of exercise.  You can go there to sit, and we're going to rotate the experts every ten to 12 minutes.  You've got 12 minutes with each of them, at the end of which the bleeper will go off, and they will stand up and rotate.  In the period that they're with you, use your notepads and write things down.  We will then collect those from you as they rotate, and we're going to be putting them up on that wall over there in the corner, where you can see they're color‑coded to the things on each of your finely crafted leaflets.  I hope this is clear to everyone.  If it isn't, I'm sorry.

You'll have to watch what others are doing and work it out from there, because I don't have time to go through it again.  We've got one, two, three ‑‑ the people along here, if you can join this big circle here on this side of the table.  Then I'm going to send off our experts to go and join the groups.  They're going to have to pick a circle to go to and go there, and then we'll start rotating them.  Thank you.  We're going to have to have another one.

>> Which stakeholder group?  They'll say more guidance is needed.  Can we do some even spreading?  The people that have just come into the room, and the lady in yellow, if you can join this group over here, just so we can spread some of you out.  Then can we send some of the people in your circle over to here?  The smaller the circles, the better, because then you have a much more detailed discussion.  People from Phakamile's group, five of you stand up and join Steffen.  Thank you.  Daniela.  Where is Daniela with her timer?  Daniela with the timer.  Okay.  There you go.  Those other two or three, join this group over here.  Then we're going to start the timer.  Okay.  We are starting on 11 minutes instead of 12 minutes.  Your time begins now in your groups.  Begin.  Start talking.  Start noting things down.

(Working in groups)

>> WILLIAM BIRD:  Two minutes, groups.  Two minutes to write your stuff down.  Okay.  That's fine.  Experts, I need you to stand up and rotate clockwise.  Sonia.  Okay.  Experts, you need to rotate.  Steffen, you're coming this way.  Phakamile.  Ms. Coralo.

As you stand up, people from the teams that I'm hosting will come and take your post‑it notes from you.  Please do not be alarmed.  They're not stealing your ideas.  They're merely taking them to post on the walls.  Experts, arise and rotate, please.  Experts, arise and rotate.  Doctor.  Please, Rebekka.  Thank you very much.  I'm calling you all doctor.  I don't know whether you are.  Rotate.  Thank you.  Have you got your new person yet?  Phakamile joined you.  Sonia, you need to go around.  Thank you very much.  Your next ten minutes, 11 minutes, starts now.  Begin.

(Working in groups)

>> WILLIAM BIRD:  Two minutes, everyone.  Two minutes.

That's the ominous sound of rotation.  Okay.  Ten seconds to wrap up your last points.  Then experts, rotate.  Experts, rotate.  Steffen's voice is getting tired.  He is going to a bigger group.  Thank you.  Rotate.  Rotate, please.

(Working in groups)

>> WILLIAM BIRD:  Rotate, please.  Dr. Livingstone.  Please also remember to put your thoughts down.  Those of you who haven't yet had a chance, put your thoughts down on the post‑it notes.  Thank you very much.

Two minutes.  Wrap up the conversation.  Please hand your notes to the people in your groups or the people who are coming around collecting them.  We have time for feedback so everyone can get a sense of what happened in some of your group discussions.  This group over here, second group.  Okay.  What I would like you to do is raise your hand.  The thing at the top of your leaflet is what we're going to ask you to do just before we wrap this up, so start thinking about those questions: I liked it, I learned it, I would suggest.  Stay where you are.  Sorry.  I thought you were getting ready to rotate.  Anyone who wants to talk to the general group, to give a bit of a sense of the overall discussion, something that you learned, something that you thought was interesting in response to the three questions, please stand up or put your hand up for now, and then we'll bring it over to you.  This is your chance to share something you learned in your group, or something you would like to share with the rest of the people in the session.

No one wants to say anything?  Really?  Is there anyone who wants to give a comment to the group?  There's a hand here.  That lady there, please.  If you can give us your name when you speak, and stand up, please.

>> My name is ‑‑ I am from ‑‑ it's difficult to say where I am from now.  I work in research in music, and I liked the participatory dynamic very much.  I think it was very productive.  I learned about different initiatives in the world.  We shared together many reflections and actions that are being taken.  I wish that all together we can achieve a safer internet, particularly for children.

>> WILLIAM BIRD:  Thank you.  You know what else?  You are all talked out.  What I'm going to do now quickly, against my orders, is take one more comment from you.

>> Hi.  Thanks a lot.

>> WILLIAM BIRD:  Can you give us your name, please?

>> Oh, yeah.  Rebekka.  I think what came out of all of the discussions is that if we want to address children's rights and the internet and data usage and data sharing, we also have to have a conversation with the parents, and I think that's not being done enough as of yet.  I will take that back to the companies, of course, but I would like us all to address that idea in the future, because I think it's the best way to inform children and really reach a state where they're better protected on the internet.  I found that most interesting.

>> Lovely.  Sonia, yes.  I'm going to ask each of our panelists to give, like, their 30‑second wrap‑up.  Then I will have the real experts give us their views.

>> People are very keen to have an answer to the question: what can states and companies do to help parents?  Parents are really struggling with these issues.  We tried to bring in what expertise parents can bring and how children can be involved, but really the sense is that parents need a lot of different kinds of support, and we have to recognize there are very different kinds of contexts.

>> I think the role of the parents is the most important.  When we talk about children, everything starts at home.  The upbringing of the child before the child goes to school is the role of the parents.  The media should disseminate all information to parents and caregivers.  There should be continued communication between parents and children in order to build trust, and trust is the key word between parents and children.  Of course, when children become adolescents, they virtually lose trust in their parents, and they see themselves with their friends.

>> They have to understand how to handle children.  Building trust with a child and listening to the child are the most important aspects.

>> WILLIAM BIRD:  Thank you for those words.  Steffen.

>> STEFFEN EISENTRAUT: Thank you.  What we found interesting in our discussions, where we discussed what tech companies could do or contribute: there are some companies that already do this with videos, with child‑friendly videos like comics, and it isn't boring at all to watch these videos, and one understands everything.  Companies can provide guidelines in a child‑friendly way, and they should start to recognize that young users are using their services, and they can provide a safe environment for young people.  At the moment we have age restrictions like 13 or 16 years, depending on the service, if you look at YouTube, Instagram, Snapchat, or WhatsApp, all services that are used by lots of young people.

I think this gap between the age restriction and the reality has to be closed, and companies should start to acknowledge that their services are used by younger users, that young users want to participate, and that it wouldn't be an option to just shut out or exclude all the young people from these big platforms.  We also talked about YouTube Kids, which could be a good start, a good step.  It's interesting for very young children, 8‑year‑olds or so, but what about the age group between 12 and 16?  There's nothing there for them.

(Session concluded)