IGF 2019 – Day 3 – Convention Hall I-C – OF #28 Internet Governance with and for the Citizens

The following are the outputs of the real-time captioning taken during the Fourteenth Annual Meeting of the Internet Governance Forum (IGF) in Berlin, Germany, from 25 to 29 November 2019. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 



>> Hello there.  I'm from Missions Publiques, here with our team: Eve, Nadia ‑‑ where's Nadia? ‑‑ and Pascal.

Okay.  We welcome you today to this open forum.  We are very proud and happy to have you here.

It's a project that we've been working on for two and a half to three years, and it was quite a crazy idea: to bring citizens into internet governance.  And when we say citizens, we mean ordinary citizens, people who have no knowledge of what internet governance is, but who have a role to play in that story.  We are very proud of it.  We organized 12 preliminary discussions in 2018 in order to understand the priorities and the agenda of citizens on the internet.

And then in 2019, this year, we organized workshops in five countries of the world: a deliberative process with balanced materials and randomly selected citizens, and the results of those groups we deliver back to policymakers and decision‑makers this year.  So we've been very happy doing this cycle, and we are more than happy that it is not over.  Today we also want to launch next year's edition: next year we want to scale the process to something like 100 countries ‑‑ that's our ambition ‑‑ and we want to have that in June next year.

But today we're going to talk about this phase of dialogs and workshops, about the results on disinformation and digital identity, but also about the messages from the citizens to the IGF, which is one of the topics that we addressed this year.

Maybe now we can have the presentation.

Okay.  So this is our goal ‑‑ to bring citizens into policymaking, which is what we do at Missions Publiques ‑‑ and we are very proud of having gathered a broad coalition of actors to prepare for this.  We had the preliminary discussions and then the workshops.

Today we are also going to have participation, because we like participation ‑‑ we love it ‑‑ so we have a Slido running, and we will be asking some of the questions we asked the citizens during the sessions.  Each time you see the Slido going, we will ask you a question; you'll be able to answer it, and we'll come back to the results later.

We will give you a video impression of the work done this year and of the five workshops held in those countries, so now we can give you this insight.


(Speaking Non‑English)


>> Thank you very much for launching that.


>> So when we say we work with ordinary citizens, let's look at the group.  Among the 300 participants that we had, 49% were female and 51% male, so it's a very good gender balance among the participants.


>> Thank you.

In terms of age, we had participants from 17 to 82, and that's also a very good range for us; the participation was also quite good.

When we look at profession and what people did, we also had very good representation across all types of occupation, so there, too, very good diversity.

Then, when we do such deliberative processes, we like to understand the difference between when people enter the process and when they leave it.  What did they learn during the day?  Did they learn something?  And did it change their opinion?  For that, we have questions that we ask in the morning and again in the afternoon.

And here you can see one we asked in the morning and in the afternoon: do they see the internet as an opportunity more than a threat, or the contrary?  You see that over the day there is a clear shift toward more opportunity, which is quite interesting given that they had a full day on topics like disinformation and identity; people get more confident as they learn more about the topic.

For the next one, we asked them how informed they felt about the different topics.  On disinformation, you see a huge shift between morning and afternoon, on digital identity an even bigger one, and likewise on internet governance, so we learned that people learned during the day and had a good discussion on the topics.

So now we have a question that should have appeared on the Slido.  This is one of the questions we asked the citizens, and you can join in and answer it.

Now we'll turn to our national partners from last year and this year, and I will start with Atha.  You were our partner in Uganda in 2018, and you ran the preliminary discussion.  I'd like to ask you:  Why did you decide to join as a national partner, and how was this experience?

>> Good afternoon.

We decided to join as national partners based on a meeting that we had with Antoine.  I think it must have been at the IGF in 2017 or thereabouts, and what came out for us was the misconception that many people have about the internet.

So when we reviewed the questions and the guiding principles behind what Antoine wanted to achieve, it was something we really wanted to be a part of, something we wanted to join, and it didn't disappoint, because of the issues around the internet that we in Uganda considered ‑‑ not necessarily what the internet is.  So it was an opportunity for us to try and clear the air.  Again, like the results you've seen, many people had misconceptions in the morning, but by evening many of those had been cleared, and we look forward to doing this again next year.

>> ANTOINE VERGNE:  Thank you.

I turn to Peter, Peter?  Yes?

>> Antoine, I just realized the microphone doesn't come out of this headset, but all the other ones do, and I don't understand ‑‑

>> ANTOINE VERGNE:  I will switch the microphone.

>> Well, I can manage with the transcript.  I was just discovering ‑‑ I was hoping it would work both ways.

>> ANTOINE VERGNE:  Maybe it's on the team there.

>> It has nothing to do with that ‑‑ the microphone doesn't come out of the headsets, no microphone will, I'm sorry.

>> ANTOINE VERGNE:  Is it okay for you, Vint, to put on the translation?

(Speaker Not Mic'd)

>> That's not going to work; is that right?  Well, I'll just use the transcript and hope you're very clear so the transcriber gets it right, yes, okay.

>> ANTOINE VERGNE:  I'll try to practice my accent.


>> ANTOINE VERGNE:  Let's see if that works as the French ‑‑

Peter, you are the German partner this year?

>> Yes.

>> ANTOINE VERGNE:  Same question:  Why did you join, and what happened?  How was the experience of joining the project?

>> Yes, thank you.  It's good to be here.

I come from Mannheim, in the southwest of Germany.  Mannheim is a city with 320,000 inhabitants, and two of those inhabitants are also here in this room.  A very warm welcome to Elana and Frank from Mannheim.

It's very interesting for us to cooperate and to work with you, with Missions Publiques.  As for my job, I'm the head of the office for democracy and strategy at the City of Mannheim, and I'm also responsible for the online communication of the municipality, and so we had three reasons, three topics, which made it important to collaborate with Missions Publiques and with you, Antoine.

The first is very topical in this space: a huge relaunch of our digital communication channels.  In this process we are very open to the issues of the citizens and civil society.  We want to focus on the requirements of our citizens, so we are putting similar questions to society as Missions Publiques does, and perhaps we will also get interesting ideas from your process.

The second reason is that we are responsible in Mannheim for citizen participation in political issues, and in the last years we have gained some experience with the random selection and invitation of our citizens for such processes and events.

I knew that you do this also in your workshops, and so it fits together well at this moment.  And reason number 3: there is a huge interest in Mannheim in cooperating with citizens in an international context and with international impact.

We took part in the global meeting of the Global Parliament of Mayors in November ‑‑ it was two weeks ago, in Durban, South Africa ‑‑ and our mayor was elected as chairman of the GPM, the Global Parliament of Mayors, and he is very interested in this collaboration and in the international level.

>> ANTOINE VERGNE:  Thank you, Peter.

I just have to tell you a secret.  My family is from Alsace, and one of the family names is Mannheimer so ‑‑


>> ANTOINE VERGNE:  Today we are ‑‑

>> You're always welcome in Mannheim.  It was last week ‑‑

>> ANTOINE VERGNE:  Thank you, Peter.

And so now we are going to dive into the first topic, disinformation, and show some results.  One question we asked was:  How problematic is the spread of disinformation, for you and for the world?  As you see, people in the five countries see it as quite problematic, but much more for the world than for themselves.  There is a kind of discrepancy between how exposed they feel themselves to disinformation and the impression they hold that the rest of the world is exposed to it, yes?

>> Have there been differences between the countries?  Because I assume this is across all the participants, but you ran it in different places?

>> ANTOINE VERGNE:  Yes, in this case particularly there is a global north‑global south difference, and if you rank from the least problematic to the most problematic, you would have Germany, then Rwanda, Bangladesh ‑‑ in the refugee camp ‑‑ then Brazil and Japan.  The scale is not really exact, but this would be the ranking.

The second big question we asked citizens to work on ‑‑ and this was qualitative work ‑‑ was to rank and discuss how to tackle disinformation, and what came out first was education.  It was the highest ranked, and ranked very high in comparison to the others.  What does that mean?  You have a quote from one of the citizens: people see it as one of their responsibilities, as a person, to get educated and to be active themselves.

Second ranked were fact‑checking tools: algorithmic tests and tools, analysts doing fact‑checking, and relying on third parties in order to tackle and spot disinformation.

And the third one, the least important for citizens, was regulation ‑‑ regulation and self‑regulation by governments and companies ‑‑ so that did not seem to gain much interest from the citizens as a solution.

I would like our two guests in the room to react to that, and I turn now to Cedric.

You are working at UNESCO, and UNESCO works on freedom of expression, on access to information, and also on disinformation; you lead this work at UNESCO.  How do you react to the first result, on education ‑‑ to what people say is their own part and responsibility?  Is it enough for you to tackle disinformation through education, or is there more to it?

>> Thank you so much, Antoine, for inviting UNESCO to join.  We are a strategic partner and fully support this, and I think you showed in your slides how the participants and citizens are learning throughout this exercise, but the numbers are quite staggering:  86% find it problematic; 39% in the Slido, you said, showed a strong exposure to disinformation; and the internet is the first source of information for these people.  So there is a big problem, and, of course, you will not be astonished to hear from UNESCO that we take a rights‑based approach to tackling this situation, and education, as many of the citizens said, is of course the first priority.

Now, is it their responsibility only?  We don't think so.  We have at UNESCO a longstanding program in media and information literacy.  We have it in the curricula, and we help teachers ‑‑ pre‑service and in‑service teacher institutions ‑‑ to teach media and information literacy, too.  We have books.  We are doing research, too.  So education is central, but it's not the only responsibility, I would say.

Of course, fact‑checking tools are important, too.  Self‑regulation of the media, of the providers of news, is important, too ‑‑ it comes second.

As we heard in discussions at the Internet Governance Forum, some thought also that there perhaps needs to be a change in the way that internet leaders see their business models, which is sometimes linked to this.

Now, for UNESCO, we are working on freedom of expression, and we are also strengthening the capacity of journalists: we have journalism education curricula, which we have been updating on disinformation, deep fakes, and so on, and on how to check these.

Now, the first approach I would like to share is that of internet universality and UNESCO's ROAM approach: an internet that is human rights‑based, open, accessible, and multistakeholder‑shaped.  We help countries in assessing that, but we also do very concrete work.

We are, for example, together with the ITU, co‑vice‑chairing the Broadband Commission, where we created a working group on disinformation, with Facebook, Twitter, and many of the key leaders there, and we will publish new research at the beginning of next year.  And we have very concrete measures ‑‑ I will end on that ‑‑ in terms of the judiciary.

We are training judicial operators ‑‑ in Latin America alone we have trained 13,000.  If we're speaking about the same rights online as offline, there's also a need to address that, because more and more cases are being brought.  And I have to excuse myself: I have to run off to the session with our Chief Justices, which is later.

So UNESCO will continue to protect human rights around the world and to develop sustainable ways to counter disinformation, and we are keen to do that with all of you.  Thank you.

>> ANTOINE VERGNE:  Cedric, head off to your next session if you have to, and thank you for reacting to that question and that result and for relating them to the work you are doing.

You have a question?

>> My name is ‑‑


>> I'm from Afghanistan, and I work on internet accountability.  I look at disinformation from the accountability perspective: especially in developing countries it raises more problems, because no one is accountable for it, and I think there should be a law ‑‑ some provisions in the law ‑‑ and there should be policy and regulation for it; otherwise, disinformation will spread more and affect people.  And you cannot stop the technology ‑‑ technology keeps improving, and the internet with it, so this is rising.  I think some countries have taken measures to hold Facebook accountable, and some countries have blocked some of the social media when they faced such issues ‑‑ in China and the EU, I saw they brought in some conditions for Facebook ‑‑

>> ANTOINE VERGNE:  Uh‑huh.  So the third category, regulation, is the way you would choose to tackle disinformation.

>> If there is regulation, and there are some provisions in the law, then there are steps the people have to follow; otherwise it is not a good situation for the cities and the people.  Thank you.

>> ANTOINE VERGNE:  Thank you very much.  Thank you for the reaction.

I turn to Nina.  I wanted to ask you about the Contract for the Web.  The Contract for the Web is described as a global plan to make the online world safe and empowering for everyone, so how does that fit with the results we have shown ‑‑ the emphasis citizens put on education, and the question of regulation?  How does that work with the Contract for the Web?  Do you have to change the Contract for the Web, or do citizens have to change to adopt it?  What is your reaction to that?

>> VALENTINA SCIALPI:  Thank you very much.  Good afternoon, everyone.

Greetings from the worldwide web foundation.

We're happy to be part of this initiative, because the vision of the man who put together the protocol that gave us the worldwide web is that everyone should be a creator and everyone should also benefit, so this is very important.  Ben is here, and he can speak more to the building up of the worldwide web today.  It has always been in the spirit of collaboration and cocreation: that everyone should be able to contribute and everyone should be able to benefit.  It's not just me saying that.

The issues that we are raising in this dialog are the very same issues that brought us to the contract for the web.

If you want to look at what we do as the Web Foundation: we don't just do the contract; we do something else ‑‑ there's a slight difference.  We do three things:  We mobilize coalitions, to make sure the web is safe and is the web we want.  We do research.  And we lead policy thought and change.  It was this exchange, working with all of our partners, that brought about the Contract for the Web.

There are nine principles in the Contract for the Web: three for governments, three for the private sector ‑‑ the industry ‑‑ two for citizens, and one for all of us.

I want to go back to Principle 8 of the Contract for the Web.  It enjoins citizens to use the web, to create, and to maintain civil discourse.

In many places, we have seen people who know what their rights are.  We have citizens who know what governments should do for them.  We have people who know what they have a right to, but nobody is ever speaking about their own responsibilities.  Who does hate speech?  Who brings about misinformation?  It's the users.  So we all have rights, but we also have responsibilities, and that is the reason why we think we should have more debates, more discussion.  No, we are not changing the contract.  We are building it.

Last year we launched the principles.  Two days ago we launched the full contract.  Now we are looking into actions in the countries, and this, we believe, will be one of those actions: let's have a dialog and see what works and what doesn't work; let's see what can be done.

And so it's very important that we work together ‑‑ we, the internet and the worldwide web, and in this case the Contract for the Web ‑‑ to make sure that as we benefit from the web, we balance responsibilities and rights.

And I'm not just talking about government.  There are things that governments should do; there are things that industry should do.  But when I log in, there are also my rights on the one hand and my responsibilities on the other.  It is for everyone.  It is by everyone.  It is with everyone, and we're all in it, and that's why it's one vision: one web, one internet.  We are all here, and we all have a part to play.

>> ANTOINE VERGNE:  Thank you very much, Nina.

And one of the deliberation topics was rights and responsibilities, so I don't have the results here, but they are in the database, and we will exploit them to get some vision and insights on what citizens think their responsibilities and rights are.

I would like to now turn to the results of the Slido.  Okay ‑‑ on this one, three participants, but 100% "very problematic," so you are quite aligned with the citizens, but much more concerned.  There are three of you.

And the next one:  How to tackle disinformation.  Here we have quite a difference.  You can see education first, and then regulation, and not much about fact‑checking, so you don't believe much in fact‑checking tools.  That's interesting.  Does anyone want to react to that?  The clock is running, so let's go.

I was thinking ‑‑ does anyone want to comment on that choice?  Yes, it's changing, okay, it's changing a bit.  Why not fact‑checking tools?  Maybe that's an interesting question.  Okay.

>> Okay.  My question is a little bit about something that was said earlier about disinformation, if I could pose a question or comment about that.  I'll be very brief.


>> Very quickly.  I'm a little concerned sometimes that combating disinformation can have an unintended negative consequence.  Some of the things that get labeled as disinformation can be dissent in society, and I think of examples in my country, the United States, where a number of investigative reporters have found themselves ‑‑ they sort of lost their voice.  They found themselves excluded from the legacy media, so they moved online, but online they have at times been formally labeled as fake, as fake news or disinformation, when, in fact, I think there's good evidence it was dissenting information.  I think we have to be careful to go after disinformation and not to label dissent in our society.

>> ANTOINE VERGNE:  Thank you.  The citizens had a very big discussion on exactly that ‑‑ satirical content ‑‑ and for them this was a big topic around disinformation: when is it disinformation and when is it satire, and how to handle satire in relation to disinformation.  That was, for the citizens, a very big topic.

So, yes, the lines are to be drawn, and I use this opportunity to tell you that we have with us the balanced information materials that we presented to the citizens, where all those definitions are also set out, so that we are sure we have common definitions of what we are talking about ‑‑ of what we meant when we said disinformation in that dialog.

Vint, you wanted to say something?

>> Thank you very much, Mr. Chairman.

I wanted to make two or three observations.  The first one is a counterfactual piece of information with regard to education.  What we have found in the United States is that having a good education does not necessarily prevent you from receiving and propagating the misinformation you look for.  There are situations where even educated people are drawn into the disinformation loop.  We see this in the U.S. with the right‑ and left‑wing online and television media, so we should be careful not to jump to the conclusion that just because people are well educated, they are able to resist misinformation and disinformation.

The earlier comment about dissent also relates to another phenomenon.  If you repeatedly tell people "X" is not true, sometimes they remember the "X" part and don't remember the "not true" part, and so we have found that by repeating a piece of disinformation and saying it's not true, we reinforce the disinformation.  You would have to talk to psychologists about that.

There is a fourth option to the three that I see up there, and that's called critical thinking, and it goes together with fact‑checking ‑‑ except for the problem of figuring out what the facts are and what sources I trust for factual information to help me distinguish disinformation from good‑quality information.  And so even though we should be trained in critical thinking ‑‑ I think everybody should feel a responsibility for thinking about what they're seeing or hearing ‑‑ it turns out to be hard to apply if you don't have a good source of facts.

>> ANTOINE VERGNE:  Thank you for that.  Indeed, one of the subcategories of fact‑checking was critical thinking, as you label it, in the sense that citizens said we have to learn how to spot what disinformation is, so it was part of the insight they gave.

Thank you.

Now we're going over to digital identity, and I will present two results.  For the first one, we asked the citizens which model they would prefer for digital identity:  a model where they have one central identity where they put everything; a model where they have one identity for each use case ‑‑ for each account, each channel of communication that they use; and, in between, a couple of identities ‑‑ possibly one identity for health questions, one identity for everything that has to do with communication, and one for finances, for example.  So these were the three models we asked them to reflect on.

And what we saw is that the one with a couple of identities won, in a way.  The number here ‑‑ the 40% ‑‑ may not be a large majority, but when we looked at the arguments for why they chose that one, what we see is that they were saying the tradeoff is really between usability and security.  For them, having a couple of identities is a good tradeoff between having everything in one place, with the risk of being hacked, and having to create a new identity for each and every service, each and every use.  That was the reasoning behind what this number says.

The second question we asked was about who should decide ‑‑ the question of the governance of digital identity.  For that, we asked them to fill in a table with the different actors we know here ‑‑ the different categories of stakeholder ‑‑ and the level of power each should have in the process of deciding how to govern digital identity.

And what you see are the quotes from one group ‑‑ it's one table, from Rwanda ‑‑ and this table gives the arguments for why they think those groups of people should have this role.  As you see, it's quite representative of the kind of arguments you had at those tables during the deliberation, and the model which came first was co‑deciding, meaning something more than multistakeholder: they had a preference for every stakeholder having a voice and for the system having to find a consensus solution.

What we also saw in the data is that, if this doesn't work, there seems to be a preference for governments and the private sector to decide on digital identities ‑‑ governments and companies.  So that's how the citizens see the governance of digital identity.

I turn now to Sharon.  Sharon, you work on encryption and its impact on digital society.  How do you see these results in relation to your work, and how do they relate to it?

>> Thank you, Antoine.

It's a great pleasure to be supporting your project, and I hope we will continue to work together next year as well.

Encryption, as you said, is a big part of managing our potentially multiple online identities, and it's a very important tool.  We talked about opportunities and threats; they both coexist in the internet world, but encryption is a very important tool to make sure that the benefits ‑‑ the opportunities ‑‑ still outweigh the threats.

Securing our communications is not a high‑tech thing ‑‑ even going back to clay tablets, people have tried to have some sort of private communication and encryption ‑‑ and whether you're dealing with government, with eGovernment services, digital health, banking, and finance, or simply chatting with friends, we rely on encryption in our everyday life.  So it's not a technical buzzword, incomprehensible to mere mortals and everyday citizens, which is why your work is very important for humanizing the narrative.  Because we need to raise awareness:  there is a global trend ‑‑ mostly for law enforcement purposes, and legitimate purposes, too: for crime prevention, for accessing information when it's necessary to prevent crimes, or when you have court decisions ‑‑ a global trend to weaken encryption or to access encrypted information, through encryption back‑doors or other initiatives, like man‑in‑the‑middle provisions or the ghost proposal in the U.K.

I don't really want to point fingers.  We need to raise awareness both among policymakers and in the community, because if you have encryption back‑doors and weaken encryption, the bad actors will also try to use them, and potentially succeed in accessing that information.  So as we're trying to provide security, we're compromising the security and the privacy of individuals.  We constantly still have this false dichotomy of privacy versus security, but they actually have to coexist: privacy and security cannot be isolated from each other if we're talking about securing our online communications.

So this is very, very important in our work ‑‑ and we also have the security versus security argument now ‑‑ and we really need to humanize this dialog.  While governments and policymakers have legitimate interests in accessing certain information, exceptional access might not be the best idea, as you can compromise network security ‑‑ and especially for vulnerable groups, like indigenous communities and the LGBTQ+ community, encryption matters so that they are not discriminated against.  These are very legitimate problems.  So changing the narrative on encryption is something we should all do, and I congratulate you on your work and hope to work together more in the following years.

>> ANTOINE VERGNE:  Thank you.

There was one question we asked the groups:  Where would you place the cursor between anonymity and transparency on the internet?  And the citizens worked so well that they covered both sides and gave the arguments why, so now we have to extract that qualitative data ‑‑ I don't have the quantitative data ‑‑ but I know they talked about it, and we will look at what they said, what speaks for one side or the other, and whether they have a preference.

>> When you're talking to people ‑‑ again, coming back to privacy versus security ‑‑ the "I have nothing to hide" argument is still very strong.

For a start, privacy is not about hiding information; it's about empowering people to decide whether or not to share their information.  So the "I have nothing to hide" argument fails every time.

There are several ways to address that.  I usually ask for their banking password or some other very private information, and then they start to think.  It's a matter of sending out the message on that, too.

>> ANTOINE VERGNE:  Thank you.

And then to your neighbor, so you have been waiting the week, the entire week, and you're still here, and I thank you very much, and the opportunity thank you very much, and you're the first partner on board so thank you for that.  You will be in the beginning including citizens in the discussion is something very important, and I remember our discussion on Geneva, and you said you have been on board, and now you've been working at the core of the IGF process.  Is it still a good idea or how do you relate that to the week that you have been experiencing until now?

>> Thank you, Antoine.

And congratulations on the very good achievements.  We think it's very important, because one of the aims that we set ourselves as the humble host country ‑‑ knowing what this process is about, and so forth ‑‑ was to enlarge the scope of the debate by various means: by involving parliamentarians, by involving SMEs, and also by involving what you call ordinary citizens ‑‑ people who are not confronted in a day‑to‑day context with all these internet discussions that we are having here, but who are, of course, exposed to the results of what we are discussing here.  I think that's really crucial.  You could argue that the citizens' opinion is also somehow represented by the parliamentarians and by the governments ‑‑ if they're democratically elected, they reflect what is going on ‑‑ but still, in such a complex and fast‑changing world, it is crucial for this community to have direct feedback from the citizens as well.  So really, I think this kind of involvement of the citizens should be continued and should be enlarged, and we very much welcome the diversity of the citizens that you involve.

As you said:  gender diversity, regional diversity, stakeholder diversity ‑‑ that's so important.  Not every single voice on its own, but all together they give a really nice picture, a mosaic that emerges from these small pieces, and we should definitely try to keep this alive.

>> ANTOINE VERGNE:  Thank you very much.

Yes, reaction here.

>> Yeah, I don't know if I have permission to ask a question or not, but I have a question about the privacy of individuals' data, and that there should be encryption. When we encrypt the data, there is going to be some cost. Will that cost be borne by the customer, or will it be paid by the responsible service provider? I don't know who will pay for this and for the cost of protecting the information.

>> Do you want to answer that? The cost to the individual?

>> There are always several business models for that, but ideally we support having strong end-to-end encryption by default, provided to the end user, so that the burden is not on the end user financially. That is the ideal situation. Of course, we are hearing some concerns coming from service providers too, but the ideal is that every end user should have the opportunity to have very strong encryption by default.

>> ANTOINE VERGNE:  And now to Vint. Vint, you were a member of the High-level Panel on Digital Cooperation, and we have seen, in the governance part of the digital identity results, that citizens would like to decide on a model. Is this something for IGF+ and that track? What is your take on the wish of citizens to have a model that goes beyond multistakeholder?

>> So this is a pretty big topic, so I'm going to pick a few bits of it.

With regard to the High-level Panel, what we were looking at was digital cooperation across international boundaries. And with regard to identity, let me pose for you one of the more important benefits of having strong authentication.

If you have the ability to assert that this is me, that I signed this document or I made this statement, then you can defend yourself against someone else trying to claim to be you and trying to do something in your name, which is effectively disinformation.

So strong authentication is your friend here.  This is not an argument that says everything you do has to be strongly authenticated.  It's an argument for having the tool available when you need it and want it.

You can easily imagine having different levels of strength in the ability to strongly authenticate. In the case of a contract, for example, whether it takes place domestically or internationally, you might want very, very strong evidence associated with the cryptographic credential, so that it would be hard for someone to engage in a contract that draws you into a commitment you didn't make.

On the other hand, you could imagine very lightweight kinds of authentication, where the identity information disclosed is limited to whether you're an adult or not, or whether you live in a particular locale or a different one, without giving all the other details of your identity.
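
The two levels Vint describes, strong attestation versus minimal disclosure, can be sketched in a few lines. This is only an illustration, not any real identity scheme: a hypothetical issuer attests a single attribute ("adult") with an HMAC, so a relying party learns that one fact and nothing else. Real systems would use asymmetric signatures or zero-knowledge proofs rather than a shared secret.

```python
import hashlib
import hmac

# Hypothetical issuer secret; a real issuer would guard this in secure hardware.
ISSUER_SECRET = b"issuer-demo-secret"

def issue_attribute_token(attribute: str) -> str:
    """Attest exactly one attribute (e.g. 'adult'), disclosing nothing else."""
    return hmac.new(ISSUER_SECRET, attribute.encode(), hashlib.sha256).hexdigest()

def verify_attribute_token(attribute: str, token: str) -> bool:
    """A relying party holding the issuer's key checks the attestation."""
    expected = issue_attribute_token(attribute)
    return hmac.compare_digest(expected, token)

token = issue_attribute_token("adult")
print(verify_attribute_token("adult", token))   # True: this one fact is attested
print(verify_attribute_token("minor", token))   # False: nothing else is covered
```

Because HMAC uses a shared secret, the verifier here must trust and hold the issuer's key; deciding how that trust is distributed is exactly the kind of question an international standard would have to settle.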

So this leads me to the following; I don't want to misrepresent this. My personal view, derived from having participated in this High-level Panel, is that there would be good reason to have more than one identity available for different purposes: purpose-built identities instead of a single one, because if you have only one and it is somehow penetrated, then all bets are off.

However, I have to tell you as an engineer that there's this little technical problem. At Google we make very heavy use of strong authentication. We use two-factor authentication. We have a physical device that has the cryptographic keys in it, and we register those devices, so that we can't even get into our systems without using the two-factor authentication. So I feel really good about that.
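
A common form of the one-time code used in two-factor authentication is TOTP, standardized in RFC 6238. The hardware security keys Vint mentions actually use a different challenge-response protocol (FIDO/U2F), but TOTP is easy to show end to end with only the standard library, so here is a minimal sketch of that construction:

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, timestamp=None, step: int = 30, digits: int = 6) -> str:
    """Time-based one-time password (RFC 6238, HMAC-SHA1 variant)."""
    if timestamp is None:
        timestamp = int(time.time())
    counter = struct.pack(">Q", timestamp // step)   # 8-byte big-endian time step
    digest = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: with this secret, at t=59s the 8-digit code is 94287082.
print(totp(b"12345678901234567890", timestamp=59, digits=8))  # 94287082
```

Both sides derive the same code from a shared secret and the current time, so the server can check the code without any message from the device other than the six or eight digits themselves.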

And then I think I have about 300 different accounts scattered around the internet for different purposes. If I had to have a separate physical cryptographic key for each one, I'd have a big bag full of these things, and I would be trying to figure out which one to use. This is an opportunity for somebody to build a product that can hold hundreds of cryptographic credentials, so you could use the same device, calling on the appropriate one. That also suggests standards, which also suggests digital cooperation across boundaries to establish standards, not only for the technical side of things but for the bona fides that you present in order to authenticate before you get your credential. So I hope this turns out to be a very rich opportunity, a very concrete thing we could do in the digital cooperation space. From the IGF+ perspective, we should be bringing as many use cases as we can to the people who could develop these products and services, so that they build something that turns out to be actually useful and usable.
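
The "bag of 300 keys" problem amounts to one holder selecting the right credential per service. A toy sketch of that shape (all names here are invented for illustration; a real product would keep the keys in tamper-resistant hardware and follow a standard such as FIDO2 rather than this made-up challenge-response):

```python
import hashlib
import hmac
import secrets

class CredentialStore:
    """Toy holder for many per-service credentials behind one interface."""

    def __init__(self):
        self._keys = {}  # service name -> that service's secret key

    def enroll(self, service: str) -> None:
        # Each service gets its own key, so one breach does not expose the rest.
        self._keys[service] = secrets.token_bytes(32)

    def authenticate(self, service: str, challenge: bytes) -> str:
        # Select the credential registered for this service and answer the challenge.
        key = self._keys[service]
        return hmac.new(key, challenge, hashlib.sha256).hexdigest()

store = CredentialStore()
for svc in ("mail.example", "bank.example", "forum.example"):
    store.enroll(svc)

# One holder, different credential per service: responses to a challenge differ.
nonce = b"server-nonce"
print(store.authenticate("mail.example", nonce) != store.authenticate("bank.example", nonce))
```

The point of the sketch is the separation Vint argues for: compromising one service's key tells an attacker nothing about the credentials used anywhere else.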

Could I ask a completely unrelated question just to get it on the table?

The data we just saw from this exercise showed a significant shift in attitude from morning to afternoon. It's really important that the reasons that led to that shift be exposed to the rest of the population. So a very important question is: how are you going to get that learning into the hands of the rest of the population, and not just the 75 people who happened to be in the room in Tokyo at the time? You don't need to answer the question now; I just want you to know that it's really important.

>> ANTOINE VERGNE:  Yes, it is. Indeed, one way we like to deal with that is to work with national partners, so that the results can be used in the countries to raise awareness. And of course, if we start dreaming, we could have such dialogs in thousands of places every year, and then enough people would learn from them; but that is maybe something we have to achieve in the coming years.

Max, you have something to say?

>> MAX SENGES:  I'm Max, and I also work with Google, and I have the pleasure of working with Antoine and his team on the academic advisory board; hence I have a fairly good understanding of the project. Congratulations that you got to this point. It's really awesome.

A couple of points, building on what Vint said. I agree that use cases are a very, very good thing to consider as we discuss governance points. However, and I do not speak for Google here, I personally would strongly disagree with a co-voting model that involves citizens. I think the multistakeholder governance model is quite evolved, and there are different roles for different stakeholders, and it's a misunderstanding that the governance actually happens here, right? We're exchanging; we're deliberating; we're thinking about solutions, and then every stakeholder goes back. The companies are not involved in the law-making; the governments are not involved in the product-making and the coding. Hence everybody has their role, and the idea to bring in parliamentarians to represent the people is really, really good, a big step forward.

We have 120 parliamentarians participating this year, and I think the participation by NGOs, which really bring expertise, act as efficiency watchdogs and human rights watchdogs, and contribute to the conversation, is really good.

Allow me to add to Vint's points about the benefits and the qualities of this exercise. Understanding the reasons for the shifts in opinion is really the strength of this exercise: to actually understand how we can come together. The qualitative analysis, the data between morning and afternoon, is really, really important, because basically it shows that a normal user questionnaire will not get you the right results. If you just went out and did a direct democracy exercise and asked people, "Do you want this or that?", the answers change really significantly after a thorough deliberation of the pros and cons, so probably we should not simply go out and ask people on the street about it.

Very importantly, and I don't know if you mentioned this before (if you did, apologies): the balanced briefing materials are a very, very valuable resource for this community, because they are peer-reviewed, and they list not only the challenge itself and the topic itself, but also the different options that are on the table for how to solve it. And not only are these options listed; the materials even include the pros and cons and explain what's good and what's bad about each.

In this case, we have put the balanced briefing materials, at least for the better part of the exercise, on the IGF Wiki, which you can find at IGFWiki.org, and I hope we continue to evolve and update them as a resource for the community. Thank you.

>> ANTOINE VERGNE:  Thank you, Max, and thank you for the coordination and collaboration.

Andrey, you wanted to say something, and then I give the last word to Elena.

>> ANDREY SHCHERBOVICH:  I'm with the national partner of the global dialog in Russia, representing the National Research University Higher School of Economics, and I'd like to make a proposal, just for discussion.

I formulated this proposal: maybe it is possible to create a next stage of debates, focused not within a single country but internationally, with representatives of different countries and stakeholder groups. I think such a forum for debates would be useful.

>> ANTOINE VERGNE:  Thanks, Andrey.

We talked in the morning about the next steps in deliberation and multilingualism, and I think you're right. This is something we would like to test next year, also in discussion with other partners. Thank you for your recommendation; I'm happy to work with you on that.

Now I turn to Elena. You're here; you participated in the dialog in Mannheim, and you are bringing with you a message to decision-makers that was produced in the different workshops, and I'd like to hear you on that.

>> We collected different statements in the workshop, and I had to choose one. The one that I chose was: we wish that in all schools children learn the different parts of the internet, the hardware, the software and the processes, so that one can become a responsible internet user. The reason I chose that statement is that, being at this conference now, I was surprised and also shocked at how many of the words and terms used here I still don't know. We had a lot of discussions in the workshop about disinformation and internet security, the things we're discussing now as well, and I think to be able to handle those risks, and to tackle the myths that we as citizens have about those things, because we are always hearing about them and have a certain insecurity about the internet because of that, we just need to know more about it: more about the processes behind it, about how the internet really works. Then we can really start to make use of the internet in the best way. For me, the first place to start getting that information is at school, because there we become familiar with it and get the necessary information. Then citizens will also be able to participate, and if you asked me on the street, I could give you the answer.

>> ANTOINE VERGNE:  Thank you very much, Elena.

I have two contributions here. But before that, I would like to hear Frank, the second citizen who participated in Mannheim in Germany and who also brings a second message; then I will take two more contributions, and then we will close the session.

>> Okay, oh, it's just you.

>> There we go.

>> Okay, okay. We had a lot of topics, didn't we, and we had to choose one of them, and this is what I have chosen: we think it is key that the protection of our personal data be examined as a priority at the Internet Governance Forum. The applications that will need a digital identity depend on it. We the citizens, all of us, have a right to transparency.

In the end, ask the question: who is the owner of the data? Who can create data? Who can change data? Who can manipulate data? And who can delete data?

And the last point is a very important point: do I have to carry that data along with me, or not anymore ‑‑


>> That I have to keep it in a secure way, I mean. And I think with artificial intelligence, some companies or some states and governments can create new data that I never put into the system, data in the background, used for control, used for manipulation, or usable for new business models, but related to my identity. I have never seen it.

And the most important point from my perspective is to have transparency. This is what I'm missing the most. At the moment I have no transparency. I can look at, let's say, the data I created in the different software systems and the different applications, but what is behind it? I don't have any clue what's going on there, and that's the point: we need rules. We need rules worldwide, because the data centers and the applications are running worldwide, and not in Germany, to be brutally honest. We need, let me say, international regulation, an international catalog of what is going on, what can be used and what cannot, and especially for the deletion of data there should be an agreement on an international basis. That's very important from my side.

>> ANTOINE VERGNE:  Thank you very much, Frank.  This point on data was a big discussion all around the world, so that was very important.

You had one question, and then I turn to you, Vint.

>> Well, it's not exactly a question, more a comment, but I will be very concise.

First off, I would like to thank you as an organizer, because contrary to many of the sessions I've attended here at the IGF, this one has been straightforward and easy to understand. And this goes to an important point I would like to make, because I'm here representing the Council of Europe, and also as an ordinary citizen. When I think of the debates we're having here at the Internet Governance Forum, oftentimes we're very technical. We use very difficult words that are not very accessible to a lot of the population, and I think it's important to remind ourselves that each and every one of us here is also a user, and that what we're deciding about is essentially the future ‑‑ or well, we're not deciding, but what we are talking about is the future of the users.

So just as you said earlier, education is very important, but what is also very important is that the topics we're talking about become accessible to the ordinary citizens of this world who are affected by them, so that they too are empowered: that they can bring in their own views, and that they are empowered to decide on their own what is important to them and what is not.

So in terms of developing this format, I would actually be highly in favor of a format where ordinary citizens are much more involved in the structure of an IGF, in this dialog and in internet governance. And just to make a last point: in the Council of Europe's youth department, we use a co-management system where we, as representatives of the youth organizations, work together with representatives of governments to formulate policy, and this forces us not to talk in a technical policy language but to break down what we're discussing into something accessible and easy to understand for the average young citizen in Europe, and I think that's very important. For that reason, I would like to call for a more easily accessible dialog. Thank you very much for your contribution; I think it's very valuable.

>> ANTOINE VERGNE:  Thank you very much for your comment.

Vint, I give you the floor, and then we close the session.

>> Thank you very much.  Sorry, I have to switch back and forth because of the microphone problem.

Two comments. First of all, with regard to this question of understanding how the internet works: it's a pretty big, complex system. You know, before we allow children to drive cars in the United States, they have to take a course called driver's training. Maybe we should have an internet driver's license, where you have to pass an exam showing you understand enough about the internet to navigate this complex environment safely. It's only partly tongue-in-cheek; I really think something like an internet driver's license would be a great course to have in school.

The second point has to do with what my friend Frank had to say. There's a very interesting problem on the internet, which is that information about you ends up on the network coming from sources other than you; in particular, it comes from your friends, your family, your colleagues, and people you don't know. I don't know how many photographs go up on the net into social media. Sometimes they were pictures taken of someone else, but you were caught in the picture. I don't know how to cope with the fact that there's a lot of stuff about each of us that shows up in places on the net we never go, and that we would not even know how to search for.

For example, if you wanted to discover where every picture of you is on the net, the only way that would work is if we used really good facial recognition, and some of us run away screaming when we think about that possibility. So this is an almost unsolvable problem: figuring out how to track down and be aware of information about you and your business that you didn't put into the system, but that somebody else did, even inadvertently.

>> ANTOINE VERGNE:  Thank you very much, and thank you all for your contributions and for being here this afternoon.

We are now preparing the next phase, and that's 2020. We want to scale the process, and we would like to invite you to be part of that. As you can see on the last Slido, you can be a partner for a country or a strategic partner, and we would be very glad to have you.

We have paper copies of the materials you were mentioning, Max; if you would like to take one, you can find them here, and you can talk to Benoit or discuss with me or anyone. I really, really thank you for having contributed and for being part of this project. Thank you very much.