IGF 2019 – Day 0 – Room II – High Level Internet Governance Exchange Panels on Data Governance and Safety

The following are the outputs of the real-time captioning taken during the Fourteenth Annual Meeting of the Internet Governance Forum (IGF) in Berlin, Germany, from 25 to 29 November 2019. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 

***

 

>> Excuse me, ladies and gentlemen.  May I have your attention, please?

So actually, the main program right now is a little bit delayed, and, yes, this all depends on our panels right now, on all nine panels.  So, yes, feel free to drink another cup of coffee or another glass of water or something like that, and come back in about 20 or 30 minutes.  Thank you for your attention.

>> JOHN DENTON:  But it must be able to satisfy the needs of civil society, the business community and citizens more broadly, and be inclusive.  The ICC represents world business, not just northern hemisphere business but also the southern hemisphere, developing and developed countries, big companies and small companies.  If the Internet is not governed in that way, then, frankly, what are we doing here?  I mean, that will happen, won't it?

So our aim has historically been to ensure the free flow of goods and services, and now the free flow of data; that is part of the challenge for the ICC.  But for us, the free flow of data must protect citizens' rights and privacy, and enable the Internet to be protected on appropriate security and cybersecurity issues as well.  And in that context, we launched a campaign to make technology work for all.  And it won't work for all unless we have appropriate governance over the issue of data and we get the balance right between sovereignty and the interests of those participating in the Internet.

By the way, one thing I did notice in the previous plenary is that one voice is not represented anywhere here, and hopefully it will be: a third of the users of the Internet are actually under the age of 18, and where are they?  I mean, hello!

But it is actually kind of alarming that here we all sit around, and yet one of the great user groups, which is youth, is not appropriately represented in this discussion.  So, of course, Minister Mamonov.  Instead of waiting for Wolf Hisserich, if I could turn to you, Jonathan.  We only have 30 minutes.  So over to you.

>> JONATHAN KALLMER: It's a real pleasure and honor to be here.  ITI, my organization, represents 70 of the world's most innovative companies.  We have our largest presence in Washington, D.C.  We also have an office in Brussels, and we do work around the world.  It strikes me that the motto of the IGF this year, "One World. One Net. One Vision", is particularly appropriate for the topic that we are going to discuss.  And in particular the notion of sovereignty, which is a really important term.  It's being used a lot, and it reflects what I think is a very legitimate, challenging question that actually came up in the opening speeches we just saw: how to reconcile very legitimate national considerations, values and preferences that occur at the national, regional or local level with what is ultimately a global endeavor, and a global imperative.  I think it's almost cliche now to talk about the ways in which the Internet depends on an open, global system and approach to succeed and to deliver all of the benefits of standards of living and growth and innovation to the world.

Tackling the very legitimate public policy challenges, of which there's none more important than how to resolve issues around data rights, is similarly global.  And so the idea that I wanted to leave in the few minutes I have is just to implore people, regardless of nationality, regardless of position, whether you are with industry, with government, with civil society, or otherwise, not to cast those perspectives aside.  They are incredibly valuable and make us who we are.  But we must recognize that the challenges we face ultimately are global in character.  We can only truly overcome them by viewing them at that level, and with that kind of a mind‑set, I think we are best positioned to succeed.

So thank you, John, I will yield the balance of my time to the rest of the panel.

>> JOHN DENTON: We are working like an egg timer here.  We have the pleasure of having Minister Mamonov with us, a user of Telegram.  I would like to know how you see enabling policy environments being formed to make all of this happen, so the Russian experience would be very useful for us.

>> MIKHAIL MAMONOV: Okay.  So although you would need someone to speak on behalf of the youth, I would rather say that Russia exercises a rather conservative approach to data management in many aspects, and there are reasons for that.  I will maybe dwell on them in my speaking points.  So when we speak about the trusted realm of our cyberspace, if I can put it this way, or information space, we have three different blocks here: trusted networks, trusted data, and trusted infrastructure.  You cannot have trusted data unless the other two elements are in place.  So ensuring the trustworthiness of data in our world is a really, really challenging issue that has nothing to do with data per se.  What we see is that, in a data‑centric economy, Russia is a small market, even if united with the Eurasian union markets; in terms of data, it's a small market.  We cannot advance our AI without developing norms and principles for the exchange of data and other big data information with, first of all, our European partners.

And for us, identifying the unified norms of data exchange, at least on the continent of Eurasia, is of paramount importance.  The other thing that is necessary: I will speak on another point that I want to highlight here, to leave a little bit more time for Q&A later.

We in Russia do not really control or somehow register fake news per se, although we do acknowledge the problem, because lots of individuals, corporations and political entities, for, I don't know, commercial or political gains, seek to disseminate information that is, at best, not qualified, or that at worst can be called fake news.  This is a huge problem we are trying to deal with.  And I don't personally believe in registers where we label some companies, I don't know, media agencies and some individuals as fake news providers.  We need to be more nuanced here.  Nuance is everything in the digital economy.  In Russia, we are now developing the national data management system to increase the trustworthiness of data in society.  It's basically a nation‑based endeavor.  The main idea is to set the principles on which depersonalized data will be made available to businesses and society, thus ensuring smoother provision of civil services and bigger gains for Russian local businesses.  And, of course, we are ready to work here with the international community to improve our national data management system; unfortunately, we cannot yet speak about exposure to best practices in this sphere.  We don't have any best practices so far.

One of the biggest challenges of the digital economy is that its phenomena do not receive proper legal clarification within the necessary time frame.  This is one of the challenges, and it has an objective nature; the question is what to do about it.

So basically, I think I have already abused my time.  My speech is somewhat hectic, but when you speak about data, it's everything; it's hard to encompass everything in one speech.

>> JOHN DENTON: But you did extremely well.

Of course, we are out of order.  I would like to flick back to the beginning; it's a bit like a piece of modern art.  Wolf Hisserich is with us.  I will ask you to make some comments.  It's all well and good to talk about the power of the Internet and interconnectedness, but the reality is: what if civil society, what if citizens, don't have trust in it?  There are declining levels of trust in most institutions.  How do we ensure that civil society and citizens have trust in the Internet and the interconnectedness that we are all arguing for?

>> WOLF HISSERICH: Thank you.  I prepared a short presentation.  I'm sure you will see a lot of those, so we will just skip it.  I represent Qwant, and I'm in charge in Germany of the international expansion.  There's a presentation ‑‑ I will talk about it.  Are you fine with that, if I talk about it, or do you want me to show the presentation?

Should I hijack the time?  So very quickly ‑‑ hmm.

Not good, hmm?

I need to talk about it.

So it's a search engine.  We just talked about Russia; I know the founder and CEO of Yandex, for those who are not familiar with that search engine.  We are living in the world of Google, which is the de facto monopoly in many areas.  Why is it so important to have choice?  You go to a supermarket, you go to a French supermarket, you have choices like crazy.  But entering the Internet, we always take the same door.

Talking about trust, a lot of people have a lot of question marks regarding today's management of data.  We are not tracking and tracing.  I think this is important.  And like I said, with Yandex, you see it diversifying.  It's now a very strong shopping portal, and you see the taxis, not only in Russia, but in Istanbul and Tel Aviv.  And there you see a little bit of a development.

We at Qwant strongly believe in data security and data privacy.  Search engines, by the way, are not only important for us as users, as customers; you can also use them for government and industry.  Think about a search engine of objects and the next level of the Internet, web 4.0.  Why doesn't IoT work like it should today?  Because there's a big lack of trust.  If you are able to match this, if you are able to build, from a European point of view, a strong counterbalance in a world where the winner takes it all ‑‑ we truly believe at Qwant, from a European perspective, that it is highly important on the topic of European digital sovereignty, which once sounded like a lame joke and is now something quite different.

That is the bit where we see our role.

>> JOHN DENTON: Thank you, Wolf.  Hopefully we will have time for questions.  We heard about the concepts of trust there and data sovereignty, and the potential that is argued for interconnectedness and what it unleashes.  How do we balance this, Sonja, with the rights of consumers?  Who is going to talk about that?  Are you going to talk about consumers?

>> SONJA JOST: Well, I can.  I can talk.

>> JOHN DENTON: In my notes it says to direct the question to you to talk about consumers.  So I'm only doing what I'm told.

>> SONJA JOST: Sure, I can talk about consumers.  When we talk about consumers, we are also talking about data.  And if you have a look at the moment, you see two opposing groups when it comes to this topic.  The first group says, well, data are very, very individual.  They are supposed to be private, and the individual who generates the data should own those data.  While on the other hand, you have a group that comes from the view of society.  They say, well, you can benefit a lot if you link different data sets together.  You can find out patterns and, for example, use these patterns to heal certain diseases.  So for this reason, data can be, or should be, used, for example, by start‑ups, by small companies, for the benefit of all of us.

I do not really think that there's a contradiction between both views, because there's the opportunity to make data anonymous.  You can generalize data in a way that all the different relationships between the data are preserved, but you cut off the certain string of those data that leads down to an individual person.  And if you cut off that string, the data sets are still very, very valuable.  They contain a lot of information.  You can create a lot of different business models out of them.  And it is not necessary to make money with data through surveillance capitalism only.  There are a lot of other opportunities out there.
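
As an illustration of the kind of string‑cutting being described, here is a minimal Python sketch that replaces direct identifiers with salted pseudonyms and generalizes a quasi‑identifier, so records still link to each other but no longer to a person.  The field names and salt are hypothetical, and real anonymization needs more than this (for example, k‑anonymity checks on the result).

    import hashlib
    import hmac

    SECRET_SALT = b"rotate-and-guard-this-key"  # hypothetical; keep separate from the data

    def pseudonymize(user_id: str) -> str:
        # A keyed hash: records from the same person still link together,
        # but the value itself no longer names anyone.
        return hmac.new(SECRET_SALT, user_id.encode(), hashlib.sha256).hexdigest()[:16]

    def anonymize_record(record: dict) -> dict:
        # Drop direct identifiers, keep the analytically useful structure.
        return {
            "pseudo_id": pseudonymize(record["user_id"]),
            "age_band": f"{(record['age'] // 10) * 10}s",  # generalize 34 -> "30s"
            "diagnosis": record["diagnosis"],
        }

    print(anonymize_record({"user_id": "alice@example.org", "age": 34, "diagnosis": "A10"}))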

So we believe ‑‑ I also founded my own start‑up several years ago, and I was for several years a board member of the German Start‑up Association ‑‑ we do think that you can make a lot of money with data without harming the privacy of the individual.  For sure, it depends a little bit on where you grew up, where you are coming from.  I grew up in the liberal world, and I strongly believe that I should have a free will and should decide freely the way that I spend my money.  And just to make it a little bit more specific, what happens if you do not cut these strings from the data down to the individual?

What happens nowadays is that all of those strings are used to build a very, very close net around the individual person.  So we cannot really decide freely.  We are controlled.  We are manipulated.  The Internet I see is different from what you see, or you.  It's totally different.

So let us cut these strings by regulation; this is what I claim.  Let the people decide when they want to share their individual data in order to get a recommendation for a restaurant or whatever, but nevertheless let us use all the information within data sets in order to create new business, to help society to progress further.

Thanks.

>> JOHN DENTON: Thanks, Sonja.  So do you have any perspectives at this point?

>> SIGRID NIKUTTA: Thank you.  So, what am I doing here?  I'm slightly different from all the others, because I'm the Chairwoman of the BVG.  The BVG is located here in Berlin, and it's Germany's biggest public transport company.

We operate all these yellow subways, trams and buses in the city, 24 hours a day, seven days a week.  Over 1 billion passengers per year use our services.  And perhaps you can imagine we generate millions of data points every day, every hour.

Traffic data, vehicle data, customer data, personal data.  And, of course, we use these data to improve our services.  But data is not only key for our company; it is the key to a more sustainable city of Berlin.  We use this data to develop our transport services and also to connect different mobility services in the city.

In our app, for example, customers can compare, book and pay for several mobility services, such as public transport, taxi, ride pooling, bike and other car sharing.  Our aim in Berlin is, with this data, to reduce traffic and to foster shared mobility.  With this data, we can really help to improve the customer's journey here in Berlin.

And why am I saying we are different?  The BVG is a public company.  We serve a public cause.  Therefore, the data we generate belongs to everyone.

It's a common good.  But this often leads to a false assumption.  The assumption is sometimes that we must make our data public to everybody and everyone ‑‑ for example, to start‑up enterprises which develop new mobility platforms, as well as to multinational companies.  People expect us to give away our data for a public cause for free.  We believe the opposite is true.  Because we are a public company, we have a particular responsibility for this data.  We guarantee a responsible use of this data without any intention of commercial profit.

We are bound to strict data protection regulations.  Customers will only use our services and give us their data when they know that the data is really safe with the BVG.

You all know this saying: data is the new oil.  Nobody would expect a state‑owned oil company to give away their oil for free as a present for everyone.  And it's the same with data.  In our opinion, politics must watch out that data is not concentrated in the hands of a few multinationals, and instead strive for a common benefit.  I think the common benefit is very important.

So should public companies give away their data?  Of course, if it's for the reason of making a better city, and not for the reason of earning money.  So public data is very important, and I think it's really great when other companies use this public data on a nonprofit basis.  Thanks a lot.

>> JOHN DENTON: Thank you, Sigrid.  I think one of the interesting observations from the data itself is that when a lot of publicly available data is actually put out there ‑‑ and you have seen this, and you have seen different start‑ups use it ‑‑ at the commencement there's been quite a good diversity in the users, but over a really quick period of time certain entities start emerging and actually start behaving in a way which resembles monopolistic behaviors.  So this will be very important as well.

One of the things I like about the Internet Governance Forum is that it brings together a lot of different voices, and I hope, with the debates that go on this week, that will not be lost as we think about the next generation of the Internet Governance Forum.  But one of the key voices that needs to be heard is the operators.  We are talking about all of these new kinds of frameworks; how is that going to impact the way the operators actually play?

>> DUNCAN MCINTOSH:  Good morning, everyone.  I'm with the APNIC Foundation.  I'm bringing the perspective from Asia but, as John mentioned, also from operators.  For those who don't know the Internet registries like APNIC: we provide the IP address resources that every network operator around the world needs to connect everybody's devices.  Everyone in this room, if you go to your settings and look in there, you will find the IP address that you have been provided by your provider, dynamically, on a daily basis.  And that's managed by five registries around the world, including APNIC.
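
As an aside, here is one minimal way to do the "go to your settings" step programmatically ‑‑ a short Python sketch that reports the local address your provider has assigned to your machine.  The 8.8.8.8 target is arbitrary; connecting a UDP socket sends no actual traffic.

    import socket

    def local_ip() -> str:
        # Connecting a UDP socket transmits nothing, but it forces the OS
        # to choose the outgoing interface, whose address we then read back.
        s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        try:
            s.connect(("8.8.8.8", 80))  # any routable address works
            return s.getsockname()[0]
        finally:
            s.close()

    print(local_ip())  # e.g. 192.168.1.23, assigned dynamically by your provider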

We're a nonprofit; it was decided that all five registries should be.  As a nonprofit membership organization, we have around 16,000 direct and indirect members in the Asia Pacific region, across 56 economies.  We range from the world's largest mobile phone operator in China to what I estimate is the smallest ISP in the Pacific Island economies.  A very diverse membership.

And what they bring to the debate and the discussion around data rights is an understanding that the network operators and the technical community have a very important role to play.  They are the ones literally transmitting data.  They want to participate in forums like this, through APNIC and their own entities, to say: as you think about data rights and want to develop policy and regulation around them, think about the impact of that policy and regulation on the actual network providers themselves.

Because one of the implications of new policy and regulation around data for network operators is inevitably increased costs, and that cost flows on to the users.  So as you develop policy and regulation around data rights, think carefully about the impact of that.  Will it make the networks more efficient?  Obviously it's a really important area and we want to bring in better governance for these issues, but think also about the impact ultimately on the users, because increased regulation, or regulation that's not thought through, works against what we are committed to: a global, open, stable and secure Internet for everybody.

And regulation can cause fragmentation.  That's one example of the issues.  And it's for that reason we really value the IGF as a forum that we can participate in.  We really encourage everybody to engage here, because this is the place where the true multi‑stakeholder community, of which we are just one member, can come together and discuss these types of issues.

So I will stop there.  Thanks very much.

>> JOHN DENTON: Thanks very much, Duncan.  Switzerland has played a very important role in the IGF, and this is one of the issues that your organization grapples with in Switzerland.  I would say there's a tension between protecting the personal data of individuals and unleashing the economic opportunities of the Internet, okay?  So let's acknowledge there is a tension there.

One of the dangers is that a tradeoff will emerge which diminishes the rights of individuals, which is often the way that a lot of these global discussions go, but there is actually a very strong requirement for balance here.  So how is Switzerland grappling with this issue, and what recommendations do you have for us?

>> PHILIPP METZGER: Thank you very much.  Well, I think it's maybe a bit early to give a global recommendation from one small country.  As Switzerland, and also as the Federal Office of Communications that I'm representing here, which is part of the government structure, of course we see the entire process and the debate around data, both at the national level and at the international level.  And I think it was very inspiring, certainly to me, what we heard this morning from a corporate perspective.  Many things that were said are certainly very dear to us, you know, when it comes to the issue of transparency and the question of how the actors are enabled to use their data, which is probably a good example where we have no other choice than discussing and dealing with this at the global level and in a multi‑stakeholder setting, because this is not something that just one actor can determine.

And right now, if you look at our discussion in Switzerland, of course data protection is a key issue.  There is a lot going on; there is legislation under way when it comes to modernizing data protection.

There is, of course, also legislation under way when it comes to electronic identification.  I think the e‑identity is a key aspect in the digital world.  And when it comes to data, there are other efforts by public policy decision makers to address transparency issues.  And maybe to pick up on one thing you mentioned initially, the youth ‑‑ the youngsters that are maybe not so much represented here today, at least as far as we can see in this gleaming light.  We had a national conference back in September in Switzerland, which is a regular thing we do: a multi‑stakeholder conference assessing where Switzerland as a country is heading in terms of digital development and what kind of framework conditions we should have.

So, of course, this involves everyone.  It's truly multi‑stakeholder, and we put a particular emphasis there on how the young people in our country see this.  And what was quite striking, when it comes to data, is that they really were quite open to their data being used for a good purpose or for their own benefit, but they were equally clear that transparency, on the one hand, was key for them.  They wanted to know where their data was going and what was happening with the data among those who collect it.  And secondly, they wanted to have a say in how the data can be used; even if it's transparent, that alone is not good enough.  I think if we look at the tension that you were underlining ‑‑ protection on one hand, and on the other hand how we can use data ‑‑ I think right now agency is a key discussion point for us.  How can we enable stakeholders, citizens, individuals, to have agency when it comes to their own data, and to determine, in a given setting, in a concrete application, how they want to use it or how they want it to be used?  If I take mobility, for example ‑‑ we have someone from public transportation here.

We have quite a concrete discussion, including academia, start‑ups, civil society and, of course, the transport sector, both private and public, on how there could be applications and digital tools that allow individuals to determine how they want their mobility data to be put into the system and used, and what kind of benefit they could get as individuals, but also, of course, what they think is important from a public perspective ‑‑ what data could be used for mobility.  It boils down to the question of what agency we can give and what form it takes, because it can take a lot of different technical tools.  I mean, there are many options.  There are different ways of analyzing data, as you mentioned, maybe without breaching privacy or anonymity; there are homomorphic computing methods.  I think right now, from our perspective, it's key that we go forward and be concrete about what the individual can actually do with this data.
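
To make the passing mention of homomorphic computing concrete, here is a minimal sketch using the open‑source python‑paillier library (an assumption; any additively homomorphic scheme would illustrate the same point).  A transport operator could, for instance, total encrypted passenger counts without ever seeing an individual's value.

    from phe import paillier  # pip install phe (python-paillier)

    public_key, private_key = paillier.generate_paillier_keypair()

    # Each rider's count is encrypted at the edge (hypothetical values).
    encrypted_counts = [public_key.encrypt(c) for c in (1, 0, 1, 1)]

    # The operator sums ciphertexts without being able to read any of them.
    encrypted_total = sum(encrypted_counts[1:], encrypted_counts[0])

    # Only the key holder, e.g. a trusted authority, can decrypt the aggregate.
    print(private_key.decrypt(encrypted_total))  # -> 3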

>> JOHN DENTON: Thanks very much.  Now, look, I have no idea how much more time I have, but I'm working on the basis that I have another ten minutes, and we'll seize the ten minutes.  I want to open the discussion up to participation from the audience.  So if you go ‑‑ here we go, straightaway.  The gentleman at the very front with the blue sweater on who is standing up.  Give the man a microphone.  Can you also say who you are?

>> AUDIENCE MEMBER: Yes, hello, my name is Mathias.  I work with a new governance organization, and we work closely with the MyData Global organization.  We unite 100 organizations around the world to define a governance model for the free flow of personal data which is truly human centric, and we work closely with the European Commission to set this in stone and define open standards for everybody to reuse ‑‑ not only technical standards, but legal standards and business model standards as well.

And my question is more of a thought.  In ensuring this trust issue and making sure that everybody benefits from this free flow of data, there is a principle that we're kind of testing: a separation of powers principle.  What if we say strongly that the organizations that store and process data are not the same ones that handle the authorizations and the rights of people over their data?  This has several advantages.  The first is that we don't want to depend on one single platform that says: give me all of your data and I will manage everything, including your authorizations.  The second is that it will be truly human centric: only the rights have to be somewhat centralized, and all data can be stored and managed where it is.

And thirdly, that it's truly an open ecosystem, where data can freely flow and people can trust the rights, because the organization handling the transparency and the rights doesn't touch the data and has no interest in handling the data.  So my question is more to hear your thoughts ‑‑ anybody, but more precisely Sonja, Sigrid or John ‑‑ about this principle and what it could bring to trust and the free flow of personal data.  Thank you very much.

>> JOHN DENTON: I should have said before we started that there's no such thing as a good question or a bad question, but there is such a thing as a long question or a short question, in order to make certain that we have a lot of access.  Sigrid or Sonja, do you want to respond to that in terms of the governance issue and the separation of powers?

>> SONJA JOST:  Well, I believe that separation of powers is always a good thing.  Here it would really depend on how you organize it.  How do you structure it?  If you store data and only process them, how can you earn money with this, and so on.  So I think, yes, we do need a separation of powers, that is for sure; I do not doubt it in any way.  The question is: how do we really perform or execute it?

>> JOHN DENTON: We have time for one more quick question, and I will try to wrap it up so we can move into the next section.  Is there another question out there?  Okay.  Well, at the very back, please.  Please stand up.  Another man.  Is there a microphone?  Would you like to come forward, and we can give you one from here.

>> AUDIENCE MEMBER: I watch movies, and my name is Wilhelm.  I watched "The Great Hack."  It was quite an eye‑opener, if you look at how people's data are being used.  How do you really keep people's data safe if they don't want their data out there, for instance?  And the influence on politics as well.

You look at this and you think it's maybe just like a movie, but it's actually quite scary, if you look at it, that people's data is not safe and that people can actually be manipulated in making decisions.

>> JOHN DENTON: Thanks very much.  I was going to ask you to respond to that question.  I was with the Prime Minister of Estonia the other day, and in Estonia they have made the misuse of personal data a criminal offense, and that exercises the mind.

I'm wondering, what is the Swiss experience on this?

>> PHILIPP METZGER: Well, I think ultimately it boils down to good governance.  How are the actors behaving, and what kind of respect do they have for the legal positions of the individuals?  Because we clearly see very different behaviors in the marketplace: some are clearly reprehensible, others are criminal, and others are extremely loyal to the customer, respecting their data privacy.  I think it depends very much on the environment in which you operate and the degree of trust between the different actors.  And then, with regard to your question, how much regulation has to be imposed: if there's a very untidy environment, you will have to legislate and regulate much more strictly, whereas when there is a climate of mutual respect, generally speaking, and of good governance, I think it can be done with more light‑touch approaches.  It will ultimately depend very much on the environment within which you are operating and on the philosophy that the community can jointly adhere to over time.

>> JOHN DENTON: Thank you very much.  Why don't ‑‑ I really have to bring it to a close now.  A couple of big ideas are running as people worry about this issue of data.  One is the potential for decoupling, and there's some thought going on about that.  The other is the establishment of something similar to, but different from, the IPCC, which would gather eminent persons globally to grapple with this issue.  These are ideas that the Internet Governance Forum should be thinking about as well: how do we take the institutional arrangements forward, and how do we create some form of global international institutions to support this?  In the interim, global organizations like mine, the ICC ‑‑ we are not for profit, and we can govern certain spaces, and we are more than happy to support this.  We want to enable business worldwide to secure peace and prosperity for all.  We don't want to see the Internet break down.  It's in our interest and in all of your interests.

I'm proud to be at a gathering of techie nerds in Berlin.  It's a lot of fun.  Enjoy the rest of the day.  Thank you.

(Applause)

 


 

INTERNET GOVERNANCE FORUM 

BERLIN, GERMANY

HIGH LEVEL INTERNET GOVERNANCE EXCHANGE PANELS ON DATA GOVERNANCE AND SAFETY – PANEL II

25 NOVEMBER 2019

11:00

ROOM 2

 

 


 

***

 


 

>> MODERATOR:  Hello?  Welcome, everybody!  We will start with the panel either way because we do not have much time left.  So please take a seat.

So welcome to the panel "Safety and the Right to Protection."

I think there are still two panelists missing, but we will start.  And actually, we have so many great people here who have important messages, so I would like to start with you.  Please proceed.

>> So I will talk about one user group.  For anyone who was in the last session ‑‑ I was sitting next to John earlier.  He said, you know, a third of the people on the Internet are under the age of 18.  They are children.  And they are not in this room, apart from the one that I brought, who is going to be on a panel later.  I'm actually unabashedly going to talk about children: why we should consider them, how we should consider them, and, if the clock hasn't run down by then, maybe a little bit about what good looks like.  You know, I will ask whoever is talking to stop talking.  We are so few people.  Thank you.

So there are nearly 1 billion children online, and 170,000 come online every day.  And the Internet was not really imagined as a place where childhood would happen.  So when this was founded ‑‑ and Tim is here amongst us today ‑‑ when the founders had their utopian view of what good looked like, it was that all users would be equal.  And the problem with that idea is that if all users are considered equal, then a child is treated as if they were an adult.  And that is the fundamental reason that we have to have another think about the Internet in relation to children, and, rather than see them as victims of some extreme action, actually think more profoundly about whether it meets their needs.  I would say there's a correlation between upholding their rights and them having a safe and secure time online.

Specifically, what I wanted to say is: we all know, as people who have been children or who have children or relate to children in different ways, that a child of 4 or 7 or 12 or 17 has very different needs.  For example, if you are three to five, you are just beginning to understand that people see the world differently than you, but you do not yet have a critical way of understanding information, and you take what you are told verbatim.

And it is not until you are a teenager that you run away from it.  We have to imagine that all of these kids, with all of their varying needs, are all using technology designed for adults.  The reason that we have made arrangements for children ‑‑ and there is a global consensus about this in the offline world ‑‑ is because they have specific development needs, because they need specific privileges, because they need the handrails of life.  And there is only one environment in which childhood now plays out where this is not operating, and that is the digital.  So this vast demographic, this one in three users that are treated in an age‑inappropriate way, now need us to flip the dial.  Sorry, I'm looking at my ‑‑ I'm looking at the boss over here.

So really ‑‑ I'm just going to say a few things.  We have to first of all recognize that they are there.  We have to understand that a child is anyone under the age of 18.  We have to apply their rights.  And if you think about it this way: if you took an impact assessment on any digital service that you can imagine and asked, how does it affect their well‑being?  How is it affecting their autonomy?  How is it affecting their health?  How is it affecting their sleep?  Then you would have practical answers on how to redesign the digital world in a way that is for their benefit.

I will finish by saying three things.  First of all, the normal response to this is we should teach children to adapt to the digital world.  This will build resilience.

Now, I'm all for education, and I work very closely with a lot of children.  But you cannot ask a billion children to adapt to a system that is not made for them.  You need to adapt the system to their best interests.

It's not a binary between access and the system we have now.  We have to design a system that is suitable.  And lastly, this is not necessarily about age verification and it's not about content only.  It's about the nudges and about the economic exploitation.  It's about their data.  It's about their digital identity.  And I think that you have to look at this in the round.  So ‑‑

>> MODERATOR: Thank you very much for that.  So, Bertrand, when we talk about adapting systems, we oftentimes have national legislation versus transnational communication in the system.  How would you add to that?

>> BERTRAND DE LA CHAPELLE: Good afternoon.  My name is Bertrand de La Chapelle.  I'm the executive director of the Internet and Jurisdiction Policy Network.  On the question that you ask, I could probably elaborate for about an hour and a half.  I will spare you that and focus on one key dimension, which is ‑‑ as it is particularly appropriate to say at the IGF ‑‑ that there is no alternative to coordination between the different actors to face the challenges we are facing.

We are moving from a techno euphoria, where everything was going to be marvelous, to a period of techno doom, where we think everything is going to be bad because of technology.  Both are excessive, and the reality is that human nature has not changed.  The problem is that as we grew more aware of the abuses ‑‑ and there are abuses that have to be addressed ‑‑ we are developing a flurry of initiatives, with all the best intentions, by private sector and civil society.  And those are mostly uncoordinated.  They are adopted in a reflexive manner; they are quick fixes; they are a little bit patchwork and, I would say, makeshift.  And this sort of ‑‑ to use a French word ‑‑ governance collage cannot be sustainable, because fundamentally the lack of coordination is making the problems harder to solve, and it increases the number of conflicting laws.  We are in a situation where there's a legal arms race that is threatening the very benefits of the Internet itself.

And so we are focusing on the symptoms ‑‑ the abuses, the fact that we have difficulties addressing those issues ‑‑ but the deeper cause of the problem is that we do not have the instruments.  We do not have the tools.  We don't have the spaces for all the different stakeholders to be around the same table and address the challenges that they are confronted with.  And so the Internet and Jurisdiction Policy Network, in the report that we are releasing this week, the global status report, has highlighted the enormous amount of initiatives taking place around the world, in the legislative environment and in private sector activities.  It also records the strong message sent by the global conference held in Berlin in June, in partnership with the German government: a very, very strong message of the need for legal interoperability.

The only way to reconcile the need for the different actors to have autonomy in their decision making with, at the same time, compatibility and coexistence between the public authorities around the world and the private actors, is to think of legal interoperability, taking inspiration from the way the Internet itself was developed.  This is about the exercise of sovereignty in the digital age.  Sovereignty is relevant, but it needs to be exercised in a different manner, because in many cases we need to have an exercise of national sovereignty, and yet territorially based national jurisdictions are struggling when we try to organize the coexistence of billions of people in shared online spaces.  So the core message is: we need to build the mechanisms that allow all the different actors to address their problems in common, and there is no alternative to coordination among the actors to improve legal interoperability.  That's the one message that I would like to share today.

>> MODERATOR: Thank you very much.  And we have an expert on building capacity in the Pacific with us, Ambassador Feakin; maybe he can add on how they are currently building up coordination in the Pacific area.

>> TOBIAS FEAKIN: Sure.  Thank you.  Thank you very much.  Thanks for the opportunity to be here.

I mean, I think one of the things that we see in the Indo Pacific, and one of the things that we believe in the Australian government for a start, is that we are only as strong as our weakest link.  And so it's important for our own population and our national interests that we build the capacity of others in our region.  But what we see in the Indo Pacific is a patchwork quilt of different capability levels and different abilities to respond to the threat environment, whoever the threat actor might be and wherever that threat comes from.  We work with a whole range of different governments who might be struggling with policy approaches to these particular issues, or might be struggling with practical approaches.  If I just think about the kinds of work that we do in terms of building capacity to combat cybercrime ‑‑ whatever particular crime that might be and however it is perpetrated in the online environment ‑‑ we work with a whole range of governments to look firstly at what kind of legislation exists to combat these crimes, at how we can assist in building better legislative approaches, and at how you can work with the legal fraternity so that they understand they are able to prosecute against that legislation.

But you also need to work with the police forces in the region, who often don't have great forensics capability, in order that they can bring digital forensics into a court case in an admissible form; and then also with the judiciary itself, in order that judges understand the severity of the crime they are actually prosecuting.  And then, as we have just spoken about earlier on the panel, how do you link all of those pieces at the international level?  That's certainly something we talk about a lot.  For Australia, it's the Budapest Convention, which links a number of different countries in the world so that we can access evidence quickly, in a form that we can put into a court system ‑‑ in an admissible form.  It's one of the few mechanisms we have for cooperation.

Unfortunately, it does get caught up in the whole geopolitical discussion, which is a real shame, because it's really an effective mechanism for coordinating.  It's not easy; even as Australia, we went through an awful lot of shifts and difficulties in our own departmental structures to introduce the Budapest cybercrime convention in 2014.  An important mechanism.

But then, more broadly, we need to work with civil society, ensuring the level of awareness of threats, again, wherever they might come from, so that individuals can make the most of the opportunities that are out there.  And that really requires that we're not just working as governments, talking about what governments are interested in, because I think most members of the public tend to switch off when it's the government talking to them.  So we work really hard at building partnerships with various NGOs and private sector entities in order to reach parts of society that otherwise perhaps wouldn't listen to us, so that we can build societal resilience to the kinds of threats that we can encounter.  And let's not forget, the most vulnerable online users are not just children but first‑time users of these online platforms, and in our region we still have a huge development journey of connectivity to cycle through.  That means an enormous uplift in awareness and training, to which we are trying to provide a part of the jigsaw puzzle, but it's a massive responsibility on all of us, especially as we see so many new online users unfortunately being exploited in various ways.

>> MODERATOR: So, Marie‑Laure, you work a lot with children and children's safety, basically to get rid of child exploitation.  How can we guarantee safety for children, and also for other Internet users, in this interconnected world?

>> MARIE-LAURE LEMINEUR: Thank you.  Yes, I work for an organization called ECPAT International.  We are a network of NGOs ‑‑ more than 100 organizations in all regions ‑‑ and our Secretariat is in Bangkok, Thailand.  We focus on safety and resiliency, and how to build the resiliency of children.

Children have the right to be protected, and the main practical implication of this is that companies and all actors involved are not doing children a favor when they adopt measures and standards to protect them.  Children are subjects of rights.  This is one aspect I would like to raise.

The other one is that, under the broader framework of human rights, there's sometimes a tension between the right of children to be protected and other Internet rights.  Some sectors want to make it an either‑or debate.  I don't believe it has to be that way.  I believe that we can adopt technical and nontechnical measures to reconcile both.  And I also believe that protecting or ensuring the privacy of Internet users should not come at the cost of a weaker protection of children.  This is very important.

And when we discuss resiliency and how to strengthen the resiliency of children online, I think there are two aspects.  The first one is that we have to distinguish between the types of risks.  There are some risks that are triggered by the behavior of the users, in this case the children.  And this debate is more about human behavior than technology, because we are discussing here children who are perhaps solicited ten times a week to send a sexualized picture.  I just heard that from someone in Stockholm who did research with teenagers in schools; that's the average number of times children and adolescents were asked by peers to send a picture.

So we are discussing here how we can work with the children or the teenagers to resist the peer pressure.  It's not so much about technology; it's about their circle of trust ‑‑ how can they learn to go and talk to someone from their circle of trust?

But there are also the risks that are triggered by companies, content providers and social media who do not set up the proper standards where it should be their responsibility.  We have to look at ways to do that, how to enforce that on those companies.  I believe Jutta is going to touch upon this subject, so I won't go into detail on that.

Thank you.

>> MODERATOR: Thank you very much.  So if we are talking about the right to be protected and the societal approach to it, can you add something on what Poland is doing in this area, Minister Zagorski?

>> MAREK ZAGORSKI: Good afternoon.  Thank you very much.  Trying to talk about this is a challenge.  Talking about all these problems which we mentioned here, we must look for allies who will help us to meet these challenges.  Depending on the area ‑‑ because it must be emphasized that it's necessary to act on many fronts ‑‑ we will have different partners: business, public administration, NGOs and individual citizens, who play the role of teachers as well as consumers and portal users.  And systemic cooperation will allow us to achieve a basic goal, which is network and user security.

Talking about the areas of security, there is the network itself.  That's the goal for us, and now it's very important in the context of 5G.  In Poland, we talk a lot about this, because infrastructure is a crucial element of the system.  Security of mobile networks is becoming crucial, if only because the use of mobile Internet is becoming more common.  Since we are talking about children and teenagers: in Poland, for 80% of teenagers, the smartphone is the basic if not the only access to the network.  If we are talking about the safety of children, we must also talk about the safety of the mobile network.

And social media, in the context of children but also in general: we should discuss who we are as social media consumers, or maybe only members of the community, and what our rights are.  This discussion is also very important, and for this reason mutual trust is so important.  One thing I would like to mention from the Polish perspective, one example: from September 2018 to August 2019, which is the 12 months after our national cybersecurity system was put in place, just one of our CSIRTs at the national level received almost 22,000 reports of incidents.  That is 30% more than a year earlier.

So what does that show us?  It's proof of two things.  There are, of course, more cybercrimes, and, I believe, there are now more people being aware ‑‑ aware of the nature and types of cyber attacks and of how to report incidents.  That is also very important if we are talking about safety, because if we are talking about allies, about cooperation, we need cooperation also from business and from citizens.

And this did not come from nowhere.  It was built by the public and private sectors: activities taken separately by public administrations, businesses and NGOs, and activities resulting from close public/private collaboration.

And the last thing, but very crucial in the context of the earlier statements: as I mentioned, there are a lot of areas where we need strong cooperation, but one of them is especially important ‑‑ dangerous content that could affect children.  One of the initiatives of my ministry, on which we put a lot of emphasis, is to act against the dangerous content available online.  So last month we signed a declaration on the safety of children on the Internet.

It's an example of the actions which are taken by many players in Poland.  It's very important for us.

Concluding: we need mutual trust, sharing of information and knowledge, and cooperation with the private sector on cybersecurity.  And last but not least, talking about legislation: good legislation, which gives enough protection to consumers and enough space for business to grow, is also very important.

>> MODERATOR: Thank you so much.  Mr. Koh, welcome to the panel.  How do you ensure cooperation between the private sector and the administration in building up cybersecurity capacity in Singapore?

>> DAVID KOH: Thank you very much.  I'm David Koh, from the Republic of Singapore.  What we are trying to do is actually build a digital society.  My country has an aspiration to be a smart nation.  Singapore is a tiny country; my country is a city.  So a smart nation is a smart city on steroids.

We want people to reap the full benefits, the opportunities, that the digital economy gives them.  What is digital readiness?  It's things like digital access, digital literacy and digital security.  Digital readiness is about equipping people with the skills and know‑how to use technology.  We are not digital natives ‑‑ my children are digital natives; I still speak digital with a bit of an accent.  Singapore promotes digital literacy to allow Singaporeans to use the technology, not just to operate mobile devices or computers.

We need to nurture a digital society, which includes ensuring that no one is left behind, especially vulnerable groups, in which I would include children, people with disabilities, seniors and low‑income families.  Singapore, you might think, is a rich country, but we have people who are low income and have little access.  So how do you make sure that they are not left behind?  Digital inclusion is an important priority in my country.

It also means access.  It has to be affordable ‑‑ again, the issue of affordability comes in ‑‑ and also language.  Not everyone in Singapore speaks English.  A lot of the older generation did not have the benefit of education.  So there is a big challenge in terms of having appropriate access in the vernacular, in languages like Malay, Tamil or Chinese.  A big challenge for government when we are trying to put out applications: it's difficult enough to have a usable application in English, but you have to have it in four different languages.  So these are real challenges.  It also means security.  We talk about cyberbullying for children ‑‑ this is a big issue ‑‑ and also cybersecurity and cybercrime, and digital literacy, being able to tell truth from fiction.

How are we doing this?  We are doing it through a multi‑stakeholder approach.  We call it a 3P approach: the public sector, the private sector and the people sector.  And we try to do this in an inclusive manner.  Thank you.

>> MODERATOR: Thank you very much.  So it's a lot of building up digital readiness and citizen education.  But Jutta, it is not only the users' responsibility, is it?  Is it maybe more than just the user who has to contribute to this?

>> JUTTA CROLL: Of course.  Although my organization, the Digital Opportunities Foundation in Germany, where I'm chairwoman of the board, always has the user in focus, we follow a more human rights‑based approach.  And I really appreciated hearing Minister Altmaier this morning saying that access to the Internet is a human right.  It should be a human right ‑‑ it is in some parts of the world, but it should be everywhere.  We also see that there's a right to protection.

Of course we need user education, and we need digital literacy, and I like hearing and seeing, especially at the Internet Governance Forum, how many initiatives are going on in the various countries.  I really liked what I heard from the government representatives here on the panel.  We need users who are competent, who have the relevant skills to make use of and benefit from the Internet, from digitization.  But we also need to say that states have obligations and companies have responsibilities towards their users.

There is something called the duty of care, and, of course, we need care for the most vulnerable groups, but we need care for all users, I would say.  When it comes to this duty of care, that means that services need to be designed so that they serve the users, so that the users can benefit from what is offered to them.  And I don't think this must be in conflict with companies' interests; of course, companies have a duty to make money, that's their purpose, to commercialize their service.  With the user in focus, and with young users in focus, we sometimes see that services are designed for young users, but we more often see that young users make use of services that are not designed for them.  So I do see a duty of care here as well.  And it's not only young users; we still have less experienced users in all parts of the world, and we cannot address this only with education.  We need education, but we need a joint responsibility, I would say.

Coming to the end, I would like to quote the U.N. Convention on the Rights of the Child, where it's explicitly mentioned that there is a right of access to the media.  That was 30 years ago; people were thinking about mass media, like the press and broadcasting, but today that is the Internet.  So there is a right to have access, a right of access to information, and the right to freedom of expression.  All of this was formulated very well, but without addressing the digital world that we have today.

And now a bit of advertising: we will have a session exactly on the U.N. Convention on the Rights of the Child, a big subject here in Europe, where we will discuss the general comment on the Convention.  We will have young people there, so we can discuss a bit further how they exercise their right to be heard also in the digital world.

Thank you.

>> Thanks.  My apologies.  Can I be disruptive here and expand for one minute on what Jutta just said?

It's about the duty of care of companies.  One important aspect, in my experience: I have seen that there's a huge divide between private sector tech companies ‑‑ service providers, social media platforms ‑‑ that are based in the western world and those in the rest of the world: Africa, Asia, Latin America.

And perhaps this is, you know, linked to different factors.  One of them could be that reporting is not happening in many countries outside of the western world.  Also, sometimes there's a lack of understanding of how the automatic tools or solutions that are available basically work.  I have heard one ISP person in a country in Southeast Asia telling me that using PhotoDNA would violate the privacy of the users.

For those of you who know how PhotoDNA works: it doesn't look into content.  No human does.  And sometimes it's also a mind‑set ‑‑ just ignoring that, you know, they have an obligation, and they don't want to acknowledge that illegal content goes through their platforms, or is sitting on their platforms.
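
For readers unfamiliar with the technique: below is a minimal sketch of hash‑based matching, the general idea behind tools like PhotoDNA.  PhotoDNA itself is proprietary, so this uses the open‑source imagehash library as a stand‑in, and the hash value shown is a hypothetical placeholder.  The point is that only compact fingerprints are compared; the content itself is never opened by a person.

    import imagehash
    from PIL import Image

    # Fingerprints of known illegal images, as distributed by a clearinghouse
    # (the value here is a hypothetical placeholder).
    KNOWN_HASHES = {imagehash.hex_to_hash("ffd8e0c4b2a19078")}

    def matches_known_content(path: str, threshold: int = 8) -> bool:
        # Compute a perceptual hash of the image and compare it, by Hamming
        # distance, against the list of known fingerprints.  Only the compact
        # hash is examined; no human ever views the content.
        h = imagehash.phash(Image.open(path))
        return any(h - known < threshold for known in KNOWN_HASHES)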

So I think we shouldn't think with the western mindset all the time.  We should look at this issue from more of a global perspective.

Thank you.

>> MODERATOR: So what we heard on this panel, and even earlier this morning, is that access to the Internet could be ‑‑ or should be ‑‑ a human right, but there is still the right to protection, especially for the most vulnerable people out there, especially children.  So how can we build a legal framework that is as connected as the Internet is, and not just national legislation?  It has the aspect of education, for sure, but on the other hand we have a kind of duty of care on the part of the large platforms out there.  We have to understand it as a joint responsibility, develop a global responsibility out of it, and for this, develop a kind of common understanding by looking at the different approaches to privacy and security.

Closing with that, we have a little time left for questions, and I would like to open the floor.  Are there any questions from the audience?

There's one question.

>> AUDIENCE MEMBER: Hello, everybody.  From Mexico, Mario Hernandez.

My question is this: I have heard about the responsibility of platforms, the responsibility of users, and the human rights involved ‑‑ the right to be protected as well as the right of access to the Internet.  However, where in all of these relations is the basic responsibility of the states, not only in creating the correct legislation, but also in including this aspect within the education systems?

Thank you.

>> Can I?  I actually think there are a couple of different answers to that.  And one of the things that has slightly frustrated me ‑‑ and I include myself ‑‑ is that we haven't talked about liability.  We let this question of safety and security go for about 30 minutes without saying that we have a global system in which the platforms do not accept liability for what goes on.  And I think until we tackle that as the international community, the users will never be safe.  All of them, of all kinds.

I want to put that out there.  I did not introduce myself, but I am a member of the House of Lords.  We brought in different pieces of legislation, and this is something that we are looking at carefully, specifically data protection laws and building on them, because that is the engine room.  I think the right laws, as described very well by Jutta, are a really important thing, but more fundamentally, I think we have to stop treating this as an exceptional space and be prepared to apply and enforce the laws we have, whether they are health and safety laws, consumer laws, or business laws.  And I want to finish with this: if I put up a video of a children's birthday party with Prince playing in the background, I get a takedown notice very quickly because of the IP laws.  Those are being properly enforced, but in your world, we are not getting things taken down quickly enough.

So I think you have to look at where the power is, where the business power is, and where the accountability is and therein lies the answer for us all.

>> MODERATOR: Can you add to it?

>> BERTRAND DE LA CHAPELLE: I would like to help all of us make a mental shift.  It's extremely easy to say that it's somebody else's responsibility to do something.  I know that's not what you are saying.  But we are in an environment where basically the governments are saying the companies are not doing what they should be doing, and then the companies say you are asking us to do things that are disproportionate and inappropriate, and usually civil society says, you know, I don't like it when you guys are behind closed doors making deals.  You know?

The challenge and the preliminary thing is to get the actors around the table so that they formulate the problems as a problem they have in common and not as a problem they have with each other.

This is the thing that we always skip.  We get into a competition of "my solution is better than yours" or "I want to establish my solution," because in a standards competition, if you are the first one to establish your standard, you will prevail ‑‑ and it will be a power struggle.  This is exactly the legal arms race I was describing, and we need to get together to formulate the problems in common before we begin to discuss the solutions.

And for anybody who is interested, my advertisement: come on Wednesday at 1:15, where you will get more information about the Internet & Jurisdiction Policy Network.

>> MODERATOR: I just wanted to pick up on the really good point made there about the kind of reinvention of the wheel.  You see it at all different levels in terms of how we look at the Internet, how it's governed, how states operate in the online environment, and how individuals and businesses do.  Often you hear this argument of: well, this is a new environment, therefore we need to recreate everything with brand new legal practice online.  That's actually the worst thing we can do.  As soon as you do that, you are saying that hundreds and hundreds of years of legal process and of building up norms in the physical space will suddenly be lost.  And there is a very big group of voices building at the state level saying that we should essentially reinvent international law in cyberspace, and that's a very dangerous precedent to begin to set.

So what we need to do is understand that, yes, the Internet is a very different place than it was when it was formed all those years ago.  Now it's about how we begin looking at how the law is applied online and how at times we have to enforce it.  And I wouldn't disagree that a lot of that has to be done with other voices in the room that may not have been in the room before; that is vital if we are to make this work.

>> MODERATOR: I'm sure we could discuss this specific topic for an hour or so, but we do not have much time left.  Is there one short question from the audience?

Yeah?  Back there.

>> AUDIENCE MEMBER: Hello, my name is Isabel.  I'm from Brazil, from an NGO whose mission is to honor children.

And I would like to know from you whether you believe the use of children's data for a commercial purpose, such as advertising to children, is acceptable, or whether it is a commercial exploitation of them.

>> MODERATOR: Jutta.

>> JUTTA CROLL: I grabbed the mic.  I don't think that's acceptable.  We need to differentiate by age, of course ‑‑ whether it's the data of a 16- or 17-year-old, who might know what he or she has done when giving away their data, or the data of a 3- or 4-year-old ‑‑ but nonetheless, I would say profiling children for commercial purposes should be forbidden.  And there could be approaches like the example we had from Sonja Jost in the last panel, where you make use of the data but it's anonymized, so that you can't track down the individual person the data comes from.

I'm still a bit concerned about using children's data to set up even anonymous profiles, just to adjust your product so that it is more appealing to children.  We will have another panel on how this is done as the basis of a business model for games, for example; we will discuss that tomorrow morning at 9:30.  So there will be lots of debate about that, but, to be honest, I really think: don't use children's data for commercial purposes.

(End of scheduled captioning.)