
The following are the outputs of the real-time captioning taken during the Tenth Annual Meeting of the Internet Governance Forum (IGF) in João Pessoa, Brazil, from 10 to 13 November 2015. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 




 >> LUCA BELLI: Okay.  Good morning, ladies and gentlemen.  I think we can get started.  Welcome to this second meeting of the Dynamic Coalition on Platform Responsibility.  Today we will have the pleasure of having many distinguished panelists, starting with the UN Special Rapporteur for Privacy, Joe Cannataci.  We will have Benoit Thieulin, who is President of the Conseil national du numérique, the French Digital Council.  Then we will have a presentation of the recommendations on terms of service and Human Rights, which will also be discussed at the main session on Dynamic Coalition Outcomes. 

     Then we will have a high-level panel discussion with Marcel Leonardi, who is director of public policy at Google Brazil, and Verdiana Almonti, who is a Brazilian lawyer and also works for Intervozes, which is a Brazilian NGO.  And I will co-moderate.  My name is Luca Belli, and I'm with the Center for Technology and Society in Rio de Janeiro, and I will co-moderate with Nicolo Zingales, who is a professor at Tilburg, and Primavera De Filippi, who is a researcher at CNRS and the Berkman Center. 

     So I think we can start with the keynote by Joe Cannataci.  Can we use some slides, please?  Let's start with the keynote of Benoit Thieulin while we arrange the PowerPoint presentation.  So Benoit, the floor is yours.

     >> BENOIT THIEULIN: It won't be a keynote, just a speech.  First of all I'd like to thank you for giving me the opportunity to speak here today.  I am pleased to represent the French, as you say, digital council in this Internet Governance Forum.  It's an honor for me to be here with the first UN Special Rapporteur for Privacy, a position which I know has been called for by several freedom advocates.

     And I'm particularly pleased also to attend a session of the Dynamic Coalition.  I want to move things forward in a participative approach and not to resign ourselves to the idea that digital governance should mean inertia.  Indeed, there is an urgent need for an intelligent decision process given the magnitude of the challenges and controversies we all face, such as government surveillance and the growing desire to control cryptography.  We are coming out of the 20th century with a kind of techno enthusiasm.  The digitalization of our world is creating complex changes to which we have to give meaning, direction and values.  40 years after the invention of the Internet, more than 25 years after the Web, a plurality of possible worlds is apparent.  For instance, I'm convinced that the digital and ecological transitions are the most critical issues of our time.  The first one has found its path but not its reason.  The second one has reason but hasn't found a path yet. 

     An increasing number of issues extending far beyond Internet Governance will need to be addressed at an international level.  In this context I think it is crucial to elaborate a flexible and open governance framework in which the voices of civil society, academic networks, the technical community and businesses can be taken into account.  Big platforms are part of this moving landscape, including Google, whom I thank for being here; they can and must be our allies, the allies of citizens and of their freedoms and rights.  They can be.  Some of them have already demonstrated a positive contribution to individuals' empowerment as knowledge disseminators, channels for new forms of expression and channels of citizen mobilization, sometimes across borders.  It is in their interest, because their business relies on reputation and on their relation of trust with customers, especially in a disruptive economy where a challenger can use a failure of trust to differentiate itself, and indeed they have to renew that trust after the Snowden shock.  How can it be done?  For us, the French Digital Council, it implies demanding platform loyalty. 

     Let's make a semantic digression first.  I do not feel comfortable with the term of (speaking in French) that we use a lot in Europe and especially in France, because it kind of restricts the debate to an EU/US tone.  It makes us forget that these business players have developed (?) modalities that take full advantage of digital attributes: network effects, outsourced innovation, data exploitation, et cetera.  This model is at the heart of enterprise transformation and changes the value chains.  (?) by bank, by insurer, even the company that manages the best high tech in the morning will integrate platform functions.  That is why it is so important to reassess our regulation schemes and ask ourselves if we are equipped for the platformization of our economy. 

     We started to work on online platforms after our very first opinion on net neutrality, in which we took a firm position, as it is essential for ensuring freedom of communications and also free enterprise in the information society.  In the same opinion, the council noted that the digital society doesn't only consist of actual networks and pipes.  ISPs are not the only gatekeepers of the Internet.  Online platforms also play a crucial role in access to the information society.  Therefore, we should ensure that while they expand, online platforms do not impair the use of the Internet as a forum for cooperation, free expression and the exchange of ideas.  We then organized two consultations during which issues were raised at both the business-to-business and the business-to-user level.  Users suffer from an unbalanced situation in terms of information. 

     Let's be honest.  It is clearly impossible to be sure of what is going to be done precisely with my personal data after they have been collected.  These value chains are too opaque, and terms of service unreadable for an average human.  Business users can suffer from the intermediary position of platforms in the digital environment.  It raises classical (?) issues, but it occurs in different places: the first results page of a search engine, the technical parameters of an API, et cetera, which are strongly technical layers.  The subject has continued to grow since then.  Twitter recently apologized to developers that were (?) of its API, after it restricted access to most resources.  And the Volkswagen scandal continues to raise concerns about regulators' weakness towards the extreme technification of the subjects under observation.  We have proposed the notion of platform loyalty, which means both accountability and fairness. 

     In a rapidly changing technological service environment, platforms should say clearly what they do, do what they say, and provide means to verify that it is true.  Our consultation with the French businesses involved in the Google search case has shown the need to objectify the difficulties facing professional and individual users of platforms.  In fact, it involves a wide range of expertise: (?) design, technical skills, et cetera.  It cannot be assured only by lawyers.  What is lacking today is a space for collective intelligence in order to objectify and give visibility to online platform practices, whether good or bad.  We are convinced that in an (?) economy, regulation should be more social and pushed by the market.  Crowd vigilance must be encouraged and fully integrated into regulation schemes. 

     The French Digital Council proposes to create a network of rating bodies to evaluate the practices of digital services.  It must rely on an open network of contributors, offering a channel for information feedback and federating the wide variety of potential expertise: an app developer confronted with the instability of an API, an (?) facing an opaque referencing policy, consumer associations, et cetera, and even a service user who encounters problems enforcing their rights.  There are already numerous initiatives that do a part of the job and that should be given full support in terms of visibility and cooperation. 

     For instance, Terms of Service; Didn't Read is a browser plug-in developed by students that rates, from A to E, the terms of service of most online services.  Am I giving a license on my shared content to the platform (?)?  Might it be shared with a third party?  Can I easily have my data back for personal use, et cetera?  Dark Patterns focuses on bad practices relying on interface design and user process through the lens of a Web developer.  We find this approach very important, as it allows us to shed light on practices which are not illegal per se but can contribute to impairing the exercise of our rights if they are done at a larger scale.  We propose that those rating bodies develop criteria embracing both B-to-B and B-to-C relationships.  Portability must be one of them, as it is key to ensure people's informational self-determination in the era of big data and IoT.

     Our proposals have been taken into account by the French government in terms of transparency requirements towards consumers.  The idea of a petition lever and crowdsourced observation has also been fruitful.  At a European level we began interacting with our German counterparts, since we saw that there was a shared understanding.  For instance, Germany has already put in place a (speaking in German) in charge of monitoring online practices towards individual and professional users.  We also exchange a lot with the European Commission on these issues, and these exchanges have intensified recently. 

     To conclude, I would like to say that there is a very, very strong need for commitment from international players on that issue, who can and must show how it is possible to enforce our rights without (?), just as easily as it is to use a well-designed app without a Ph.D. in computer science.  I hope this event will help things to move forward.  Thank you.


     >> LUCA BELLI: Thanks for this presentation, and also for highlighting that not only should terms of service be clear, but the practices behind terms of service should be quite clear as well.  Now I would like to ask Joe Cannataci to provide us with some words for reflection.  I know you're going to speak about parallel universes, so I really look forward to hearing this.

     >> JOE CANNATACI: Yes.  In fact, ladies and gentlemen, when I asked Luca, what do you want me to speak about?  He said, well, you know, this is about platforms, right?  And how platforms can help Human Rights along, and I said fine.  So actually what I'm going to do is steal bits of a presentation that I've given before but which hasn't been broadcast, and I've changed a few bits of it.

     But let me first ask you to dream.  Let me ask you to imagine an Internet where you have different spaces.  At this moment in time, we have different places, but think about spaces, and think about a space where your Human Rights are respected.  Think about a space where there are no problems with jurisdiction, think about a space where territoriality is not an issue, and think about a space for when you're upset with somebody, because not only have they infringed on your privacy but they have damaged your reputation.  And remember that Article 17 of the International Covenant on Civil and Political Rights not only speaks about privacy, some people only see the point about privacy, but it also speaks about protection of reputation, and a lot of the research we've carried out in many projects over the past years has also shown that people not only care about privacy, but they're very aware of what's happening on the Internet and to their reputation.  And I'm taking this opportunity to tell you something about the MAPPING project, all right?  Which is an EU-funded project, which we have going on. 

     So yes, sometimes I will make reference to my thinking as a Special Rapporteur, but I'd like to start with a disclaimer, right?  The following views are my own, right?  They do not represent the views of the MAPPING consortium, and it is intended as a (speaking non-English), right?  It's intended to provoke comment and argument, and it does not necessarily represent my views as Special Rapporteur either.  I'm just thinking out loud with you, and then I propose to come back, and if you're going to ask me a question, please tell me whether you're asking insofar as my thinking in MAPPING is concerned or as Special Rapporteur.

     Okay.  And if you've heard me speak over the past couple of days, you know that I've been speaking about an international treaty, and no, I'm not going to speak about an international treaty, because to me the law is only one part of the solution.  International law is only one part of the solution.  And those of you who have heard me speak before also know that I believe there's a technical dimension to the solution, which is one of the reasons why I believe we're meeting here to talk about platforms.  And yes, I am going to talk about holy cows, all right?

     When we talk about holy cows, with all due respect to those people who believe in holy cows, I'm just simply using this as a -- as a figure of speech and no disrespect meant to people who consider the cows to be holy.  And I'm going to quickly leave some stuff to one side -- there you are.  I wanted to come to this particular holy cow.

     The fuss about one Internet.  Have you noticed how many people like to talk about there being one Internet, as if there is one platform?  And I'm sorry, but that part of me which is still a bit of a technical geek questions a number of facts.  Do we have one Internet?  Are IPv4 and IPv6 no longer different protocols?  Because my computer requires an interpretation mechanism for them to talk to each other.  I would have loved to have Vint here to confirm the technical issues, but as far as I'm concerned we have two Internets, right?  And, in fact, some countries, which I should not name, have decided to shut off spaces in their part of the Internet.

     So when we talk about one Internet I get a bit worried, because to me what is important is, I don't care if there are, frankly, three Internets, ten Internets.  What I care about are the following:  One, can I access ten Internets physically?  Can I afford to access ten Internets financially?  And more importantly, speaking as a citizen, are my rights protected on all of those ten Internets?  Because if not, then perhaps I would like to have another Internet.  I'd like to have that Internet where, when I sign on, clearly I am also signing on to a space where my rights are protected.

     And, in fact, if you notice, a lot of people tell us, we don't want to have one Internet that way.  They tell us, today we have a fragmented Internet.  And then when I look at fragmented -- I've put this side by side, right?  An unfragmented, interconnected, interoperable, secure, stable, resilient, sustainable and trust-building Internet.  Now, I have no problem with all the rest of it, but fragmented, unfragmented?  So perhaps this is semantics.  Perhaps it's a figure of speech, but frankly, do I really care if it's fragmented?  I mean, wasn't the whole idea of the Internet to have a whole bunch of fragments which I can go across, from one thing to another, and join together?  So the idea of fragmentation -- and the word, of course, I'm clearly using simply as a figure of speech -- is fragmentation a holy cow?  Is unfragmented a holy cow?  Insofar as platforms, technical platforms, are concerned too, right?

     And of course I ask myself, and I've said so openly: is one Internet really the key to completing the puzzle?  Is there only one key?  And I'm clearly saying no, there isn't only one key.

     And then we have this phrase, you know, which goes around: the Balkanization of the Internet.  I'm going to give a prize to anyone who can trace it further back.  The Balkanization of the Internet, as far as I can trace it, goes back to 1997, nearly 20 years ago, and it's still spoken of today as if it's something we want to scare people with, right?

     And, in fact -- I'm not going through the whole thing.  The Balkanization of the Internet into different multi-connected Internet families, right?  And people have been talking about what I and some other people have called parallel universes, right?  So I wasn't the first guy who talked about parallel universes.  Wayne Cruz, nearly 14 years ago, talked about a concept where parallel Internets would be run as distinct, private and autonomous universes, and please find any other word or name that you care for.  I'm not wedded to parallel universes.  You can call them -- it's almost an insult to the scientists involved, but some people have called other things multiverses, if you look at certain parts of astronomy and physics.

     But what I'm saying here is: should we think about the possibility of using platforms to host a more safe and secure Internet?  And I'd like to point out two or three things, so that I won't run out of time.  Some people, like the GNUnet project and similar projects -- it's not the only one -- are examining overlay software, which can be put on top of IPv4 and IPv6, right?  And use these to create a new, more secure space. 

     Some people have also looked at more radical solutions, right?  And some people have famously talked of, why don't we set up, for example, a European Internet?  Why don't we build something which is not necessarily dependent on servers and routers and switches which we buy from somebody else?  And some people have said, well, this may not be such a bad idea.  And then you start asking: A, is it feasible; B, is it desirable? 

     And let me make it clear.  I am not saying either that it's feasible or that it's not feasible.  I'm not saying that it's desirable or not desirable.  I'm saying perhaps we should think about it and have a proper discussion about the matter, especially because, if a new technical environment, if new software and/or hardware solutions help provide a space where fundamental Human Rights are better protected, then it's not such a bad idea. 

     And, in fact, I'd like to point out two things.  If I were to wear my other hat, that of UN Special Rapporteur, I would be the first to tell you it's not only privacy, ladies and gentlemen.  Fundamental Human Rights is a whole bunch of other things, including freedom of expression, including the right to access information, including dignity, including reputation.  And for all of us in the room and so many outside the room who believe in the power of the Internet, I think it's worthwhile looking more closely, perhaps putting it on the agenda, Luca, of this particular initiative, which I consider to be very worthwhile.  Let us have a serious look at technical environments which can help provide something else, something where I can sign on to this space -- you see. 

     Think about it.  How many companies have actually relied on your signing on to get you to agree to terms and conditions?  But instead, stand it on its head: instead of signing on to terms and conditions, sign on to agree to the rules of a new space.  And the rules of that new space should, to my mind, amongst other things, also include affordable remedies for the citizen, because now, ladies and gentlemen, we are living in a funny kind of society where, if I buy something on the Internet, especially in certain places where the law has been tightened up, I can buy something on the Internet and still have a guarantee.  I can buy a phone, electronic equipment, other stuff, but then I turn around and ask: is the same thing going to happen to me if I go away from consumer purchases, right, and the guarantees that I have, and I go to other things?

     About ten years ago we carried out a study on why eBay was so successful, and we considered a number of platforms, and the truth of the matter is one of the reasons, of course -- some people would say it's the main reason -- is trust.  And trust there was especially engendered by remedies, right?  Anybody who has been following what I've been saying during this IGF 2015 will know that I have talked about the realities that we have: an Internet without borders, and therefore citizens expect safeguards without borders and remedies across borders.  And that is precisely what the dispute resolution mechanisms that eBay and similar services offer provide: a quick, easy and relatively cheap way of solving a dispute. 

     And now think about something else, ladies and gentlemen.  If you have a problem with privacy or reputation on the Internet, and you take that to a court, not an online court but an off-line court, in many countries, and in some more than others, that's going to cost you tens of thousands, in some countries hundreds of thousands, of euros or pounds sterling to resolve.  And yet we have online dispute resolution systems for a number of things, including Intellectual Property, including top-level domain names, including whole areas which WIPO has successfully put together.  So we've managed to do this in other areas.  Can't we also manage to do this for reputation and privacy?  But won't we need laws for that?  And that is where we bring structures in.

     I don't have the time to go through all this because I'd like to respect the time we have, and I'm running out.  My message ends here -- and please think about what I'm going to say rather than taking it as any form of definitive and concrete conclusion.  While technically a parallel Internet could be distinct and autonomous, unlike what Wayne said, it does not need to be private, right?  It could be public.  It could be a common good.  And it could be just as accessible as other parts of the Internet.  Only it may play by a different set of rules, and those rules could be built into the platform itself.  And I feel that for one reason or another not enough attention has been given to platforms providing this approach, right?

     And the only thing I'll say, and please forgive me if I'm making a tiny plug here: we are, in the MAPPING project, actually also looking at parallel universes.  So if you're interested in contributing to the debate, please speak to me after the session and we can explain more of what we're doing.  You might want to contribute to the growing dialogue we have on finding technical safeguards which complement Human Rights safeguards, which complement legal safeguards, in what could be a brave new world.  With that I thank you.


     >> LUCA BELLI: Thank you very much, Joe, for letting us dream about safe, human-friendly universes and also for reminding us that technological solutions can enable those friendly, Human Rights compatible environments.  So before getting to the presentation of the recommendations, I would like to welcome Patrick Penninckx, who is the director of the Council of Europe Information Society Department, and then I will allow Nicolo to start and present the first part of the recommendations.

     >> NICOLO ZINGALES: Thank you, Luca, and thank you very much, Joe, for the very enlightening presentation, where you actually focused precisely on what we are trying to do here, which is to create some baseline, some rules of engagement that can be used to foster responsible behavior -- so a safer space, as you were pointing out.  So in these recommendations, which you should have -- or if you haven't got a copy you can find one on the table, on the left side -- we tried to develop these recommendations in a participatory manner, in a mailing list.  It's work that's been going on for almost a year.  And we have based our work on existing Human Rights documents. 

     So we looked first of all at what was the minimum required according to those international documents, but we didn't stop there.  We also looked into what are responsible ways to carry out the values that those documents were trying to protect.  So we developed some best practices.  So in the paper that you can see, there is a distinction between shall and should.  When you have shall, we are referring to minimum standards.  Sorry for the lawyerly comment, but it's quite important from a legal perspective.  When instead you see should, it is something that platforms could do to show the way to more responsible behavior.

     So one problem that we faced doing this is that sometimes national laws might be in conflict with these best practices, simply because international Human Rights documents can be implemented at a national level in slightly different ways.  So to that extent we included in the definitions something that we call legitimate law, which refers to the fact that a platform can sometimes use national law as an excuse -- saying, I am complying with something that was imposed on me nationally -- but this is not sufficient.  We argue that it should be a legitimate law, so it should come through a democratic process, and substantively it should also (?) that it is not disproportionate to the social needs that it is supposed to address.

     So leaving that aside -- you can look into more detail in the documents -- we will also present this over the next two days at the main session for Dynamic Coalitions, where we're seeking validation by the broader IGF community.  I will start by simply giving you a couple of introductory remarks about due process, which is another lawyerly term whose meaning is maybe not apparent to everyone.  And then I will leave the floor to Luca and Primavera to talk about privacy and freedom of expression.

     So with regard to due process, what do we mean?  In the document we refer to at least three basic values.  The first of these values is clarity, and this is what we try to foster with terms of service.  If you don't know exactly what the law is, you have problems complying with the law.  So first of all, as a user it's important to know what the law of the platform is.  This is the first value.

     The second is a right to an effective remedy.  So you should always have within the platform the opportunity to have an appeal or review of the decisions that are being taken concerning you.

     And the third one is the right to be heard.  So there should be a consultation whenever the platform takes a measure that can impact you, not only individually but also broadly as a user of the platform, for example, through changes of contractual terms.  So in doing that, we divided the section on due process into two parts.  One is amendment and termination of contracts, and the second is adjudication.  Amendment and termination of contracts mainly refers to the need to have meaningful notice before there is any significant change in the terms of service or any termination of the service. 

     And secondly, with regard to adjudication, platforms should, as was also mentioned by the previous speaker, provide alternative, quick dispute resolution mechanisms in order for users to be effectively protected.  However, these mechanisms should not replace the courts of law.  They should be only an addition, not a substitute.  So in particular, platforms should not impose waivers of access to courts or class actions.  This is again another best practice that we encourage, although it's not strictly required by law.

     I'm looking forward to discussing this in more detail over the course of the next two days.  Please pick up a copy if you haven't, and I'll pass it over to Primavera.

     >> PRIMAVERA DE FILIPPI: So with regard to privacy, we have data collection: the platform operator should limit the collection of personal data to what is directly relevant and necessary to accomplish a specific, clearly defined and explicitly communicated purpose.  And in particular the platform shall specify every type of personal information collected, rather than requiring a general-purpose consent.  And if consent is withdrawn, then the operator should prevent any further processing of the individual's data by the controller.

     Then, after consent is given, the platform shall always provide a way for users to opt out at a later stage, and it shall also allow users to view, copy, modify and delete the personal information that has been made available.

     With regard to data retention, the platform operator should clearly communicate in the terms of service whether and for how long they store the personal data.  With regard to data aggregation, the aggregation of platform user data should only be done subject to express consent, and the purpose of the data aggregation, as well as the nature of the new data resulting from the aggregation, should be clearly stated in order to allow the platform user to properly understand the scope of the given consent.

     With regard to data use, the platform shall obtain consent in order to use personal data for the legitimate purpose and for the duration specified in the terms of service.  And it is also recommended that the platform specify in the terms of service that the processing of personal data is limited to the scope of the existing service, which means that the enrollment of the platform user in any new service should always require acceptance of new terms of service.  The platform operator shall also give users the possibility to demand rectification of any inaccurate data and to object to any usage of this data unless it is mandated by legitimate law.

     And then finally, the platform operator shall always permit users to delete their account in a permanent fashion, and if there is no legal reason to (?) the final storage of the data, the data processor should proceed to the permanent deletion of all or a portion of the relevant data in the platform user's account within a time that is reasonable for implementation. 

     And then, with regard to data protection vis-a-vis third parties, the platform operator should establish clear mechanisms for platform users to gain access to all their personal data held by a third party, to know to whom the data has been transferred, and to be informed of the actual usage (?).  The platform operator shall also (?) to preserve anonymity with regard to third parties to the extent that it is permitted by legitimate laws.  And it is also recommended that platforms enable end-to-end encryption of communications and other personal information in the context of both storage and transmission. 

     And finally, as regards the handling of platform user data (?), the platform operator should specify that they execute such requests only in the presence of a valid form of legal process, and should release periodic transparency reports providing the amount and the type of such requests for each jurisdiction in which they operate. 

     >> LUCA BELLI: Thank you, Primavera, for this.  So the third very important core part of these recommendations is freedom of expression, and we all know that platforms' terms of service may have a very deep effect on the possibility for users to freely seek and impart information, and that possibility may be limited, hindered, either by the platform's terms of service or by the applicable legitimate law.  So in the recommendations we tried to provide guidance on these two main issues, starting from the basic consideration that platforms are increasingly a sort of speech enabler.  They allow people to express themselves, but they are also enablers of information seeking.  So they allow people to find information, and they are increasingly essential intermediaries for finding information. 

     This part is divided in two.  In the first part we deal with limitations due to terms of service, and it is understandable that a private space may have limitations, but those limitations should be clearly stated in the terms, and the platform should also provide effective remedies to challenge a decision that may not be in compliance with the terms of service.  So the first point is to clearly allow users to know what kind of limitations they face when using the platform, and the second is to have an effective remedy when these limitations are abused.

     And then in the second part we deal with content take-downs or restrictions that are imposed by law.  This is something that may happen, but these kinds of restrictions should be necessary and proportionate to achieve the goals of a legitimate law, and Nico was explaining how that is (?) developmental.  Also in this case, when content is taken down, the user should always have the possibility of a remedy, to seek redress, both within the platform and, as a complement, through the existing court system, which should always remain available. 

     So without further explaining this part, which is very important but which I'm sure we will have the chance to discuss with our panelists, I would like to open the third segment of the session.  Since we are speaking about freedom of expression, I think it may be interesting to start with Patrick Penninckx.  We know that the Council of Europe has been doing excellent work on freedom of expression, and I would like to ask you, Patrick: what effect may terms of service have on Internet users' rights?  We know that the Council of Europe adopted a guide on Internet users' rights only last year; what is the role of states in enforcing and protecting these rights?

     >> PATRICK PENNINCKX: Does it work?  Yeah, it works.  Thank you, Luca.  Thank you also for the question and thank you for inviting the Council of Europe to this very interesting debate.  You already mentioned our Human Rights guide for Internet users.  You have to see that it is not just a recommendation; it is a soft-law instrument adopted by the 47 Member States of the Council of Europe.  In that sense it becomes a crucial guiding principle for our Member States.  It is also clear that the role of platforms, service providers and intermediaries is extremely important, and I would even say that our courts, including the European Court of Human Rights, are still seeking guidance as to where they should place themselves with regard to the role of these intermediaries.  That's why, and I see some people laughing, I think it is extremely important that we are able to give and provide more guidance on this.  That is one of the reasons why I decided we should cooperate with the Vargas Foundation to look into the terms of reference and terms of service together, in the light of this Human Rights guide for Internet users. 

     But we are not limited to that.  We need to go a step further.  On the one side it is extremely important that we do not create a multitude of courts and judges with regard to what should be put into and what should be taken out of these terms of reference, and on the other side we should not leave a completely open space in which the protection of Human Rights through the terms of reference of service providers and intermediaries develops in a multitude of directions.  So that is extremely important.  We are still at a searching stage at this point in time.  That is why this type of recommendation actually comes in very handy for us.  It is also very timely, because over the next two years we will be looking very precisely into the role of intermediaries and service providers.

     I just quote from our own Council of Europe Internet Governance strategy, which will be adopted by our Committee of Ministers in a couple of weeks' time, at the latest at the beginning of 2016.  One of the things it says, and I read: we will establish a platform between governments and major Internet companies and representative associations on their respect for Human Rights online, including on measures such as model contractual arrangements for the terms of service of Internet platforms, and principles of accountability and transparency to the multi-stakeholder community regarding the collection, storage and analysis of personal data, to protect, respect and remedy challenges and violations to them.  So here you are fully in line with what we intend to produce, and it will certainly be an element of consideration when we start working on this.

     As I mentioned, we will have a specific committee, composed of governments but also of experts, to focus on this and develop it over the next two years into very specific recommendations.  Any orientation on that will be extremely helpful.  I will leave it at that.  We can still discuss it afterwards in more detail.

     >> LUCA BELLI: Thanks a lot, Patrick, also for the very good news about the Council of Europe work on this precise topic.  So, Nico, I think you can --

     >> NICOLO ZINGALES: I find it actually very interesting that this can be used as a basis for further policy making, at least for thinking about new recommendations that may be adopted at a more institutional level.  In this regard I think it is also interesting to mention the (?) project we had as part of the Dynamic Coalition, which was looking into terms of service, and which is what you mentioned, I believe, when referring to the Vargas Foundation.  We were lucky to have a consultation on the implementation of the Marco Civil and the reform of the data protection law running at the same time, and we were able to provide some statistics, for example on the terms of service of the main online platforms, during the negotiation process at the Parliament.  I also found it interesting that you mentioned representative organizations for users being active in this debate.  In this regard we are lucky to have someone from a major consumer organization in Brazil, and I would like to know: how do you see these recommendations potentially playing an interesting, guiding role for you at the Brazilian level?

     >> VERIDIANA ALIMONTI: Thanks.  Good afternoon to everyone.  I would like to say thank you for the invitation to discuss the recommendations here.  I'm Veridiana, and I work at Intervozes, a Brazilian NGO that works to promote diversity and plurality in the media and on the Internet, as well as respect for Human Rights, and before that I worked in a Brazilian organization that worked directly with consumer rights.  So firstly, I would like to highlight the relevant work of the Dynamic Coalition in developing these recommendations, which can be adopted as best practices by companies and taken into account by governments in internal and external policies and legislation.  As the recommendations themselves stress, the digital environment is characterized by (?).  Increasingly, the participation of these platforms affects individuals' ability to develop their own personality and engage in a substantial amount of social interactions.  This is happening more and more through global platforms with global users, which raises the need for the establishment of global values and standards, based on participative processes and of course having Human Rights as a fundamental, common and minimum ground.

     About this, even though we are in international-level discussions, I think that national experiences and concerns can help us deal with this international challenge.  One important regulation in Brazil is the Consumer Defense Code, which ensures the right to information as a basic right.  Choices cannot be made, consent cannot be provided, services cannot be properly used if there is not an absolutely transparent relationship between the user and the service provider, from the moment that this provider announces its services.  Another important pillar of this law is the acknowledgment of the inequality between the user and the supplier of products and services, considering abusive or unfair, and therefore null, obligations or clauses that give exaggerated benefits to the provider or the supplier. 

     This way of dealing with unfair and abusive obligations and clauses is important for us at the national level, because it does not matter whether or not they are written in the (?) contracts, which, willingly or not, is what terms of service are.  So it is important to have minimum grounds that companies should meet, and if they don't meet them, we have these grounds, which can also be enforced through national and international legislation, to give us this kind of enforcement.

     On the more specific proposals for the three themes or issues the recommendations focus on, I would like to bring a problematic (?) experience to comment on the freedom of expression provisions.  We have in Brazil a strict copyright legislation, with few exceptions and limitations to copyright protection, far from the access-to-knowledge concerns that we discuss in forums such as the IGF.  Moreover, many times the argument of copyright violation is used to justify the removal of critical content that uses, for example, a company's (?) production to criticize its editorial line. 

     When the Dynamic Coalition's recommendations deal with restrictions to freedom of expression, the main criterion, as was said, is that these restrictions shall be explicitly defined by a legitimate law, and there was an exercise to evaluate whether this law is legitimate or not, not only by the fact that it is or is not written in national legislation, but also by whether the process of approving it was a democratic one. 

     But even so, this evaluation of legitimacy is subject to different references, even regarding national legislation, depending on the level of this analysis.  In order to give more elements for this analysis, I would suggest that it would be important for the recommendations to mention copyright fair use standards, to face national legislation that does not take this into consideration.  Regarding privacy, the recommendations address many important concerns with guarantees that we still don't have in Brazil, even with the Marco Civil, considering that we are late in approving our data protection law, despite significant advances in this direction this week.  It is interesting to stress that the Marco Civil introduced a purpose principle into our legal system.  However, it is not simple to enforce this principle in the face of generic justifications of better provision of current or future services. 

     In this sense, and concerned with the concretization of the purpose and necessity principles, I think it would be important for us to discuss whether this part of the recommendations should be a "shall" or a "should", because I think this is the core of the concretization of the purpose principle and the necessity principle.  Despite these brief comments, I reaffirm my congratulations for the fantastic work done, which is important for us. 

     And to finish, just to use the floor that I have here, and very briefly, I wanted to comment on what happened yesterday, when two members of my organization were taken from the IGF for raising banners criticizing the Free Basics/Internet.org project and defending (?) during the opening ceremony.  I think most of you saw it or know about it.  I would like to inform you that this decision was reversed this morning, after negotiations that showed the disproportionate character of such a reaction, and that although we are on UN territory and these are procedures for UN meetings in general, of which we were not aware, the IGF is different from high-level conferences among states' representatives, and we think the right to demonstrate should be part of this democratic, participative and multi-stakeholder process.  That's it.  Thank you.


     >> LUCA BELLI: Yes, certainly, and thank you very much for the comments on the recommendations.  They will duly be taken into account.  Since you mentioned fair use, I'm thinking of moving to the other side of the table, because so far we have been speaking about platforms but we have not heard the voice of one of them, and this is a major improvement from last year, when we didn't manage to get any representation from the platforms.  So Marcel Leonardi, it's a pleasure to have you here from Google, and I would like to ask you: do you think that terms of service can also, in a way, serve as a supplement to transparency reports in explaining how you carry out your specific responsibilities?  For example, in the case of take-down requests, how you assess whether there is fair use, and similarly, with regard to the right to be forgotten, which criteria are going to be taken into account.  It would be nice if users could know them.  Now, there is a catch there: if you have everything in the terms of service, they become very long.

     >> Right.

     >> LUCA BELLI: I believe that you can also achieve a midway by having expandable, you know, dropdown terms of service, or by referring to other documents, but I would be interested in hearing what you think.

     >> MARCEL LEONARDI: Thank you.  It's an honor to be here at yet another IGF.  On behalf of Google I'd like to thank the panelists for inviting me.  I'm really glad that we avoided an all-male panel, for starters -- yeah, and of course I really appreciate the amount of work that was done to get this project off the ground.  This kind of work is really a massive undertaking, not only this particular initiative but all the other initiatives we have had.  I see (?) over there with his Internet projects; the Ranking Digital Rights project that was presented earlier today and several other similar projects really help companies like Google to actually understand what the concerns of their users are.  That's why I slightly disagree with the idea of being on the other side of the table, because I believe we are at the same table.  Sometimes we share some views, sometimes we quarrel a little bit.  I'll carry on to what you asked and then I'll delve into some of the comments that I have on the (?).

     It's very hard for a company to draw very specific lines on content takedown, for example.  What is really complicated for any company, especially Google, operating in environments across the world -- I'll take for example Brazil, which is obviously my home turf -- is that the more generic the law tends to be, the harder it is to draw any kind of specific line, and it's the same idea with terms of service implementations.  People may or may not be aware that, for example, Brazil has criminal provisions on defamation, on offense to the honor of a person, that kind of thing, and what companies like Google face on a daily basis is essentially a dispute over what exactly those terms mean.  So we get, for example, court orders saying that criticism of a particular political figure, or people shouting at each other online for some reason, is offensive or defamatory.  But then we have judges saying this is, and this is not.  I guess that translates a bit to how terms of service operate as well.  Just as you said, we try to create specific guidelines to the extent that (?), but obviously they are always open to interpretation. 

     So the way this works internally, at least speaking for Google, is essentially that teams look at each and every request for content takedown -- take the example of content takedowns -- and try to ascertain whether the request is within the spirit of the terms of service but also bound to what local legislation provides.  And I guess that's part of the key challenge that companies face in that sense, and that's why work like yours is so important.  You are trying here to establish an international framework, which is precisely what is so hard to achieve in these discussions.  Essentially, no company wants to be put in a position where it needs to decide whether complying with local legislation also means throwing their users into a dungeon and into jail.  All of you may remember a case from about ten years ago, what happened to Shi Tao in China, for example; that's exactly the kind of scenario I'm talking about.

     Now, if you'll allow me some quick comments regarding the work that has been done -- and again, I really appreciate the opportunity to comment and congratulate you again on this massive undertaking -- the only point I would raise as part of the private sector is to be careful when suggesting this not to look only at the big platforms.  I see parts of this document emphasizing, for example, how online platforms are akin to public spaces.  I see parts of this document mentioning all of the responsibility that these actors should bear, and that's all well and good when you're talking about Google, Twitter, Facebook, Microsoft, all of the gigantic (?), but remind yourselves that these kinds of guidelines would apply to any new platforms as well, and sometimes you may create so many restrictions from the get-go that you are actually regulating innovation out of the space.  So even though these are not regulatory provisions, they're just recommendations, bear that in mind, because it is obviously something that needs to be addressed.

     One particular example I'd like to mention: let's say some of you disagree with the basic business model of the online world currently, which is essentially lots of free online services based on advertising, on behavioral advertising.  The fact remains that if the business model of the online world were to change from day to night, or night to day if you prefer, to no longer provide these free services -- if everything was charged -- data collection, data aggregation and data usage would still be just as necessary.  Why?  Because of the way this data is combined, the way this data is used to innovate, the way it's used to improve services, to create new platforms and all of these things.

     I did see on the Web site that some comments were provided by the private sector, and I guess my concern is that I didn't really see them reflected in the final output.  But that's a different matter.  In any case, I'll be happy to take any questions.  I know people are anxious to ask them.

     >> Marcel, I want to clarify: I appreciate what you're saying, and we did try our best to incorporate all the comments.  I think the major gap we had was that we were requiring express consent for basically any further use of the data, whenever there is a change in the purpose, and this might appear to be in contrast with some recent data protection laws that allow processing of data not only on the basis of consent but also on the basis of a legitimate interest of the controller or of a third party.  So we did put that in.

     But it's interesting, because the new proposal requires this legitimate interest actually to be identified, so this is something that I would like to ask everyone on the panel, starting from you: do you think that in a big data world it is indeed possible to list in the terms of service the categories of parties with whom data are going to be shared and the uses that can be made of the data, even if this requires changing the original purpose?

     >> LUCA BELLI: Marcel -- sorry if I interrupt you, but before -- keeping that question in mind, can we also have maybe two questions from the floor so that we can all have -- okay.  Thank you.  Does anyone have any questions, comment?  If you have them please be welcome to -- do we have -- we have a roaming mic?  Could -- yes?  Could someone -- okay.  We have one there.  Perfect.  Could you bring the --

     >> PARMINDER: I'm Parminder from IT for Change, an NGO based in Bangalore.  A very interesting panel, and first of all a very useful initiative, thanks to Luca and others.  I think there is great value in this kind of initiative.  It's important to nibble at the corners and make progress, but I just want to point to one structural fact at the base of everything else.  I heard you say one of those recommendations is that no data can be shared other than for the purposes which have been declared, et cetera.  We should realize that these are extreme statements -- in a positive sense for me, but probably in a negative sense for what is the engine of today's Internet economy.  You're talking about taking away the very basic logic of it, and I'm very happy Google did respond to the fact that if you don't do it this way, then you need to figure out which other way. 

     But I'm just pointing to the fact that we are dealing not just with a list of rights but rather with the structure of the new economy as we would want it: what sacrifices or compromises we are ready to make, and what kind of value different new possibilities can give.  It's possible that in a centralized manner data can be used in a (?) for which it needs to be used, and there's enough value created, and we make a compromise.  But I want to connect the rights issue to the very heart of the new economic structure, and therefore you are up against very powerful forces.  It's not just signing up a (?).  Before I end, I could say I was very impressed by Joe going to the heart of the matter of the Internet, because it's a (?) technical threat, but this is our Internet.  You want to take it, take it.  Otherwise you are denied the Internet, and we all say no, no, no, we want the Internet. 

     So then it is said: okay, this is the Internet.  It's not a fact, because the technical systems allow us to have other kinds of Internet.  We are organizing the Internet Social Forum, which is a forum of the World Social Forum, which says another world is possible; and the theme of the Internet Social Forum is that another Internet is possible.  I think that to at least accept those possibilities makes us think in different, technically fragmented manners.  Thank you.

     >> I think Chris had a question.

     >> And of course the Balkanization of the Internet, we were having an exchange about this.

     >> Sorry, Chris, can you just introduce yourself for the record?

     >> CHRIS MARSDEN: Oh, I thought you had done it.  Chris Marsden, University of Sussex.  The Balkanization issue of course goes back to NSFNET being privatized and people using CompuServe and AOL and so on.  So this is very long-standing.  What I would really like is a platform that I could sign up to, in my case as a European citizen still, hopefully, but also I think for others.  A platform that was willing to respect the ICCPR, or willing to respect the European Convention on Human Rights, and that's very difficult.  In most parts of the business world, if you want to work in a country you conform to the country's laws, and it's very unusual that we are still -- we're now, I believe, 17 years on from the case of Felix Somm in Germany, 15 years from the Yahoo! France case, and it's still very difficult to sign up to platforms that will actually observe the laws of our own countries, which are the internationally recognized Human Rights standards that we would hope we could get.

     So I want to support Joe: I would like to see an alternative that actually does respect those Human Rights, and it is very difficult to find one.  Comments would be welcome.

     >> LUCA BELLI: Thank you for the questions.  Maybe we can start with Marcel's reply, and then the other panelists can respond.

     >> MARCEL LEONARDI: The only honest answer to your question is yes, of course companies can do more.  Of course companies can do better.  Of course they can disclose more information in terms of service.  However, for one, there hasn't been any prior pressure to -- there we go.  For two, there are obviously some commercial concerns around that.  I mean, what level of detail do we want to go into?  Does that border on trade secrets?  Does it go into partnerships that have to be kept secret because of confidentiality agreements?  So of course there's probably more that can be done.

     However, another thing that's worth asking is: would that suffice?  I guess the general criticism -- and you didn't really do this, which is very interesting -- is basically that you're still acknowledging that terms of service are a useful tool, right?  Essentially you want more transparency, you want to know better what companies are doing.  That's a great thing.  We may dislike them, we may not agree with them, but several laws, Brazil's included, state that companies are allowed to collect and use data as long as they don't violate any legislation, as long as there's no prohibition on the use of the data, and as long as it's disclosed in the terms of service or the privacy policy.  So essentially I think that's a very good approach, in that it allows companies to better disclose that kind of information.  So I guess the major concerns are: one, there hasn't been much pressure, so more can be done.  And two, we just have to be careful where we draw the line.  But moreover, is it really useful?  Would people really be more comfortable if that information was widely available?

     >> PATRICK PENNINCKX: Well, I believe that the pressure may be coming.  That's why in the Council of Europe open forum you will see that we have invited Max Schrems tomorrow, and I think the world, just like after Snowden, is not the same after the Schrems decision.  I think it's important that we really take that into account.  Obviously companies have, first of all, an intrinsically different goal.  That we understand. 

     And I think that when we are going to look into the role of the intermediaries, we have to clearly take into account that business perspective as well -- the perspective of businesses and of their associations.  We really need to take that into account, and maybe we will not go as far as the Dynamic Coalition is going, but what is clearly important is that our citizens, who are becoming more and more vocal and more demanding on these issues, keep a close eye on how their personal, private data are going to be dealt with, what is going to be done with them, and that there is clarity about it.  And I think indeed, maybe the intermediaries have had a little bit of a free hand in that.  I don't know.  I think it was something like three seconds that people spent on reading terms of reference when they -- it doesn't matter. 

     I think it's crucial that we, as an international community, are able to provide a number of safeguards that are otherwise not provided.  And that is what the Council of Europe as an international institution stands for.  We have to not only protect but also oblige our states to go in directions that will allow better self-regulation of providers, and that is the way forward.  And I think an initiative such as this one rings the alarm bell, and that needs to be taken seriously into account.  Thank you.

     >> LUCA BELLI: I think Joe also had a reply.

     >> JOE CANNATACI: Yes, thank you, Luca.  I'd like to come in on perhaps three different angles.  Angle one is a point of information.  Here I'm not talking on behalf of, or with results from, the mapping project, but following something that Patrick has just said, I want to refer to another project we carried out together with at least one friend and colleague in this room, called the CONSENT project.  In CONSENT we actually went out and spent three years trying to find out what consent is and whether it works.  Patrick has just said that people spend three seconds on average reading terms of service -- when they read them.  Because our research in CONSENT and some other projects suggests that between 80 and 85% of people don't read them, and of those that do, less than 11% understand what they read.  So you are looking at a tiny fraction.  So when we're talking about terms and conditions and terms of service, we need to ask: is this meaningful consent?  And to my mind, everything we have found suggests it is not meaningful.

     The second thing that I would like to draw attention to -- and here I will now put on my hat as Special Rapporteur for privacy -- is that when we talk about business models, it is not a simple subject at all.  I think that we have to be careful what we wish for, right?  Because, and I say this with the greatest of respect to our colleagues from Google here, if you look at an operation like Google and you see how it is providing benefits and access to information across the planet, the minute you start thinking of seriously amending the business model, you are going to have an impact on people who can afford to pay less and people who can't afford to pay at all. 

     Now, I'm not saying let's not fix the business model.  In fact, I'm saying the opposite.  I'm saying let's work together with people like Google and other suppliers, big and small, to understand what the impact is.  I think there is a Human Rights impact assessment which needs to be carried out, but when it comes to Human Rights there is also an accessibility impact assessment which needs to be carried out, and a fair information practices assessment which needs to be carried out.  And I think this is important.  I think that one of the things which has been missing is a structured dialogue between the companies and with the companies.

     Now, companies in many cases -- in most cases -- are there to make a profit.  In this sense you can understand that, in discussing their business model, there might be legitimate concerns about discussing those parts of the business model which give them a competitive edge.  Fine, fair enough.  Let's deal with that.  But that doesn't mean that we don't discuss it.  It means we discuss it in a different way.  What is worrisome too is that these things take place in fits and starts, which is why I'm so pleased that we have had this second meeting here and why I look forward to the next ones, because if this is part of a structured dialogue, I think that structured dialogue can eventually lead to better places and better spaces.

     And the third point I would like to make is on the role of organizations like the Council of Europe and others, which can facilitate a structured dialogue on a regional basis, and at the same time look at how that can be globalized in a way which is going to benefit everybody.  I could go on at length, but I'd like to stop here.  Since we have Google and the Council of Europe in the room, we'll take them as an example, right?  If the Council of Europe facilitates discussions between various actors and stakeholders and with Google, on various parts of its business model in Europe, which would make it conform more readily to a Human Rights regime as we would like to see it take place there, then it makes things that much easier at the global level, because you're dealing with a global company.

     And I see a huge advantage in that.  We can then do the same with the multi-stakeholder environment in Latin America, in North America, in Southeast Asia, et cetera.  The different cultural perspectives, ladies and gentlemen, make things come together at a global level in a much easier fashion, because sometimes I am wary of a one-size-fits-all approach, which may actually threaten the very rights that we are setting out to protect, and impose useless restrictions on business models while at the same time not putting in the right restrictions.  I think Marcel has been very honest and said: hey, guys, there wasn't that much pressure; we had our eyes on other things.  Now not only is there pressure, but there are legitimate concerns. 

     And may I say it, and I'd like to end on this.  It is a great pleasure to me when I see companies use privacy as a USP, a unique selling proposition.  Come on, get it on.  Make your services more privacy friendly, and customers will vote for it.  Another thing that I am committed to is increasing awareness of privacy, so that users will have a choice between a product which offers less privacy but is cost-effective, and a product which offers more privacy and is even more cost-effective, and to that end possibly more profitable.  To a certain extent I'm also suggesting, Marcel, you might want to look at, you know, "stack them high, sell them cheap," because if you have a better product, you can have more people using your services and more money, and then you can afford to cross-fund services in those areas of the world, especially developing countries, where there's less money to be had.

     >> LUCA BELLI: Thank you very much, Joe.  I think I can speak on behalf of both co-coordinators in saying that the Dynamic Coalition on Platform Responsibility will be more than happy to facilitate this kind of multi-stakeholder dialogue within the IGF, but also beyond the IGF, perhaps at the Council of Europe (?) -- so I think we have another comment.

     >> PRIMAVERA DE FILIPPI: Yeah, just a brief comment about the question that dealt with big data and our capacity to put into the terms of service what can be done with users' data by the platform and by third parties.  I think we have to find this path, because it is the only path that can drive us to really concretize and enforce all the data protection principles that we have in international law and in national law, references that we are trying to have here in Brazil.  I don't think we should deal with this as an opposition; we have to try to work toward a common path, a common ground, between the challenges to business models and the protection of rights, and economic activity has to be built on top of this rights protection.

     And just another, important comment about whether the terms of service are meaningful or not.  I think they are a part of this process, and even the recommendations are not only about what is written in the terms of service that a user reads when they start to use a platform; some tools are also to be provided, such as tools that allow the user, for example, to redefine the extent of the availability of their data.  So that's why, for me, the discussion about terms of service has to be done in conjunction with the discussion about privacy by design, privacy by default, and how applications are developed to be able to regularly protect privacy.

     >> LUCA BELLI: We have time for a couple more questions.  Yes, one in the back, please?

     >> Since we started with five minutes of delay, we could finish with five minutes of delay, I think.

     >> Yeah, if you want.

     >> (?) Okay.  Thanks for always portraying the Balkans in the worst possible light.  Never mind, we are used to it.  We live in paradox.  I think that consent is something that is really worth analyzing and thinking about the ethics of, but when we come to corporations I have the feeling that the users (?) were in a land that was not regulated: the cowboys came and took whatever they wanted.  And now they say that this is the business model that we need to protect and save in order to maintain the Internet.  Otherwise we will lose, and we are, what, Indians on the reservations?  I think that we should not fear, and not put the business model up front.  It's a business model that is based on users' rights and users' information, which are a treasure, and big companies like Google and Facebook, which have an incredible amount of money, could experiment and invest in research, searching for an alternative business model more respectful of users.  So please avoid the mantra that the business model cannot be rediscussed, and the constant refrain that if we rediscuss it you stay out.  Because this is the reason why a user spends less than 18 seconds on the terms: it's a take-it-or-leave-it check-in box.

     >> Primavera wants to ask a question.

     >> PRIMAVERA DE FILIPPI: Yeah, it's a comment and I guess a question.  There has been all the discussion about consent, so whether the terms of use are actually clear and understandable and whether users are providing express consent.  And then there has been the question about the business model, so whether it is possible to identify alternative business models that do not rely on the collection and processing of personal data.  But it seems to me that there is another point that has not been mentioned, which is the growing desire of users to actually benefit from an extremely (?) and personalized service, which to some extent requires intrusive collection and processing of personal data, and which definitely leads users to provide consent.  They actually do understand it has an impact on their privacy, but otherwise they will not enjoy the personalization of the service.

     So my question then, to the whole panel, whoever wants to answer, is: is it actually fair to prevent platform operators from providing a more personalized service at the expense of what the user actually wants?  If the user is actually willing to forgo their privacy in order to enjoy a personalized service, (?) might actually lead to potential violations of Human Rights, can we actually prevent platforms from providing that?

     >> MARCEL LEONARDI: Okay.  I guess I'll try to answer that, and your point as well.  I'm a little bit confused, because when it comes to benefiting the user, obviously people vote with their feet.  Actually, people vote with their mouths, with their devices, right?  In a way, if you want more private services using the same platforms, most platforms, Google included, obviously have options.  If you go to myaccount.google.com you can configure the service to be as private as you want.  You will still see advertising, let's be very clear on that, but advertising will no longer be targeted.  You can erase any search history that you might have.  You can select that Google will no longer collect any of that information.  On the other hand, how useful would Google Maps, or any other mapping service for that matter, be without collecting the user's location?

     So I guess, to your point, you're right, it would be possible to develop different business models, but so far the market has not shifted toward those models because there hasn't been user demand.  I know that we are at the IGF -- usually this is a community very engaged in privacy discussions -- but let's face it, the average user simply doesn't care.  It's up to you guys, to companies, to everybody in this room to actually rise to the occasion.  It's different in Europe, it's different in Brazil, it's different in the U.S., but that's what companies are perceiving right now.

     And my second point, more related to the second question, is that I think we're seeing new business models cropping up.  You have plenty of services right now that are based on subscription, things like Netflix and Spotify, and yet that kind of validates my earlier point, which is that all these services, despite not having targeted advertising, at least in their paid models, still collect user data.  Why?  To improve their services.  So I guess we're getting there, but it's up to people to build those services and to choose to use them.

     >> LUCA BELLI: Thank you very much, Marcel, for this comment.  Sadly, although the discussion could go on for many hours, we are running out of time, so I would like to thank all the panelists for --

     >> To continue the discussion, there is actually another workshop on terms of service and their regulation taking place in Room 2, starting at 4:00.

     >> LUCA BELLI: Yes, and it will deal not only with platforms but also with infrastructure and content.  If you are interested, you're all invited.  Thanks for your attention.


     (session ended)