
IGF 2018 - Day 2 - Salle III - WS319 Regulations for a Neutral and Open Internet at the Age of Online Platforms

The following are the outputs of the real-time captioning taken during the Thirteenth Annual Meeting of the Internet Governance Forum (IGF) in Paris, France, from 12 to 14 November 2018. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 

***

 

>> LUCIEN CASTEX:  Hello, everyone.  I am happy to welcome you to the session on Regulations for a Neutral and Open Internet at the Age of Online Platforms.  We are very happy to introduce our speakers.  Paula Forteza, member of the French Parliament, who will bring her expertise as rapporteur on the GDPR in France and who is part of the working group of the constitutional revision to include a Digital Bill of Rights in the constitution.

      We have also Sebastien Soriano on my right, who is President of the telecom regulator ARCEP and who will bring expertise on data-driven regulation.

     Also we have Theodore Christakis, member of the French Digital Council, an independent advisory commission created to look at digital matters in France.

      We have Luca Belli, member of civil society, looking at how this can be an active movement of democracy.

     Finally, we have Cherif Diallo, Director of ICTs at the Telecommunications Ministry in Senegal, who will bring a state-centered approach.

     I give the floor right now to the first speaker.

     >> PAULA FORTEZA:  Hello.  Thank you, everyone, for being here.  I would like to share with you some ideas that are not yet consolidated.  It is more an invitation to comment that I would like to launch today.  I have been thinking about a concept that I like to call open regulation, or regulation by society.  It is the idea that on top of the action that regulators take, we need to give people the tools and the resources to become regulators themselves.  We need to give them the data.  We need to give them legal rights.  And we need to give them access so that they can understand what is at stake and can act by themselves.  We can look at this from several angles.

      One of the first is how we modernise the action of regulators, how we open up regulators themselves.  We can think about how they can start working with the tech ecosystem, for instance through open innovation, or by putting in place labs so that this interaction can take place.  In France, for instance, we have the CNIL, the agency that takes care of data protection, which has a very dynamic lab where all these things take place.

     We also need regulators who consult people.  In the U.S. it is something done very regularly.  I still remember when John Oliver did a segment on TV on net neutrality.  The result was that in a couple of days there were 150,000 comments on the FCC website.  That is the power we can have when people realise what is at stake and have a place where they can express themselves.

     Another way we can modernise regulators is by working through data: regulation through data.  On that, ARCEP is doing a very good job; I will let you talk about that.  We need data that regulators can access, but also open data, for instance on how algorithms function, such as recommendation algorithms.

     We have been working on the fake news bill in France, on a way to have transparency on algorithms without giving up trade secrets.  The idea was to open up aggregated statistics on the outputs of algorithms so that we can understand the biases and which kinds of content are diffused more than others.

     This is more about the modernisation of regulators.  But we also need to give people rights that they can activate.  This is something we have been working on, for instance, through the GDPR, and one of the most important rights that we put in place was the collective action, which has started to be used in France by, for instance (French name), and by the Internet Society.  They launched a new collective action a few days ago with a very beautiful website called IBASTE.org, and I invite you to go and see what they've done.  It is very clear and an interesting approach, because they promise a compensation of more or less a thousand euros to people who join.

     That is something we put in place when we were working on the GDPR in France: collective actions for data protection existed already, but we added the possibility to obtain compensation when harm has been suffered.  On this idea of rights, we also have portability, which is a very interesting concept that we still need to operationalize.  I think President Macron made a very good point on that yesterday, along with the tools that will make this a concrete right that we can use.

     We have also been working on a constitutional bill where we would put in place the big concepts and the big rights that govern our way of interacting in the digital sphere in French society.

     The last point I would like to make is that in order to build this sphere of open regulation, we need to stop talking only among ourselves.  We are always talking between experts and using very technical terms.  We are not reaching our citizens.  Yesterday Macron gave a very beautiful speech which I think was fundamental.  And yet we did not cover it at all.  We had one or two press reviews, but nothing more.  I think he is talking about the future of our society in a way that is really crucial and fundamental.

     So we need to talk to the public.  We need to choose the words.  We need to choose the images.  I still remember the video that Burger King did to explain net neutrality.  They were using Whoppers to explain how net neutrality functions.  It helped people really understand.

     So we need to find these kinds of images.  And we should also try to sustain and support people who mobilize with civil society.  We have, for instance, the Save Your Internet mobilization that took place at the European level.  That was an interesting way to explain what was at stake to the broader audience.

     We also have this new campaign by Tim Berners-Lee for the web, where he is trying to defend the idea of an open and decentralized net.  These are also campaigns that are trying to mobilize different kinds of actors, not only the experts that we see talking about these issues on a daily basis.

     So these are some of the points I have been thinking about in the last few months.  I would like to have your ideas and the reactions of the other members of the panel on these issues.  Thank you.

     >> LUCIEN CASTEX:  Thank you, Paula.  We indeed need to reach out to citizens.  I would like to give the floor to Cherif Diallo on the state government perspective.

     >> CHERIF DIALLO:  Thank you, sir.  Good afternoon, everybody.  The concept of net neutrality is that all Internet traffic must be treated equally, without any discrimination and regardless of the source, type of content, device, service, or application.

     So the first and most important principle is to ensure identical treatment of content for all providers, commercial or not.  This principle was enshrined in the Dutch and Slovenian laws and by the U.S. Federal Communications Commission in its order of March 2015.

     That means that users must be able to access the information and the content of their choice without discrimination.  Under this principle, providers must neither block content nor, conversely, guarantee preferential treatment to one actor over another.  For example, sometimes there are slowdowns in video speed depending on the Internet access provider.

     So the objective of net neutrality must be, first, to develop a very rich ecosystem favorable to freedom of expression and information access.

     Secondly, to promote the freedom of creation through an open Internet and to generate innovation through unfiltered contribution.

     Finally, net neutrality should help to preserve a digital public space that brings freedom and innovation.

     There are also, for example, regulatory aspects of net neutrality.  It is necessary to have a new legal framework which lays down a strong principle of neutrality, responding to concrete situations and taking into account the economic stakes.  Net neutrality should help to develop a great principle for the network, but also for access and communication services.

     The principle must be fair and must complement existing law.  The principle of net neutrality must aim at the future, because we have a lot of innovation every day.

     In Senegal we have a project of updating and harmonising the ICT legal framework.  In this project we have some action plans.  First, we must clearly state the principle of network neutrality in ICT laws: for end users, for Internet service providers, for infrastructure providers and operators, for content providers, and for application and service providers.

     The second principle for us is to supervise and strictly limit the use of some specialized services.  The idea is that providers must be able to offer some specialized services.  For example, telemedicine, which is a set of medical procedures and surveillance monitoring carried out by means of the telecommunication network.  The use of these specialized services is justified by the development of innovative services that require guaranteed continuity of service, whether for confidentiality, latency or other security reasons.

     In this way, the specialized services must remain the exception and must by no means, through their multiplication, reduce the quality of the public Internet, as we can understand.

     And then, in this legal framework of ICT in Senegal, we have to set up strict supervision of the exceptions, which is necessary to prevent the emergence of a multispeed Internet.  So we have to set up a clear definition of these services, which will allow providers to supplement the limits of the Internet best effort by using services whose quality of service is guaranteed.

     Another principle in the legal framework is to apply the principle of net neutrality to all network technologies.  Net neutrality is evolving as technology advances.  So it is necessary to adopt a broader definition of net neutrality, beyond the physical network alone, in order to cover mobile networks, particularly in the face of so-called zero-rating practices, which offer certain services, for example YouTube or Facebook, unrestricted and not counted against the mobile data plan, while other services remain limited by that plan.

     We must also help to guarantee interoperability of standards, including for connected objects.  In this legal framework in Senegal, we are also trying to give ourselves the means to control and observe the application of net neutrality.  There are different practices deviating from neutrality on the mobile network, on one hand, but also on the fixed network, on the other.  Sometimes transit agreements and peering agreements negotiated between infrastructure providers and some operators or content and application providers do not fit exactly the principle of net neutrality.  These agreements must not constitute the Trojan horse of an attack on network neutrality.

     It is necessary for us to assure neutrality and to ensure that subscribers are informed of the technical performance of Internet access offers, including traffic management practices in some situations of differentiation.  This legal framework in Senegal must also help to impose transparency on Internet access providers and operators, through auditing by independent organisations.  The mobile network and the connected-object network must be subject to the same fairness and quality-of-service controls as fixed networks.

     Finally, in Senegal we are trying to give ourselves more means to control and observe the application of net neutrality, by giving the regulatory authorities a right of strict scrutiny over these agreements, to ensure that they are not a threat to the free circulation of content, especially in the case of powerful actors.  On traffic regulation, we will try to federate feedback on the quality of service produced by the various categories of actors, civil society, developers, technicians, and internal users in companies, aggregated by the regulatory authority in open data.

     Those are some principles we are trying to fit in, in order to update and harmonise the ICT legal framework in Senegal.  Those are my last words.  Thank you very much.

     >> LUCIEN CASTEX:  Thank you, Cherif Diallo.  I do like the idea to adopt a broader definition of net neutrality.  I would like to now give the floor to Sebastien Soriano who can discuss regulation.

     >> SEBASTIEN SORIANO:  Thank you very much, and thank you to those who proposed that I intervene on this panel.

     Just before presenting a few things: if you are tired in advance of listening to me, you can also look at my Twitter account, because I published my official intervention there.  Maybe it can be easier for you to read.  And to make it more lively, I will not present exactly the same thing that is written down.

     So the question of the day, as I see it, is about Internet and regulation.  What do we do?  The first question is why regulate.  I think that this question is really in the hands of politicians.  It is not me as a regulator who has to say whether or not we have to regulate.  From yesterday we have a very strong assumption from President Macron.  His answer to the why is: I want an alternative to the Californian and the Chinese Internet.  That really is a perspective, assuming that neither of these Internets is democratic or truly decentralized.

     I think it is objectively clear: for China it is because of the state, and for the United States it is because of the big tech.  I will make it short.  That is the why.

     Then the question is what we want to regulate.  Here I will not go into details on that subject, because there are many debates: do we want to regulate platforms?  What is a platform?  Do we want to regulate social media?  What is social media?  Is Tinder social media?  This is a very important debate, but not the main debate of the day.  I will just mention that we have done work on a specific issue, which is devices.  We have shown that there are many restrictions on devices.  We consider devices to be the weak link of the open Internet, and we make proposals to open them up and give more freedom to consumers.  I invite you to visit the ARCEP booth; the teams there will report to you on this work.

     If you allow me, I will concentrate and focus my words on how to regulate.  Here I totally follow what Paula Forteza said about the fact that the end game is to empower society, not to have the solution fixed by the regulators themselves.  In other words, all the regulation that we design must have as its end point how to empower people, how to empower consumers.

     Let's be clear.  It is a total disruption in regulation.  As I like to say in French, (speaking French phrase), which in English means: if we want to regulate the disrupters, we have to disrupt regulation.

     So what does it mean to disrupt regulation?  First, we have to be clear about the fact that our classical tools don't work anymore.  What are our classical tools?  We define a rule and then we enforce it.  This doesn't work, for two reasons.  The first is that by the time we understand what is happening, write it down in law, write it down in a decree, and enforce it, what we wanted to regulate has changed.  A new hot topic has appeared.  The tools we developed are not useful to solve the new problem.

     So the first factor is time and the acceleration of the rhythm of innovation.  The second factor is the worldwide scale of the market players.  It is not easy.  I fully respect the French Parliament, but if the French Parliament votes a text that says I want Google or Facebook to do this or that, it is not so easy to make it real, because of the international footprint of the market players.

     So we have to invent new tools.  The first main solution that we are proposing at ARCEP, and that we have actually developed, I insist, in our telecom regulation, and which we think could be of interest in dealing with the Internet, is regulating with data, also called data-driven regulation.  You can also call this crowd regulation, if you prefer.

     What does it mean?  It means using data to influence the market.  Very simply.  I will give you one example of what we are doing.  We had a concrete problem to regulate at ARCEP, and this is written down in the paper I published.

     As a telecom regulator, we have seen that consumers in France are very interested in the quality of the mobile network, and especially the coverage of the network.  But they have no clue about which is the best operator for them.  So what we have done is, first, we have unbundled the data from operators.  In the 20th century we were unbundling the network.  Today we unbundle the data.

     Concretely, it means that we have extracted information from the telecom operators to have, on a standardized basis, their level of coverage.  We have put this information in open data.  Then, on top of that, we have worked with comparison tools, especially startups.  There is one comparison tool we have a partnership with; its name is Qosbee, Q-O-S-B-E-E.  If you don't know this application: you start it on your smartphone, it follows you for a week and produces a map of the places you spend time, where you live, where you work, the subway you take, the routes you take.  Thanks to this information, the application can match it with the information we have in the open data on mobile coverage and produce a ranking of operators.  Not a general ranking, not which is the best operator overall in France, but which is the best operator for you.

     It means, concretely, that we have done nothing except having information and putting this information at the disposal of consumers.  What happens?  Consumers are switching.  They are switching from the networks with less coverage to the networks with the best coverage, because when you have this information you look and say: okay, so the best operator for me would be Orange, for instance; who is my operator?  Oh, it's SFR, just as an example.  Oh, I can switch.

     So typically that is what we see.  This creates a virtuous circle, because it incentivizes the operators to invest, et cetera, et cetera.  You can use data to influence the market.  It is clear that you could do this with platforms, giving information about which platform gives you the best privacy, for instance, all these kinds of things.
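[Editor's note: the personalized ranking described here can be sketched in a few lines of code.  This is a minimal illustration only; the operator names, coverage figures, and location trace are invented, and ARCEP's real open data and Qosbee's real methodology are of course far richer.]

```python
# Sketch: combine open coverage data (published per operator and area) with a
# user's own location trace to rank operators *for that user*, not in general.

from collections import Counter

# Open data: fraction of good-signal coverage per (operator, area).
# All values here are invented for illustration.
coverage = {
    ("OperatorA", "home_area"): 0.95, ("OperatorA", "work_area"): 0.60,
    ("OperatorB", "home_area"): 0.70, ("OperatorB", "work_area"): 0.90,
}

def rank_operators(location_trace, coverage):
    """Weight each area's coverage by how much time the user spends there."""
    time_spent = Counter(location_trace)          # area -> number of samples
    total = sum(time_spent.values())
    operators = {op for op, _ in coverage}
    scores = {}
    for op in operators:
        scores[op] = sum(
            coverage.get((op, area), 0.0) * n / total
            for area, n in time_spent.items()
        )
    # Best personalized operator first.
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# A week of sampled locations: mostly at home, some at work.
trace = ["home_area"] * 5 + ["work_area"] * 2
print(rank_operators(trace, coverage))  # OperatorA ranks first for this user
```

A user who spent most of the week in "work_area" instead would see OperatorB ranked first, which is exactly the "best operator for you, not in general" point made above.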

     But honestly, and this is the conclusion of my paper, I am not sure that regulating with data will be enough, because in the end we are playing with big tech players, with big corporations that have developed nudge approaches.  It means that they give you the flavor and the impression that you have the choice, but actually you are very much directed to do certain things on these social networks and platforms.

     The nudge in itself is not good or bad.  My strong belief is that today, as regulators, our role is also to play with nudges and to be the architect of the choices of consumers.  I think this is really the future of what a regulator should do.

     And in that sense I like very much the proposal of a visiting academic from New York -- I don't remember the name of the guy, sorry; it's in the paper.  He made a video where he proposes this idea: the right to be represented by a bot.  What does it mean?  It means the right to have a piece of software that represents you online.  You give your parameters to the bot.  You give it several mandates.  The bot will interact with the platforms.

     This exists today.  Just think of when you are participating in an auction on eBay.  You feed in the parameter: I will bid up to ...  The same thing for Google AdWords, for your advertising to be on top of the search engine.

     So these bots exist today, but they are totally controlled by the online services.  Imagine that all online services had the obligation to open their APIs to interact with bots.  Not all kinds of bots, of course; we would have to create some status for these bots.  It would totally change the relationship between the platform and the consumer.  All the capacity, the complexity, would be dealt with by the bots.  You could also manage multi-homing with this without problem.  Imagine you are a driver: you could be active on several driving platforms at the same time.
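[Editor's note: the bot-with-a-mandate idea can be made concrete with a toy sketch.  Every class and method below is invented for illustration; no real platform API is shown.  The key point is that the mandate, here a maximum bid, is held and enforced on the user's side, not by the platform.]

```python
# Sketch: a user-side agent holding an explicit mandate and interacting
# with a platform through a (hypothetical) open bidding API.

class AuctionAPI:
    """Stand-in for a platform's open auction endpoint."""
    def __init__(self):
        self.highest_bid = 0
    def current_price(self):
        return self.highest_bid
    def place_bid(self, amount):
        # Accept only bids strictly above the current highest bid.
        if amount > self.highest_bid:
            self.highest_bid = amount
            return True
        return False

class BiddingBot:
    """Represents the user under a mandate: never exceed max_bid."""
    def __init__(self, max_bid, increment=1):
        self.max_bid = max_bid
        self.increment = increment
    def react(self, api):
        next_bid = api.current_price() + self.increment
        if next_bid <= self.max_bid:   # the mandate is enforced user-side
            return api.place_bid(next_bid)
        return False                   # mandate exhausted: stop bidding

api = AuctionAPI()
api.place_bid(10)                      # a rival bid arrives
bot = BiddingBot(max_bid=11)
print(bot.react(api))                  # bids 11 -> True
print(bot.react(api))                  # 12 would exceed the mandate -> False
```

Because the platform only exposes `current_price` and `place_bid`, the same bot could represent the user across several competing auction platforms at once, which is the multi-homing point made in the talk.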

     So I don't know to what extent this is feasible.  I will be happy to work with you on that.  But this is typically the kind of idea I think we really have to work on.  Thank you for your attention.

     >> LUCIEN CASTEX:  Thank you, Sebastien.  It is indeed interesting to be able to use data as a tool to empower society to act.  I would now like to give the floor to Luca Belli, on democratic and collaborative regulation.  Thank you.

     >> LUCA BELLI:  Good morning to everyone.  Thank you very much for inviting me.

     So, just to provide a couple of words about where I work: I am Professor and Senior Researcher at the FGV Law School in Rio.  It may not be well known here, but it is the most influential in Latin America, a private university also considered the seventh most influential think tank in the world.  It was at FGV, together with the Brazilian Internet Steering Committee and the Ministry of Justice, that the project of what then became the so-called Brazilian Bill of Rights was conceived at the very beginning.  It is de facto a law.  It is not a Bill of Rights or a constitution; it is a law.  But it is a very relevant law, because it is the first law in Brazil that has defined rights and obligations for individuals and corporations online, particularly with regard to platforms, application providers and, of course, Internet service providers.  It has the characteristics of a constitution, if you want: there are first principles, specified by more detailed norms.  But what is particularly relevant about this norm is that it has been produced through an extensive participatory process.

      Actually, it was first presented as a project with an open consultation.  This was in 2009.  In 2011 it was sent to the Congress and became a bill; before that it was what in Brazilian law is a prebill.  It was discussed by the Parliament, the Congress.  There was no agreement on the text.  Then the Snowden revelations triggered what one could call the aha moment, where everyone in the world said:  Aha!  Now I understand how things work online.

     That was the moment that triggered the adoption of the Marco Civil, signed by former President Rousseff in 2014.  The data protection and net neutrality provisions were the fruits of two different consultations.  This means that producing legislation through participation is feasible.  It is not something unfeasible.  Actually, the example of the French (French phrase) was, I think, highly inspired by the earlier Brazilian example.  It is very good to see that these kinds of good practices are actually spreading around the world.  It is something feasible.

      Of course, as any democratic exercise, it requires a lot of effort and resources.  It is not easy.  It may be hijacked.  We have seen, particularly in the last years, that democracy also has vulnerabilities.  It is like a programme: there are vulnerabilities that can be exploited.  One has to understand what is the anti-virus that allows democracy to survive and not be contaminated and destroyed through the exploitation of its vulnerabilities.  Democracies are not perfect.

     So the principle of net neutrality is one of the main principles listed in Article 3 of (non-English phrase), the Brazilian Bill of Rights.  But I think the Brazilian example, the European example and, to an extent, the recent U.S. example demonstrate that having legislation is nice but not enough.  I totally agree with what Sebastien was saying about the need for regulation, and I would call it distributed regulation.  Let me give you the Brazilian example.

      Article 9 of the Marco Civil prescribes nondiscrimination of Internet traffic.  Article 9 is taken up in the decree, again saying that Internet service providers cannot give privileged treatment to any kind of packets, any kind of service or any kind of content, and cannot give prioritised treatment to the services that are vertically integrated with them.  If a provider has an application that provides video, that cannot be prioritised or privileged in any way.  Well, at the same time as these wonderful provisions, we have very widespread zero-rating practices that are tolerated by the regulator.

      Many people argue that zero-rating practices are a violation of net neutrality, because they privilege some kinds of applications and content.  If you look at the regulation and at the legislation, they are forbidden, also because there are some flagrant cases of providers giving subsidized access to their own services.  That is contrary to what is written in the law.  This is not to say that I simply want to criticize the country where I live.

      I want to say that having a nice law is very good.  Having a nice decree is very good.  But then you may have a regulator, or regulators, or a regulatory framework which is not fit to implement the law, or to understand how the law should be implemented.  What is very telling is that the antitrust regulator issued a technical note to justify why zero rating was something feasible and compatible with regulation, following a previous note from the telecom regulator.  This was the decision taken as a result of a case where a public prosecutor was arguing that having access only to Facebook, Twitter and WhatsApp was a flagrant violation of net neutrality, that it was not compatible with the regulatory framework that we have.  I strongly agree with this.

     The reply of the regulator was that by using Facebook, Twitter and WhatsApp freely, one was enjoying free access to the Internet.

     Now, they do not understand that Facebook, Twitter and WhatsApp are not the Internet.  When you have subsidized access to them, you do not have access to the Internet.  You have subsidized access to three applications.  Not understanding this is, for me, a flagrant lack of understanding of how the Internet works.

     Now, I am not saying this only because I have been working a lot on net neutrality regulation over the past years, nor because I think this kind of situation is a perversion that is going to limit freedom of expression and limit competition.  I am saying this because it is increasingly evident that this is going to destroy democracies.  What you also have to understand is that zero-rating practices, subsidized access to the Internet, are widespread among people who have lower financial capacity and, coincidentally, lower literacy.

     You have almost 60 to 70 percent of the Brazilian population on prepaid Internet access.  I mean, they are buying 20 reals, five euros, of data and having Facebook, Twitter and WhatsApp free.

     So the Brazilians who have the least means only have free access to the three maximizers of fake news, and they have to pay if they want to check whether the news they receive is true or not.  And this is considered by the regulators to be the free enjoyment of the Internet.  This is an obvious misunderstanding of what the Internet is and an obvious understatement of what the consequences could be.

     We have elaborated, with the coalition that we have here at the IGF, which I have chaired since 2013, a map showing where zero rating is available in the world.  If you just put ZeroRating.info in your browser, you will find it.  You find some countries in blue, where net neutrality and zero-rating laws exist and there is also data protection.  You find some countries, typically in Latin America and Europe, where there is already some net neutrality regulation, some zero-rating regulation and some data protection regulation.  Those are in yellow.

     Then you find the majority of African countries, which are orange or red: there is either no net neutrality regulation or no zero-rating regulation, and almost never data protection regulation.  Why am I associating those three?  Because with zero rating we have to come out of the classical logic of net neutrality, where the provider wants to privilege its own vertically integrated services, to let you use one service and not another.  It has become so lucrative to have people's personal data that it makes economic sense to pay to sponsor, to subsidize, access to a given application in order to extract your data.
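[Editor's note: the country classification described here can be approximated as a small function over the three regulatory dimensions mentioned.  The thresholds and color mapping below are an assumption for illustration; the actual scheme used by ZeroRating.info may differ.]

```python
# Sketch: classify a country by how many of the three regulatory frameworks
# (net neutrality, zero-rating rules, data protection) are in place.

def classify_country(net_neutrality, zero_rating_rules, data_protection):
    """Return an illustrative map color from three booleans."""
    in_place = sum([net_neutrality, zero_rating_rules, data_protection])
    if in_place == 3:
        return "blue"    # full framework on all three dimensions
    if in_place == 2:
        return "yellow"  # partial framework
    if in_place == 1:
        return "orange"  # mostly unregulated
    return "red"         # no framework at all

print(classify_country(True, True, True))     # -> blue
print(classify_country(True, True, False))    # -> yellow
print(classify_country(False, False, False))  # -> red
```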

     Now, guess what?  Almost everywhere around the world, Facebook services are zero rated.  And the African continent, where you have no data protection and no net neutrality or zero-rating regulation, is also where these sponsored-access partnerships operate most.  We are allowing companies to massively collect data, and giving people with low literacy access only to those applications that massively collect data and can spread fake news, while knowing, for at least eight years, that personal data are casually shared with actors who are not always in good faith.

      And we are not stopping this.  We know it.  I think now is the moment to think not only about which norms we want to have, but also which kind of Internet we want to have.  Do we want to have a couple of intranets dominated by a couple of super corporations dictating how we vote?  Okay, fair enough.  If we want to come back to a generic Internet, an Internet which is general purpose and where people can create innovation, then we have to change our mindsets.  And with this provocation I will finalize my talk.

     >> LUCIEN CASTEX:  Thank you, Luca Belli.  Indeed, it is important to understand what the Internet is as a first step to being able to really regulate it.

     I will now give the floor to Theodore Christakis before the debate part of this workshop.  Thank you.

     >> THEODORE CHRISTAKIS:  Thank you very much, Mr. Chairman.  I would like first to say that, like my colleague, I am a Professor, a Professor of International Law working on cybersecurity, data protection and artificial intelligence issues.  I am also a member of the French National Digital Council, and I would like to present some thoughts precisely on the issue of regulation, viewed from the perspective of a lawyer and of our work within the French Digital Council.

     First, we have said a lot of bad things about platforms.  Let's start with something positive, which is that platforms have positively disrupted our lives.  They were very effective in easing access to information and in easing access to goods, including cultural goods; they opened new business opportunities; and they have brought real benefits for consumers and for businesses.

     This being said, we should immediately add that their development has raised a lot of concerns in various fields, including the issues of accumulation of power and information that have been mentioned several times.  Concerns about market dominance and side effects on competition and consumers, including terms of use that are not easy to read and weak transparency.  Side effects, especially from a human rights lawyer's point of view, on human rights, with several issues: data protection and privacy issues, of course; fake news; concerns about algorithmic discrimination and bias; and the issue of illicit content, which covers a lot of different things: terrorist content online, harassment, abuse of children, intellectual property and copyright issues, hate speech; we could add several things.

     Is there a need for regulation?  I will quote what Brad Smith, Chief Legal Officer of Microsoft said:  We need a new generation of laws to govern a new generation of tech. 

      The issue is clear even for the tech industry.  What happened a few months ago in France is that the Minister for Digital Affairs and the French government launched this (French phrase), the French convention on the issue of digital regulations.  And it was an extremely interesting experience, because we met several times in different groups around the table: deputies, including one who is here, representatives of different ministries, the independent regulatory authorities, including our cybersecurity authority, and all the other members of the French National Digital Council.

      We tried to rethink the issue of regulation, to propose ideas eventually for a new approach, to explore different competencies, methods, and tools.  What was extremely interesting, especially when we were hearing the independent regulatory authorities talking, was the difficulties that they face every day for several reasons, including in order to understand how platforms operate.  The issue of time that has been mentioned so many times, how to open the algorithmic black box, and all these issues.  It was very clear that there are some problems.  A lot of ideas were proposed; I will not present them here in detail, and a lot of other things were also mentioned by other participants.  But the French Digital Council considered, for example, that the principles of transparency are very important in order to reach a fair and open digital environment, one that favors the diffusion of capacities across Europe and assures that platforms continue to bring benefits to our society.

      We proposed the development of new tools, including, for example, the ability to audit algorithmic systems through reverse engineering methods.  We proposed to build channels of collaboration between regulators, citizens and researchers through different means.  And there is also this proposal of the French National Digital Council to create a watchdog of online platforms at the European level in order to help promote fairness in their behavior.

      This being said, there were a lot of other proposals that I do not have the time to present here, because I would also like to insist on the point that in our mind regulation must be wise regulation, not wild regulation.  A wise regulation, yes.  Not a wild one.  (French phrase.)  Why do I say this?  Because everybody currently feels this need for regulation and for new tools, but this should not lead to a kind of tech-lash.

       I will end with this.  We need to make our Internet great again, but first we need to save it.  And a lot of ideas were suggested by the French Digital Council during the discussions.  First, and this is the point of view of the lawyer, before talking about new regulatory tools, et cetera, we have to assess what already exists or what is on the table.  All of you talked about the GDPR, which is a major regulatory tool, extremely useful, and which came into force a few months ago.  It entirely transforms the landscape in privacy and touches on cybersecurity issues.

       There are so many proposals right now, including at the European level, in order to tackle the problem of illegal content online.  The Parliament voted in favor of the copyright directive recently after a very heated debate.  There is a new regulation on the table being discussed about terrorist content online, and several other ideas, including at the national level.  We see in Germany, for example, the law on hate speech.  Another field is the field of assisting law enforcement investigations.  We have on the table the CLOUD Act developed in the United States and a lot of movement there.  We have the European online platform trading regulation on the table.  We have the work in the EU on algorithmic transparency.  Let's assess what is going on, understand, and try sometimes to clarify regulations.  A lot of texts are coming out of the EU.  Every time a new text comes out, we lawyers know that we have to work very hard, because they are sometimes 80 pages long with hundreds of references, and we are lost in regulations.  We need to clarify and simplify things sometimes and not just add, add, add eternally.

      Some major safeguards.  I will end with these.  There are some very important issues.  First, preserve innovation.  This is something very important.  I would like to quote here the former Head of Nokia, now head of the high-level expert group on artificial intelligence.  He said that Europe has to make sure that we do regulate when it is the right time, but that we do not do it prematurely, when it would actually create impediments.  He said that for the continent to be competitive, it equally has to avoid rushing into regulation concerning artificial intelligence.  He said, well, it is very good to propose broad horizontal principles for the ethical use of artificial intelligence, but for regulation, when it comes to norms, it could probably be better to proceed ex post, which would also give room for self-regulation, and we must understand when to regulate and when not to regulate.  A word of caution from a high-level expert.

      Also, building on this, regulation should be business-friendly.  This is something very important, especially for small and medium enterprises.  And it must be highlighted that regulatory conditions are very important factors affecting small and medium enterprises.  They need protection.  They still need protection.  The idea which was underlying the eCommerce directive in the year 2000 is still perfectly relevant for small and medium enterprises.  Probably it is not that relevant anymore concerning the big giants, but we need to think about it and to offer a level playing field for small and medium enterprises so that they will be able to develop.

      And a third word of caution, last but not least: protect human rights when we regulate.  This is something very important, and you quoted Tim Berners-Lee, who said it is extremely important to give people greater control over their data and a safe haven for debate.  Those were also the words used by the French Minister for Digital Affairs: give people greater control.  And all the speeches that I heard spoke about this end game of empowering people and societies.  By the way, we must highlight that France was the first government to sign up to Berners-Lee's new initiative, with this idea of a new contract for the Internet.

      When we are talking about human rights, the first thing that comes to mind is privacy and data protection.  Europe played a huge role, and the GDPR was disruptive from the point of view of entirely transforming the landscape of privacy protection.  It was interesting because we would go to international conferences and hear colleagues criticizing the GDPR, fearing the GDPR.  Now everybody seems to understand this is a very good thing after all.  I don't know if you have heard it recently:  Tim Cook a few days ago spoke in strong terms in favor of the GDPR and even called for a federal U.S. privacy law to match Europe's data regulation.  We understand more and more, to quote a South African lawyer, that a country not working towards the standards promoted by the GDPR is left out in the cold.

      At the same time it is not just about data protection and privacy; it is also about freedom of speech.  There is a new report saying that for the eighth consecutive year, global Internet freedom has declined.  This was explained by Freedom House by the fact that there is an increase in government efforts to control personal data, the use of fake news as a pretext to suppress dissent, and also the spread of China's model of surveillance to other countries.

      So we should think about it, because some regulations could affect freedom of speech, especially when we are talking about tackling illicit content on the Internet.  You know the debate that preceded the adoption of the copyright directive in the EU Parliament, with some concerns expressed, including by the academic community, about the potential side effects for freedom of speech.  We have the same debate in other fields.  It is extremely important to protect the rights, for example, of creators, or to tackle terrorist content and hate speech.  All of these things are important, but when we do them we must acknowledge the risks and try to mitigate them through appropriate checks and balances.

      So, to conclude, I think that we should adopt a series of good practices.  First of all, I entirely agree with all the speakers, including the first one: regulation must have democratic legitimacy.  It is not just a top-down approach.  We need a participatory approach.

      You talked about the experience in Brazil.  This is exactly what the French Digital Council is doing now, which is an open consultation on all these issues.  You can visit our platform and contribute to this movement, and we try to associate with this reflection business people, civil society, and all possible stakeholders.

      We should also proceed with impact assessment studies, including about the effect on business and innovation.  The EU is doing this very well.  Regulation should be preceded by very good impact assessment studies.  And from this point of view, and this is my final word, we are facing a lot of difficult questions with regulation when we talk about regulation at the global level.  In reality you will see that states have very conflicting views about regulations.  It is not exactly the same thing everywhere, and sometimes, for example, France might regulate differently.  You listened yesterday to the President of the French Republic speaking about this middle way between the two models that you have mentioned.  But the idea is: how are we going to reconcile all of this without undermining the inherently global nature of the Internet, without moving to a kind of Balkanized splinternet, which could be counterproductive?

      This is why I will end by saying we need to make our Internet great again, but first we need to save it.  Thank you.

     >> LUCIEN CASTEX:  Thank you, Theodore Christakis.  I would like to give the floor, open the room and the debate and have your perspective on the very interesting ideas that the Panelists discussed in this panel.

     >> AUDIENCE:  My name is Dwayne Winseck from Carleton University in Ottawa, Canada.  I want to ask the question if it might be a good time to give the concept of net neutrality a decent burial in favor of bringing back to life the idea of common carriage.

      I say that for a couple of reasons.  One, I think it has a much longer and more illustrious history and a firmer legal foundation.  I think it avoids the confusion of neutrality, because we know that technologies are not neutral.  People are starting to run the net neutrality idea up and down the layers of the Internet.  Yesterday, during President Macron's speech, when he said we ought not to fetishize this idea of net neutrality because there are things we need to defend, he was right on this.  If we were to go back and use the more robust concept, we might avoid these problems.  So that's one point.

      There is another point, and maybe we can come back to it later:  The idea of global regulation.  I wonder.  Platforms are global, but so too are banks.  Banks have subsidiaries.  We have HSBC Canada, HSBC Mexico, HSBC France.  We have global functional standards and norms, but those subsidiaries are also subject to national laws.  I would like to hear some thoughts on those two ideas.

     >> LUCIEN CASTEX:  We will take a few more questions before giving back the floor to the speakers.

     (CART captioner awaiting audio.)

     >> LUCIEN CASTEX:  Go ahead.

     >> (Speaker away from microphone.)

     >> LUCIEN CASTEX:  Okay.  I'll take one more question.  In the back?

     >> AUDIENCE:  (Speaker away from microphone.)

     I'm suggesting that people use ...

     (Audience microphones not working, captioner apologizes.)

     >> LUCIEN CASTEX:  I will give back the floor to Luca Belli and we will do a roundtable.

     >> LUCA BELLI:  I will try to reply to the first question addressed to me and if I have time a couple of thoughts on the others.

      Zero Rating is not something bad per se, just as traffic management is not something bad per se.  It is bad when it is misused for anti-competitive purposes or when it creates distortions that literally cannot be reversed.  In Europe you have Zero Rating, but you also have quite wide data caps.  You can browse the Internet freely, but you also have some zero-rated services.  The guidelines would suggest giving access to classes of services.  For instance, Deutsche Telekom in Germany gives access to all video services; that is much better than giving access to three or four services on a selected list.  Why?  If you want to know more, I published last year an article called Net Neutrality and the Minitelisation of the Internet.  Zero Rating can minitelise the Internet.  It can transform the Internet into a predefined network.  That is why and when it is misused.  Zero Rating is only interesting when you have very low data caps.  If you have very generous data caps, you have access to the Internet.  You don't need a free gift.  You have everything you want.

      It is interesting when you are a poor person and you cannot afford, as in the majority of African, Latin American, or Asian countries, to pay a high telecommunications fee.  You end up only using what is given to you for free.  But it is not really free.  You pay for it with data.  So you are being transformed into a free labourer, a free data producer, producing the most relevant and most lucrative asset in the world for a few corporations that are even being considered as philanthropies because they are connecting poor Africans and poor Latin Americans.

      Let's think about nudges.  It is not only about nudges.  The last 20 years of behavioral design have focused on creating addicts, not nudges.  The fact that when we wake up in the morning, the first thing we do is check our Facebook app, and when there is a little red dot we are almost over-excited because we have to click it and check the notifications, is because the behavioral designer has done a very good job and has created a trigger: when we see the trigger, there is a reaction.  That creates a habit.  Of course, that is how you create addiction, period.  That is how behavioral design works.

       I'm simplifying this very much.  If you are interested, there is a very good best-selling book written about how to keep your users hooked.  It describes how to create an addiction.  It is like -- if people are starving, you start feeding them with candies, sugar.  They will love it because they are starving, but then they will have every kind of illness possible.  You won't realise this in the first month, because you are starving.  But after a few months, you have problems with the sugar.

      We still have time to reverse things, but we should also take a different approach.  Of course, in Europe, where people have much higher education and are much wealthier, the problem is less harsh.  In other parts of the world it is much worse.  So I think we should have this kind of understanding, and regulators should also evaluate what the negative externalities of this could be.  Technology is wonderful and brings a lot of benefits, but there are also costs: mental health, addiction, disruption of democracy.  Who could have thought that WhatsApp could be used to propagate fake news and let some people of very questionable ideals win in a major democracy?  No one.  Now we know that it is like that and that this kind of technology can be manipulated.  It is good to start to take a different approach and to understand that there are also negative effects of technology.

     >> LUCIEN CASTEX:  Thank you, Luca Belli.  I give the floor to Sebastien Soriano.

      >> SEBASTIEN SORIANO:  Thank you.  There was one question about the partnership between France and Facebook on hate speech regulation.  So the question is not what it is about, but why ARCEP is participating, if I'm correct.

     ARCEP is participating for two reasons.  One, we have been proposed; and second, we have accepted. 

      (Laughter.)

      >> SEBASTIEN SORIANO:  Why have we accepted?  I will begin with this.  We have accepted because I think that on this very particular question of Facebook and hate speech, there is a good chance that we, the French state, and Facebook are more or less looking for the same objective.  I think that for Facebook, dealing with hate speech is a real issue, that it is complicated, and that it would create value for Facebook to be backed by the regulator in doing it.

      Not only by the regulator, but by the state; let's say by a government authority.  And the other way around, Facebook being the new village square, let's say, at least for many people, it is good for the French state to be aware in more detail of how Facebook is dealing with hate speech.

      So we have accepted because we think that the general interest and the Facebook strategy could converge on this specific example.  This means that I would not have accepted to engage ARCEP on a topic like, I don't know, privacy or tax issues, because I think that there the strategy of the company is, in its DNA, not necessarily compatible with the public interest.  And I would not have accepted to work with other big tech players, even on the same subject.

     So today on ARCEP side we have accepted because we think that there is a chance to really learn something and that we are possibly in a win-win situation with Facebook.  That is why we accepted.

      So why have we been proposed?  I don't know.  You should ask Emmanuel Macron's team.  You know that he is talking about a third way.  So, talking about the third way, I invite you to read a paper from Tony Blair's Institute for Global Change.  That is the joke about the third way, sorry.

      (Laughter.)

      >> SEBASTIEN SORIANO:  So it is a paper by Chris Yu, issued two weeks ago.  This paper proposes a new generation of regulator, and in particular, you will see many of the things I have been talking about, regulating with data, blah-blah-blah, but you will also see ideas about how regulators in the future may interact in a logic of accountability of the platforms, making the platforms responsible for what they are doing.

      So possibly the people who invited ARCEP to join this initiative consider that we have been working more or less in that direction.  I have been talking about a possible new way of regulating.  But please, ask them.

      The other question was about nudges.  I think that it is not easy to regulate the nudging done by the platforms, but possibly this will be necessary to some extent.  As you mentioned, it is already possible today through competition law and through privacy law.

      What I was talking about more is intervening as a public authority to nudge people.  For instance, when you buy a pack of cigarettes in France, I don't know if you have tried this, it is terrible.  You see pictures of people dying of cancer of the throat or mouth.  Terrible.  This is typically a nudge.  It is libertarian paternalism.  The concept is: you can smoke, but it is terrible for your health.

      In the first instance, the goal for a public authority that would like to play with nudges is to intervene in the choice process of consumers, to nudge them at some moment.  To give you an example in the frame of cultural diversity, something very difficult:  In France we have quotas.  French TV has to show you, I don't remember, 50 percent French content; I don't remember the exact figure.  But when you are on Netflix, this doesn't mean anything anymore.  So you can imagine, for instance, that every day or every week you receive a message about the cultural diversity of what you are watching, exactly like Apple and others are doing with a report of what you are doing with your smartphone and how you spend your screen time.

      You can imagine that a public authority would mandate, I don't know, the TV operators or the developer of the operating system of the television, for instance, to send you every week a kind of report about the diversity of what you are watching.  Typically this is a kind of nudge approach.

     >> LUCIEN CASTEX:  Thank you, Sebastien.  So I would like to give the floor to Paula Forteza.

      >> PAULA FORTEZA:  Thank you.  Yes, on some of these issues regarding net neutrality and how we are using this word for everything: I have, for instance, a problem using it for algorithms, because there is no neutral algorithm.  We are always ordering information, prioritizing something.  Maybe what we need instead is a loyal algorithm, one that does what it is meant to do.  It is a problem that we are using this concept for everything, stretching its meaning a bit and not being able to be precise.

      Regarding what Emmanuel Macron said yesterday on neutrality, I am not sure all of us citizens knew very well what he meant.  I'm sure he was being a genius, but maybe we don't understand him yet.  I'm sure he will make his thoughts more precise in future speeches.

     It was a bit tricky.  I would be glad to know what you think about what he said and what it would imply precisely.

      Regarding nudges, it is very interesting.  With the implementation of the GDPR, we started seeing this appear in a lot of different implementations, because what the big platforms did was not up to the level of ambition that we had for this text at the European level.  They continued using terms of service that were very technical and very long; people didn't have time to read them.  They built user paths that were very complicated, with big flashy buttons for "yes, I consent to share all my personal data."  So we need to start thinking not only of privacy by design, as we were talking about during the GDPR, but of the design of privacy:  How do we design interfaces that on the technical level already respect privacy?

      We are thinking about this a lot lately: designing, for instance, dashboards that are very clear, where users can understand which data is being shared, at which moment, and with whom; trying to build user paths that are more understandable; trying to build functionalities for the collection of consent that are really loyal to users.  And there was a pretty interesting initiative called Terms of Service; Didn't Read.  It was a platform where they tried to summarize the terms of service of big platforms, using symbols to convey some key messages:  How respectful of privacy is this platform, is it open source, is it open data, et cetera.

      With very small symbols, people could understand very quickly what it was about and make their choice knowingly, without having to read pages and pages of terms of service.

      Regarding this, Sebastien's idea of having bots is, I think, very interesting.  We should work on that together, because it would be nice to really be able to define a kind of profile of how much I would like to share my data or not, how much I value my privacy, or how much I would like to have more personalized functionalities based on my profile, et cetera, which is the argument that platforms always use:  We need your personal data because we are going to personalize your user experience.

      But maybe some people want to go that way.  So it would be nice to have this profile and have a bot or an intermediary managing all this for you across different platforms.  I think we should work on this idea together.

      >> LUCIEN CASTEX:  Thank you.  It is quite interesting indeed, the Terms of Service; Didn't Read initiative.  I will give the floor to Cherif Diallo.  Do you want to react to the questions from the room?

      >> CHERIF DIALLO:  Thank you.  Thank you very much.  What I would like to say in conclusion is that, from the point of view of states, and given the problem of stiff competition, governments have an interest in giving the regulatory authorities more powers in order to guarantee the principle of net neutrality.  Otherwise, what is likely to happen is that we could have a disputed Internet where the weakest may not survive.  As a result, national laws should be updated to take into account the interests of all stakeholders.  All stakeholders.

      Finally, at the international level, an effort must be made to put in place international regulation and cooperation, imposing the principle of net neutrality on operators providing international Internet traffic.  That is the last point I would like to add to this discussion.  I thank you very much.

     >> CHAIR:  Thank you.  I will give the floor to Theodore Christakis to react to the room.

      >> THEODORE CHRISTAKIS:  To address a final question that has not yet been addressed, about the definition of platforms:  The title of our panel focuses on the age of online platforms, and that's why I mainly spoke about this.  There is no consensus, of course, about the definition of platforms; there are a lot of discussions about this.  Generally speaking, we consider that platforms are online intermediaries which offer facilitation and transaction services.

      To respond to your question: of course, it depends on the regulation.  Take, for example, the draft regulation about platform-to-business relations, as we call it in the EU; that is about platforms.  If you take the GDPR, it concerns any kind of entity, not only companies but also state administrations, et cetera; anybody who is using the personal data of Europeans.

     So depending on the issue, there are different subjects and different objects of regulation.

      >> LUCIEN CASTEX:  Thank you.  I would like again to thank all the Panelists and the room for a great debate, and I hope to meet you next time to continue the conversation.

     Thank you.
