IGF 2016 - Day 3 - Room 4 - WS28: The 'Right to Be Forgotten' and Privatized Adjudication

 

The following are the outputs of the real-time captioning taken during the Eleventh Annual Meeting of the Internet Governance Forum (IGF) in Jalisco, Mexico, from 5 to 9 December 2016. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 

***

>> Hello.  Welcome to the session on the Right to be Forgotten and privatized adjudication.  I'm Daphne Keller from the Center for Internet and Society at Stanford Law School.  We are a public interest law center ‑‑ can you hear me?  I'm Daphne Keller from the Center for Internet and Society at Stanford Law School.  I'm the Director of Intermediary Liability there, so I work on a lot of issues, including the Right to be Forgotten.  We are presenting this session in conjunction with the Electronic Frontier Foundation and Jeremy Malcolm, who will join us soon.  For those of you not familiar with the European background here, the so-called Right to be Forgotten, or right to be delisted, arose from a decision of the Court of Justice of the European Union.  Can people hear me?  I'm getting funny looks.

The case concerned a Spanish man, Mario Costeja González.  A newspaper, La Vanguardia, had published information about his bankruptcy.  When they put it online, Google indexed it, and people searching for him found that information at the top of the results.  He asked Google to remove it based on data protection law, rather than civil privacy law or other kinds of privacy claims; at the time, whether that law applied to content on the internet was a novel and undecided question.  The question went up to the CJEU, and the Court decided that he had a case and that Google needed to comply with it.

It decided specifically that Google needed to remove results about him from search queries for his name.  So if you searched for his name, you would no longer see links to these pages, but if you searched for other information on these pages, you would still be able to find them.  It's this interesting remedy that is tailored to the harm they see from the de facto profile that search results create.

They also said that while in many cases Google should delete in response to a data protection removal request, it doesn't always have to, because in some cases there is a public interest in the information.  They left a lot of open questions about how to do the balancing with the public interest, but Google got to work on it, did the best it could, and eventually got guidance from the Article 29 Working Party and some other sources.  At this point, both Google and Microsoft's Bing report that about 50 percent of the requests they get seem to be valid and about 50 percent don't have a valid claim under EU data protection law.

The thing we want to focus on is only partly the European law, but much more how the fallout has played out in different countries around the world.  So we have KS Park here to talk about things going on in Asia, and we're going to hear about things going on in Latin America.  We're particularly excited to be doing the panel here in Mexico, because Latin America is a very important forum for this fight.  It is playing out in many different forms in different countries.  It's coming up not only as a question about delisting by search engines, but also as a question of removal of content from the publisher's site or from hosting sites.  So even if it started in Europe as this right, it has potentially broadened.  The thing I think is important to understand about the EU case, for countries that have data protection laws similar to the European Data Protection Directive, is that it didn't have to come out the way that it did.

In fact, the Advocate General, whose recommendations the CJEU more often than not follows, did not recommend this, and thought there shouldn't be a removal right against Google based on data protection law.  The Court went the other way.  That's the law we have now, but for countries that are working through the same question under their own data protection laws, the law is sufficiently flexible that they can arrive at any number of interpretations: interpretations that are stronger on the free expression side, or stronger on the privacy side.  There's just tremendous breadth in the doctrine to reach different conclusions, and huge policy questions to sort out.  And it's good to see that conversations about those questions are happening here, so that hopefully we can reach informed outcomes.

The Right to be Forgotten has two aspects relating to free expression that you'll hear about a lot.  We're talking about one and not the other.  What we are talking about is the privatized adjudication aspect.  They're holding up a sign for Peter.  They're looking for Peter, but he's not up here.  Good luck to you, Peter, wherever you are.

So one aspect of the conflict between free expression and privacy in the Right to be Forgotten is simply the question of what content the law should require intermediaries to take down.  That's a really interesting question, and we're not talking about it, or we'll talk about it a little bit, but the main point we want to get at here isn't that substantive question about balancing privacy and free expression.  It's the procedural fairness question: how intermediaries, these private platforms, should decide; whether there should be processes for accused speakers to be able to defend their speech; how appeals should work; how you can have a notice and takedown system predicated on data protection law.

Framed as a question about privatized adjudication, this looks like what a lot of us would call intermediary liability.  It looks like the questions that are spoken to in the Manila Principles and in other important Civil Society statements on notice and takedown and on privatized adjudication of speech.  For better or for worse, there are doctrinal reasons why in Europe it might not technically be considered an aspect of intermediary liability.  But I would make the appeal to this group, as a group of Human Rights advocates, that it doesn't matter whether the formal law applies: if intermediary liability law didn't exist, we would have to invent it, because that law is there to protect users' rights and to ensure that there is a fair process when somebody makes an accusation to a private actor and asks that private actor to remove or delist information from the internet.  So in the larger sense of intermediary liability as a mechanism, these questions are extremely relevant to the Right to be Forgotten.  The other thing I'll say before introducing the panelists is that we have a big communication problem relating to the Right to be Forgotten, because it arises at the intersection of intermediary liability and speech law on the one hand, and data protection law on the other.  How many people here work on data protection law?

How many people work on intermediary liability and speech law?  Okay.  I saw about four people who raised their hands for both, which is awesome.  You're the people we need, because these two areas of law don't speak the same language.  They use completely different terminology.  They bring in different assumptions.  One is a regulatory body of law, with regulators staffed by privacy professionals; the other doesn't come out of a regulatory background, and it's difficult for the two fields to convey the issues they're having or work towards a resolution.  To the people who know both: please help make that happen.  To the people who don't: please be very open minded towards the other body of law, because they are both very complex and both driven by important values that we all share.

All right, I will switch from that to introduce our panelists.  Each will speak for ten minutes and we will have discussion afterwards.  We've all encouraged ourselves to speak for less than ten minutes, because the discussion is more fun than listening to us.  I'll introduce the speakers and then the chaos can start.  Our first speaker is KS Park from Korea University Law School, a former member of Korea's communications standards commission.  He is one of the foremost thinkers on the Right to be Forgotten as it is arising in Asia, so we're very lucky to have him here.

Our second speaker will be Lina Ornelas.  She is the head of policy and public affairs for Google in Mexico, so she has been in some very interesting events here.  The third speaker is Christian Borggreen, the director of international policy at CCIA, who can represent industry perspectives on this.  Next, down at the end, will be Luis Moncau, the intermediary liability specialist working with me at CIS at Stanford; he was previously active in Brazil.  Then we will have Cedric Laurant, who is the director of a privacy organization here and can speak to developments in Mexico.  And finally, Jeremy Malcolm is here, but he doesn't have a chair.  Maybe he's been hanging out with Peter.  Jeremy Malcolm will speak last, and one of us will give him our chair when the time comes.  So go ahead, KS, take it away.

>> KS PARK: The Right to be Forgotten is not very welcome in Asia, for a reason.  There are many countries that were formerly colonized and, as a result, suffered dictatorships until recently.  And these former colonies and dictatorships have not resolved the injustices and oppression that remain in their societies.  In addressing those structures, they need to see the whole truth for resolution.  Not partial truth.  Not truth just about public figures, not truth only about high-level officials who collaborated with dictatorships or colonial administrations.  Proponents say the Right to be Forgotten doesn't apply to public figures, but sometimes you need more information to decide whether someone is a public figure or not.

If the information is delisted, you really cannot make that determination properly.  So people in many Asian countries believe that people have a right to know the wrongs not just of others, but even of themselves, so that they do not repeat them.  So that's one reason: historical experience creates a hunger for more information.  The second reason is that there are already many laws in Asian countries ‑‑ of course, I'm making these sweeping generalizations, right?  Asia is a really big continent.  In Asian countries, there are already laws that allow suppression of truthful information, which is the role that the Right to be Forgotten is fulfilling right now.  So they do not need the Right to be Forgotten in Asia to get their way.

So, for instance, in Korea we have a truth defamation law, so instead of making Right to be Forgotten requests to Google or other search engines, people can just go to prosecutors and ask for an indictment of the person who spread truthful information about them.  So that's my report.  A very sweeping, generalizing report about Asia.  Now I'm going to talk about the second component of this seminar, which is privatized adjudication.  A lot of people think that the Right to be Forgotten is privatized right now, but I don't think so.  Google is fulfilling 10 million requests a year, but Google doesn't want to do it.  They are doing it only because they are forced to.  If somebody submits a Right to be Forgotten request and Google does not comply, he or she can go to the data protection authority, and the data protection authority can force Google to delist.

So it's not really privatized.  This is really government censorship.  I have worked for the past ten years fighting government censorship in Korea.  We have censors for all possible genres of production: songs, music, internet content.  The danger of government censorship has already been well studied and learned in other countries, which is why not many such censorship bodies exist elsewhere.  The danger of censorship is that lawful information can be taken down: when an administrative body orders content to be taken down, that should only be provisional until it is judicially approved, and pro-government bias can distort those decisions.  Also, the subjects of such orders are unlikely to challenge the decision even if there is a judicial review process, because the government can always retaliate, even just for challenging it.

But the Right to be Forgotten empowers the authorities to order internet intermediaries to delist URLs pointing to information.  This is just good old-fashioned censorship to me.  Some people like to believe that data protection authorities are different from other organizations that have conducted censorship, but I do not see it that way.  I already see evidence in Peru and other Latin American countries that the DPA is really playing the role of censor.  And in Korea, the dangers are also being played out.  Internet censorship there is not about taking down unlawful content, but about taking down unethical content, and what is ethical is decided by nine Korean males in their 50s and 60s, and nobody can really get a consistent principle out of that.

And I have been worried that a data protection authority will do exactly what Korea's internet censorship body is doing: basically being partial in protecting the privacy of government officials, taking down lawful content under the pretext of ethics.  I've been worried, and then came the Google Spain decision.  That's what the decision is doing.  It's requiring truthful, non-defamatory, non-privacy-infringing information to be suppressed.  If you look at how the decision is structured, it doesn't say the information is bad or flawed in any sense.

If you look at the decision, it says the information is no longer relevant to the data subject, right?  Then it orders Google to keep it out of the internet.  Keep it out of the search algorithms of the internet.  Why?  Probably because the internet is considered a dangerous space for people's privacy.  So I think that we should look at it from this angle of administrative censorship.  And what shall we do, right?  What shall we do when administrative bodies make these requests to take down otherwise perfectly lawful information?  What should the private companies do?  I think we should ask them to do the same thing that we ask regarding government surveillance: resist to the maximum extent.  Stand up for the rights of people who posted information on the internet believing that it would be accessible to everybody around the world, as long as it is legal.

Also, stand up for the people who expected to see the content, as long as it's legal.  So that's one kind of ethical request that I'll make to the companies receiving Right to be Forgotten requests.  Now, there is a moral hazard here, right?  I've heard people say that the Right to be Forgotten decision is not really directed at protection of privacy but at the dominance of Google in the European market.  But if you think about it, it is really self-defeating, because after the Right to be Forgotten decision came out, there is really no other company that has the resources to comply with that many requests.  The compliance costs are so high that the European market will have to be forever dominated by Google.  There's no other search engine that can come in, process all those Right to be Forgotten requests, and compete against Google.

I'll stop there.

>> LINA ORNELAS: Thank you, Daphne.  Thank you all for being here today.  This is a very interesting issue, and as you know, Daphne explained the case very well: how this so-called Right to be Forgotten works in Europe.  What we understand as the Right to be Forgotten comes, as she said, from an interpretation by the European Court of Justice of European law, in this case the data protection law.

And this is important, because the ruling says that individuals have the right to demand directly from search engines the removal of search results when such information is inadequate, out of date, or excessive.  Those are the words that the European Court of Justice put in the ruling.

However, instead of it being judges or competent authorities who rule on each of these specific cases, this great responsibility has been given to the search engines themselves, to private companies, who must not only decide which information must be eliminated, but also consider public interest issues when making such a decision.

So, what happened?  What are the issues with this ruling?  First of all, broad terms like inadequate, irrelevant, out of date, or excessive.  It allows the removal of legal, not just illegal, information.  A private entity makes the decision, not a judge.  And it does not set enough safeguards for the public interest.  So what did Google do?  The very next day, Google created an application form for people to request delisting of personal data.  Of course, compliance is the first thing we do, and this was a ruling at the last level, the most important level of the European legal system.  So we put up this form, and then we called for an Advisory Council to Google to give us more guidance on how to handle this, because it was so broad.  And they gave us some light.

For instance: what is the role of the person in public life?  What is the nature of the information?  When does a person need higher protection of privacy, or of freedom of expression?  Whether the source is important or not.  Special protection for journalistic information or information on governmental websites.  The time frame as well; for instance, a minor crime committed some time ago.  So these Advisory Council guidelines were very useful.  However, there are still a lot of gray areas.

For instance, let's take these cases.  Public figures.  Cases at the edge, such as health information about a low-ranking public official who maybe is going to become more important later.

Information published by the media, like news.  When a person is mentioned as an essential part of a news piece, we generally do not remove it, especially in recent news.  For example, a writer who requested the removal of an interview in a recent news article.  How do we define news?  Blogs?  What happens with people mentioned incidentally in the news?  Also, we deny removal with regard to illegal performance of a profession, but what happens when there is no guilty ruling or the person has been exonerated?  A doctor accused of sexual abuse, or a felon who is a minor, or people who have already served their jail time for, for example, hitting their wife.  So, a private company: as you mentioned, under data protection law you have these concepts of controller and data processor.  But we do not collect the information in the first instance.  We don't give privacy notices.

So the Advocate General came up with this opinion saying that search engines are like mirrors: it is only the webmasters who collect information, and that's the way search engines work.  We collect information that is available.  If the webmaster puts up an exclusion, then Google is not able to take the information; if they don't, it's available.  So that's the way it works, and it was such a difficult thing at the beginning.

And now, I must tell you that the reaction, for instance, of the European press was very negative at the beginning, because the Court said we didn't have to give notice to the media when we delist information, whereas the Advisory Council said it's a good practice, because they are the first ones to collect the information.  For instance, the BBC started one morning having less information and didn't know why.  Right?

So just to give you an example, we have received 651,077 requests, and it's nearly 2 million URLs that we have evaluated.  We can say that in 56 percent of these requests, we have denied the delisting.  So I always say that this Right to be Forgotten is like an octopus, because it has a lot of unintended consequences.  I'm going to give you some cases.  First of all, let's think about whether we are confusing defamation with privacy issues.  There is a right to delist, but we need more criteria.  So we received guidelines from the European data protection authorities as well, and now we have another issue: global removals.  The European Court of Justice said removals apply in Europe, for instance .FR or .ES, depending on the country.  Now the French authority says that you also have to remove this information from the .com.

And it could be possible to block for those IPs in Europe, but if you applied this globally, that would mean something that is considered of public interest in Mexico is not going to be available.  For instance, we have a very important case in Mexico of a French citizen who was accused of kidnapping and was released.  Actually, we lost relations with France because of this case.  It was a due process case, so the Court released her, and she was a hero in France.  Maybe she can erase it in France, but in Mexico it's a public interest issue, because there were police involved.

For instance, in the United States you can know who has committed a crime against children, like sexual abuse; you can know where they live.  That's a right that country has decided to have.  So it's difficult.  I want to mention in this first intervention that the Special Rapporteur for Freedom of Expression of the OAS has said that in the Americas we have a human rights system whereby prior censorship is forbidden, and in this case we could be in that situation, so it is important to be clear that we have to take care of these aspects in Latin America.  I think our colleagues are going to tackle the recent decisions of the Supreme Courts in Latin America, but we see this differently because of the American Convention on Human Rights.  And, for instance, in Colombia, in Argentina, in Chile, Supreme Courts have ruled that search engines are not data controllers, so you go directly to the source.

For instance, last October ‑‑ two minutes, right? ‑‑ the Inter American Press Association issued a resolution calling the so-called right a risk for freedom of expression.  And they say there's a confusion regarding the scope of these so-called rights, and this causes dangerous things.  For instance, there are a lot of companies now working on delisting information, and there's one that comes from Spain, Eliminalia.  What they do is cause chilling effects.  They go to small blogs and media outlets and say: if you do not delete information about this corrupt public servant or businessman, you will get a fine from the data protection authority.  So, without any of us noticing, they are taking down information.  These are chilling effects that are terrible.

So, in conclusion, I think this right confuses the right to honor with personal data protection; it is difficult to get this balance right, and it gives us the role of censor.  I think webmasters are in a better position to defend their own content, and I think more information is better than less.  Otherwise, the most restrictive country will define the right to know of all the rest.  Thank you for the time.

>> CHRISTIAN BORGGREEN: Great, thank you.  My name is Christian Borggreen.  I'm with CCIA, the association that represents a whole host of internet companies, so I'm sort of the token industry guy here, but I'm very grateful for finally being invited.  I also appreciate the irony that at the IGF, where we're talking about freedom of speech and freedom of information, I get the tiniest room, which excludes most people who want to be part of this conversation.  So thanks, IGF, for this little irony.  Well appreciated.  To be honest, the Right to be Forgotten is very challenging, not only for Google but for a whole host of internet companies.  It goes to the very core of discussions about the internet: the social, historical, political, and technical features of Internet Governance.  This is a good place to discuss these issues, but it is extremely difficult for companies to adhere to, because it's just extremely tricky to nail down.

Even the term "Right to be Forgotten" is wrong, but we could have a whole session on whether it's a right to delisting, to erasure, et cetera.  I look forward to hearing from my colleagues from North America and Latin America and people in the room, because each country has its own dynamics, history, traditions, et cetera, which probably puts each country in the best place to decide what kind of legal framework it wants to have and what kind of information people in Peru or South Korea or other countries should be able to access in that country.

I think as a European I'm a little bit ashamed, because we do this quite often: we sort of tell other countries how they should manage the internet, and it's kind of awkward.  I can criticize the Europeans, or the French in this case, who in the Google case mentioned before actually apply a European ruling based on European law to the rest of the world, which is extremely problematic.

And of course, to my point, our companies obviously respect all the laws in the countries they operate in: laws related to freedom of expression, freedom of information, and of course data protection law, which is what we're talking about here, because that's often where the Right to be Forgotten is applied.

What we talk about here is sort of the repercussions of a European ruling and the Right to be Forgotten.  What we were talking about in Brussels the last few years is the new data protection framework; it's so complex that even the acronym is complex in itself.  But the GDPR data protection framework has actually mutated, in a way, this right to be forgotten.  Now in Europe we talk about a right to erasure.  Isn't that crazy?  You think about the movie with Will Smith, Men in Black, where you erase people's minds because they saw something they shouldn't have.  Just mind blowing.  To be fair, this is not an absolute right; it is a balancing, and they say in the text that there is a balance with the rights of freedom of expression and freedom of information.

And this whole discussion in Europe is very much targeted, I think, towards social media companies and not so much search companies.  I see time is running, so I'm going to jump ahead here.  There was a lot of talk about this being extremely important, mainly for copyright issues, but I think this is something that should be part of this discussion, especially when companies find themselves in an awkward position where they have to police and be the judges over what kind of information is out there and can be accessed.  We think that is a very awkward position; we think maybe there shouldn't be that privatization of judgment.

To wrap up: the Right to be Forgotten is extremely challenging for all companies.  We need a careful balance of rights, including the right to information and the right of freedom of expression versus data protection, in this case.  We think that each country is probably in the best place to decide what kind of data individuals should be able to access from that country.  And we think it would set a bad precedent if one country could just overrule other countries' laws and impose what kind of information can be accessed.  So thank you for that.

>> DAPHNE KELLER: So Cedric is next.  I made him move to have a microphone, but it turns out we have a mobile one, so I won't make everyone move.  Jeremy may need to move to be closer to his slide show.  Go ahead.

>> CEDRIC LAURANT: So, I'll talk to you about two Right to be Forgotten cases in Mexico, though I don't like the term, because it's a term that may be used by some to flag the risks of implementing these kinds of rights, the risks to freedom of speech.  I'd rather talk about the right to deindex information on a search engine.  In Mexico, there is a data protection law that applies to private companies and individuals, which enables people to get access to their personal data by requesting it from the companies, to cancel processing, or to oppose it for legitimate reasons.  And I think some aspects of the so-called Right to be Forgotten can be interpreted as the exercise of the right to cancel or oppose processing of personal data.  There are two main cases in Mexico.  One that most people know about, in which Google was about to be fined by the data protection authority at the federal level.

The other case came one year earlier.  The second case was badly argued by the data protection authority, and Google was right to fight it in court, thanks to R3D, a non-profit advocacy organization.  But the first case was the case of someone who had a valid claim to request the deindexation of his information: someone whose address and name had been listed and stored on a website called ABCTelefonos.com.  He asked the website to delete the information, which the website did pretty quickly.  Then he asked Google Mexico and didn't get any answer.  So he went to INAI and started a proceeding, claiming that Google didn't do anything.

In this case, INAI wrongly argued the case.  It examined a lot of arguments by Google Mexico referring to the fact that Google Mexico did not have jurisdiction ‑‑ rather, that INAI did not have jurisdiction over Google Mexico ‑‑ for tons of reasons that were unrelated to the case at stake.  In this case, INAI said Google Mexico can't be considered a data controller.  This was before the Mario Costeja case.  In the second case, the president of a company whose name had been mentioned in three publications online asked to have his information removed from Google's index, Google's database.

And he sent his request to have his data removed, deindexed from Google's database, to Google Mexico first; then, without obtaining any answer from Google Mexico, he went to INAI.  Although INAI committed several mistakes in its arguments, it referred, interestingly, to the fact that Google Mexico has corporate statutes in which its corporate purpose includes managing the search engine.  Google Mexico said: well, in reality, we do not index information; that is done from California.  INAI said: well, it's on paper.

When we go to your website, Google.com changes into Google.com.mx, and when we click on terms and conditions, we see a reference to an address in Mexico City.  Those two arguments made INAI decide that it had jurisdiction over Google Mexico.  It actually made a mistake, because in assessing whether a company is a data controller, one has to check where the processing really happens ‑‑ in reality, in California; Lina can correct me if needed.  However, here there is a link between Google and Google Mexico, because Google Mexico uses the information that has been indexed in order to sell advertising to people based in Mexico.  And it's precisely that link that the Costeja case decided by the European Court of Justice analyzed, and that INAI skipped.  The other thing that INAI skipped is checking the balance between the right of that company director to have his information protected ‑‑ because he said it was ruining his reputation and his ability to carry out business ‑‑ and the interest of the public in knowing about that information.  It did not take that public interest into account, as the Court did.

Actually, R3D, a Mexican non-profit organization, went to court precisely to fight that, and the court ruled in Google's favor.  So we are left here in Mexico with two cases that were wrongly argued and reached the wrong conclusions.  In the first case, it should have ruled for the data owner.  The second case didn't take into account several elements, including a free speech and privacy balancing analysis.  So there are still a lot of options for data owners to go to INAI and claim their rights.  Here, there's a problem with INAI, because it did not analyze the cases well enough.  So I think it would be worth waiting to see what happens with the next case, because there will be next ones in the future.  Thank you.

>> LUIZ MONCAU: Good afternoon, everyone.  I'm Luis Moncau.  I'm actually at Stanford now, but originally from Brazil, so I will talk a little bit about some problems I see with the Right to be Forgotten and some concerns raised by how the discussion is developing in Brazil.  I would like to make two types of considerations: one on the material side of what we are discussing, and the other on the procedural side.  On the material side, what I see is a big confusion about what the Right to be Forgotten is, and I see the same confusion in other places as well.  We have to be very careful when we see data from Brazil saying, for example, that there are lots of cases on the Right to be Forgotten, because sometimes these cases are not related either to data protection or to search engines at all, and Brazil does not have a data protection law.  So the first thing we need to do is separate these: there is one procedure for balancing freedom of expression against privacy rights, and there should maybe be different techniques for balancing data protection and freedom of expression as well.

So this is the first remark I would like to make.  The cases in Brazil, at least the two most famous ones, which are in the Supreme Court, are not targeted at search engines.  They are targeted at broadcasting companies, and they are related to the right to be forgotten as it was affirmed in our jurisprudence.  One is related to the right of a person not to be linked to crimes that the person was not convicted of, and it started with a broadcasting company.  When it comes to search engines, there is one case, and there is another recent one right now, but these are balancing privacy and freedom of expression, and also how able Google would be to deindex.  The first case says that it's really hard to require Google to deindex content without offering the specific URL, which would mean that you cannot require the company to remove content based on a search query.  So this is the first of the cases.  And the Superior Court of Justice, which is not the Supreme Court, said that for these removals you should go after the person who actually posted the content.

On the procedural side, from the intermediary perspective, there is already a lot of know-how on why we should not be moving toward a notice and takedown regime.  If you want some examples from Latin America, you can check what is happening in Ecuador with copyright.  This is a very important issue: copyright and trademark being used as tools to remove content of public interest, including official documents.

But if you want examples from other parts of the world, there is the recent study by Jennifer Urban, Joe Karaganis and Brianna Schofield, "Notice and Takedown in Everyday Practice", which also makes some very interesting points about removal requests by the content industry.  This study shows a lot of the problems of over-removal that can happen when you have a notice and takedown regime that puts the decision, or the incentives, on the intermediary to remove content.  And when you talk about procedures, or privatized adjudication, there are also many concerns about due process, right?

So, besides private entities deciding what should remain online or not ‑‑ and I understand KS's point that it is actually the administrative authorities requesting the removals ‑‑ there is also a problem when companies remove content, or comply with the notices they receive, and there is no oversight at all: no party interested in keeping the freedom of expression or the content online.  So if we are to decide that there is a Right to be Forgotten based on data protection rules, we should surely be thinking about how to involve all the persons interested in the content.  The public interest is really hard to involve, so in Brazil this would mean maybe supervision, at least from the public prosecution office, which is the one responsible for taking care of public and collective interests.  And we should really try to involve the ones who posted the content.

So the decision Brazil took from that is that freedom of expression is such an important topic that a court should be involved when you want to remove some content.  The point is, if we approve a data protection law, there is uncertainty about how this will interact with the Marco Civil: whether the Marco Civil will remain in place, or whether we will have a different rule emerging from the data protection law and, eventually, a right to be forgotten.

The next thing I would like to mention is that part of the debate about the Right to be Forgotten, and this bothers me a lot, moves towards the question of whether information is true or not.  There are lots of debates in the freedom of expression field, especially related to journalists and their ability to seek and spread information, because you cannot always assert clearly that something is true.  So this is already problematic when you have a court deciding, and it would be even more problematic if you transferred this responsibility to administrative agencies, or even to the private company responsible for the search.

One more case I would like to mention.  There is a case in Brazil about a platform that tells everything about everyone, which was not hosted in Brazil and which processes information that is freely available in public databases.  The preliminary decision of the Brazilian courts was to issue an injunction ordering the ISPs to block the website entirely.  So, as the Brazilian courts could not hold the website accountable, they went down this path to protect privacy, invoking data protection under the few provisions that we have in the Marco Civil.

The last thing, if I have one more minute, is that in trying to craft balanced rules between freedom of expression and data protection, we should think differently about cases where the consumer has a specific relationship with the platform.  The right to remove content from that platform should not require a court order: you could just reach out to the company and say, this data that I gave to you, if there is a data protection law that allows me to delete it, you should delete it without delay.  That is different from companies that are tagging or indexing content that was not uploaded by the user itself, especially search engines.  So those were the comments I would like to make.

>> JEREMY MALCOLM: Can we have the presentation slides on the main screen, please?  Thanks.  So, I've been asked to give a few remarks on the intersection between the Right to be Forgotten and intermediary liability regimes.  We heard about the classification of data controllers,

who are responsible for complying with the Right to be Forgotten regime.  I'm mainly going to be talking about the European Right to be Forgotten regime in this presentation.  There's a question about to what extent internet intermediaries other than search engines may be classified as data controllers in this regime.  The Google Spain case established that it does at least apply to search engines.  Even that was a little unusual, because a search engine can barely be said to control the data that it indexes.  The data that it indexes exists elsewhere on the web, and all the search engine is really doing is directing the user to it by means of an index.

But even so, the Court decided that yes, search engines are data controllers with respect to the information they index, so we take that as read.  But then how many other types of intermediaries are included?  Facebook?  Twitter?  It would be very interesting to know.  The latest incarnation of the regime, under the GDPR, the General Data Protection Regulation, seems to extend to some other intermediaries, but it's a little unclear which ones.  All we can surmise is that it's most likely to apply to types of intermediaries that engage in some sort of processing of personal data; a bare web host would probably fall outside of that category.  But it has injected a lot of uncertainty into the position of European intermediaries, and indeed of foreign intermediaries that serve European users.  What's the position they face now?  We still don't really know.

So there's even a question about whether the liability of platforms under the Right to be Forgotten is classified as intermediary liability.  The regulation does state that it's without prejudice to the intermediary immunity rules in the e-commerce directive.  The e-commerce directive, as many of you know, is the European law that acts as a sort of notice and takedown regime: if there's an alleged copyright infringement or some other illegality, the platform is protected from liability until it receives a notice requesting that the content be taken down.

So, given that the Right to be Forgotten is without prejudice to that regime, that sort of implies that the Right to be Forgotten is a species of intermediary liability law.  But it's actually kind of not, because if a platform is liable for failing to remove content under the Right to be Forgotten, it's not liable for publishing that content as such, in the way that an intermediary in some countries may be liable for defamation.  It's actually a completely different species of liability altogether.

So if that's the case, what are we to take as the meaning of this statement that the Right to be Forgotten is without prejudice to the intermediary liability regime under the e-commerce directive?  If it means that an equivalent process of notice and takedown applies, as it applies under the e-commerce directive, then that would be useful to know, because then we would know that intermediaries were protected from liability until they received a request to take down personal information under the Right to be Forgotten.  But that's an odd sort of interpretation, given that the intermediary isn't actually a publisher of that information as such.

They're a data controller, which is a different thing.  So if that doesn't apply, then we have a more complex situation, and the GDPR can be interpreted as setting out a series of obligations that are slightly more complex.  It's actually a confusing situation.  Once again, we have uncertainty for platforms in Europe.

Okay.  What about penalties?  The penalties are enormous: if you fail to take down information that you are required to forget, you can receive a penalty of up to 4 percent of your worldwide annual turnover.  For that much money, as I said, Google could employ about 20,000 employees and buy a fleet of 1,000 Rolls Royce cars and a 747.  So it's quite a lot of money.  You could say that it's actually disproportionate.  Now, there is a set of principles that we developed ‑‑ and when I say we, this is a group of Civil Society organizations around the world that came together to develop principles on intermediary liability last year.  It's a set of best practice guidelines for both regulators and for the intermediaries themselves.

We released it last year, and it has a bearing on the Right to be Forgotten.  It's not specifically directed towards the Right to be Forgotten, but there are some provisions in the Manila Principles that do bear on it, and we also treated it in some depth in the background paper that accompanies the Manila Principles.  If you go to manilaprinciples.org, there are background papers there, and you can read about how the Right to be Forgotten interacts with these principles.  I'm going to pull out a few of the relevant principles that bear on the Right to be Forgotten.

Firstly, intermediaries must not be held liable for failing to restrict lawful content.  As Professor Park mentioned in his presentation, generally that's what we're talking about under the Right to be Forgotten: content that is not, per se, unlawful, and yet the platform is being held liable for failing to restrict its availability.  So there, immediately, is a conflict with the Manila Principles.  Next, content must not be required to be restricted without an order by a judicial authority.  That's also not the case here.

Under the Right to be Forgotten, the platform has to make its own determination, and there's no judicial authority involved.  By the way, these are not in order; I just cherry-picked these out of the principles.  Also, any liability imposed on an intermediary must be proportionate.  We've just seen the penalties slide.  It's not particularly proportionate.  It's an enormous penalty.  Of course, that's the maximum penalty, and it may not be that high in any given case, but even so, the penalty is extreme.  And another principle: intermediaries must not be required to analyze the legality of third party content, because that's not their job.  They're not qualified.

This sort of goes along with the judicial authority principle, because it's really saying that only a judicial authority has the power to make a determination about legality.  Also, unfortunately, the GDPR doesn't really enact any penalty on those who make false Right to be Forgotten requests.  Another principle: before any content is restricted, the intermediary and the user content provider must be given a right to be heard.  The wording of this is a little obscure ‑‑ how do we apply it to the Right to be Forgotten?  One person who is never given a right to be heard is the person who put online the content that the intermediary is delisting.  So in a Google search, the person who put up the web page that Google is delisting doesn't have a right to be heard before Google takes it down, which is a real problem.

And finally, I think this is my last slide: where content has been restricted, the intermediary should display a notice.  You know how, when you do a Google search, if a DMCA request has been made to remove results, Google tells you that results have been removed and you can click through for more details?  You don't get that with the Right to be Forgotten.  That, again, is against the Manila Principles.  They're not a law, of course, but they have been pretty influential.  They've been cited in high-level documents, so this is not just a single NGO speaking, but quite a significant group.

So, in summary, the Right to be Forgotten doesn't comply with the criteria of the Manila Principles, such as requiring a court order, limiting what intermediaries have to remove to unlawful content, providing a penalty for abusive requests, giving notice to the person who put the content online, and being transparent to the user.  Thank you.

>> DAPHNE KELLER: Thank you.  Those were all very valuable interventions.  Can people hear me this time?  No.  Thank you to the speakers.  Those were all great.  I'm glad that we closed on a discussion of the GDPR, because if the Costeja or Google Spain ruling is the present of the Right to be Forgotten, the GDPR is its future in the European Union.  These very different and more dangerous rules about how notice and takedown is supposed to work will come into effect in May of 2018 in the EU.  For those of you operating in other countries under data protection laws, you have questions now about whether the Google Spain interpretation is something you want to embrace under your laws, and you will have a question in the future about whether your country wants to move to the GDPR model.  For the moment, there are these rules about data transfer adequacy: it's easier to do business with Europe if they deem your data protection laws to be adequate.  For the moment, if you have a law that looks like the Data Protection Directive, you have a good chance of being deemed adequate, but perhaps in the future the GDPR will be the benchmark against which countries are deemed adequate by the European Union.

So this will become pressing.  For those of you looking at this issue, I just want to flag that CELE is soon publishing a book, which will also be available online, with a series of essays on current issues in Latin American law.  I have an article on precisely this issue that lists out doctrinal arguments, rooted in the American Convention, for why different outcomes might be legally correct in other countries, and why following exactly in the path of the EU is not a foregone conclusion for places with different legal frameworks and different prioritizations in their human rights instruments.  I have promised Lina an additional few minutes of response, and then I think we'll switch straight to questions.  If you don't have them, I have them.

>> LINA ORNELAS: I just wanted to add something important: the European Data Protection Directive is pretty general.  It's from '95; Google was born in '98, and all these concepts about data controllers were created back then.  The second thing I wanted to highlight is that Google is indeed a data controller, but it's when we process our users' data.  For instance, if you open a Gmail account and you're a signed-in user, we are a data controller and we give you access rights.  We even created something as an industry, the right to portability, which is actually now included in the GDPR in Europe.  That means that all your pictures in Google, you have the right to take them to another service, another platform.  So we do that.

But when we act as an intermediary, like search, we do not collect data directly.  We do not give a privacy notice.  It's amazing how the European Court of Justice, and even the data protection authorities, were not angry with the webmasters, who didn't get consent for the transfer of information to the search engines; they didn't apply the directive in those cases.  And the other thing, quickly, I wanted to add: I loved it when EFF said, okay, there's the Right to be Forgotten, but what about the rights that are forgotten?  There are other rights involved.  As I mentioned, in Latin America we believe in access to information, because we have had a lot of corruption and authoritarianism, so there is the right to the truth.  This is very important because of people who commit fraud, and public servants.  The case that Cedric mentioned is actually about a businessman who was rescued by the government with public money.

So it's terrible, because all these cases are related.  Finally, I want to make clear that we do delist information, but we do this when we are faced with illegal content, for instance child sexual abuse imagery or copyright infringement.  But when we are in a position where people ask us to delist everything under their name, that's terrible, because we have to weigh rights, and it's interminable.  I just wanted to mention that in the Mexican case that Cedric mentioned, the webmasters won in court: it was decided that they have a right to defend themselves before INAI, so all the URLs involved are going to be invited to give their arguments, because they no longer receive traffic and this news disappeared from one day to the next.

So this is a very important thing to think about.  And that's it.  I'm done.

>> DAPHNE KELLER: Thank you.  I saw a hand up back there for a question.  Do we have a microphone that travels?  Is this the microphone that travels?  All right.

>> Thank you very much.  Lorena Juama here speaking.  First of all, I think it's a pity that we have so many panels at this forum, and we have overcrowded panels, and I see a lot of faces wanting to say something, and we've all been sitting here 75 minutes.  I think it's a pity to have experts like you, whom all of us would like to discuss with, and not be able to have a discussion because we only have 15 minutes to say something.  I'm saying this because it's being documented, and I hope that the MAG reads these comments and figures out how to schedule things so that there is a real discussion.  That's the first point.

In the second place, I wonder why this argument is not being mentioned: in the GDPR, there is no right to be forgotten.  There is a right to deletion, and there is a right to restriction of data processing.  And actually, the thing that applies here is the restriction of data processing, because things are not being deleted; they are being blacklisted, which means blocking.  And blocking is exactly what that article covers in the first place.  So, when we talk about the GDPR, I agree very much with what Jeremy says.  The data protection regulation, and data protection overall, doesn't understand intermediaries.  It only understands data controllers, and data processors, meaning third parties that are processing data on the instructions of a company.

So, for instance, a doctor who has patients and needs to do the bookkeeping; but he's a doctor, so he's not good at bookkeeping, so he gives the data of his patients to a bookkeeper who does the accounting in his name.  That would be data processing on behalf of a controller.  So I wonder why ‑‑ I mean, I read the articles from Daphne Keller, and I think they are fantastic, because they add a new view and insight into this.  But I think it somewhat obscures the main point that was actually made by Jeremy: that this is not about intermediary liability, because Google is not understood as an intermediary from the view of data protection.

Actually, we have a doctrine in Europe right now where it's not relevant where the data is being processed; it is relevant whether the company is offering a service within the European Union or not.  So the reading was not wrong, as was said before; it was not a wrong reading.  Google is offering services in Europe, and it's processing the data in America.  And the Court decided in this case that it was a controller, because they are offering something in Europe.

On the other side, there is a contradictory case, and this is something that I really found curious and don't understand, which was the case concerning Facebook Ireland, where suddenly the ECJ decided that Facebook Ireland was not a controller, but was doing a data transmission, because Facebook in Ireland was offering a service but Facebook in the U.S. was processing the data.  So I wonder what your take is on this, why we have this contradiction in two cases that are very close to each other.  I think this is crucial, because if the Costeja case had been decided like the Facebook case, meaning that Google Spain is not a controller, then we would have a completely different situation right now.  Thank you.

>> DAPHNE KELLER: Jeremy, do you want to speak to that one?

>> JEREMY MALCOLM: You could probably answer it better than I could.

>> DAPHNE KELLER: So I'm going to take a shot at this but others please speak up, and our next question is right there.  So the ‑‑ sorry.  This is Google Ireland.  That's why I'm getting myself confused.  I don't think I can speak, actually, to the distinction on Google Ireland.  Do you want to talk about that?  Facebook versus Ireland.

>> LUIZ MONCAU: I was going to talk about other aspects of the question.  I think the problem with the Google Spain decision is that Google was not considered an intermediary.  Intermediary liability law is built on the premise that intermediaries are not really consciously accepting and processing data; they're just there, a transparent conduit for the information being exchanged.  But for some reason, the Court decided that indexing the location of data is enough to call it data controlling.  So even that part of the decision, I think, will be heavily debated and contested in the future, because there are many other forms of intermediaries that do not involve conscious processing of information.  I mean, just look at where data protection law came from.  It came from our concern about hospitals and schools that really use our private data for certain purposes.  Right?

The substance of the information was really useful to those schools and libraries and hospitals.  The substance of the information is not at all useful or meaningful to Google.  So that decision, I think, is wrong, but we have to live with it.  We'll see where it goes, because, as other people said, the GDPR doesn't apply just to search engines; it could apply to many other intermediaries.

>> DAPHNE KELLER: So now that I've sorted out which case you were talking about, I will briefly speak to the Microsoft Ireland case.  The Schrems case is about data transfer; it's a different set of questions.  Actually, you know what?  I take it back ‑‑ we had about four people say that they know data protection law, so rather than continue down a rabbit hole that about four of us understand, I'm going to go to the next question.

>> Yes, I'm sorry, my English is not very good.  I'm going to try to say in English all the things that I have in my head in Spanish.  I understand the worry for companies like Google, because it's their business.  Their business is not to collect information; it's to tell people where the information is.  That's the point.  And that's the important point here, because search engines like Google and Bing, where people go looking for information, give access to information with someone just putting in a name ‑‑ people could not find the information as easily in the papers without Google.  Google is a thing that can help you access information, whatever is on the web.  And that's the importance of the Right to be Forgotten.  The Right to be Forgotten gives a person a second opportunity to correct the things of the past.  The only point that is bad in the Court's judgment, I think, is that Google has the power to say what is relevant and what is not relevant in the information.

That needs to be the government, because someone needs to evaluate things that are set out in the law ‑‑ for example, whether information is public or not public, important or not important.  That should be decided by the government, through a specialized agency, not by Google, because Google is a private company.  That's what I think is bad in the judgment.

>> LUIZ MONCAU: Why should the government have the power to decide whether information is relevant?

>> I think it would be better to go to an agency and say to the agency: this information is bad, it is affecting me, I need to erase this information.  And the agency decides: yes, Google, this information affects the person's rights; you need to delist the information.  That would be better, but now it's not ‑‑

>> DAPHNE KELLER: For the record, that was not Luiz Moncau speaking.  I think there's a really interesting issue here in which the word relevance is being used to mean two different things.  A search engine uses relevance to ask: is this what the user is looking for?  Asking the search engine to decide, as a legal matter, whether a result is relevant or not in order to take it down is not coherent with that.  The Google Spain case brings the word relevant to bear as a legal matter to mean something very different and something that's sort of hard to understand: how do you know if something continues to be relevant to someone's life?  That's a very philosophical question.  I think we have five more minutes, and in the spirit of dialogue, let's try to get one more person from the audience.  Paula?

>> PAULA: Hi.  I'm Paula from Argentina.  You mentioned due process ‑‑ you were the only one, I think, who mentioned due process ‑‑ but you didn't mention transparency, and I would like to hear how you see that, because to me the two are strongly linked.  In order to have public control, with the public interest in mind, you need not only due process but also an appropriate level of transparency, so people will know that certain content was removed.  So, I don't know ‑‑ to me, transparency is a requirement for due process, and I don't know if you see it the same way or ‑‑

>> DAPHNE KELLER: Paula, you're the one person at this whole conference who stood up and did not pitch her project.  Will you describe your project on transparency?

>> Oh, yes.  We are just starting.  After the Internet & Jurisdiction conference in Paris, one of the takeaways from there was that we needed to put more transparency on transparency itself, meaning that usually companies say: I cannot be more transparent because I have obstacles ‑‑ legal obstacles, supposedly ‑‑ and no one knows what those actually are.  So we are trying, with (Indiscernible) in Chile and hopefully some other partners in the region, to define transparency and to map the obstacles ‑‑ and to see whether they are actually obstacles from which freedom of expression concerns arise.  Maybe they are not; they are perceived as obstacles, but maybe they are not.  That's my project.

>> I think I forgot to mention transparency.  It's not that I don't think transparency is important.  I think this can be built into law somehow.  For example, Brazil is in the process of approving a data protection law.  Whether or not we treat Google or search engines as intermediaries, we can benefit from the intermediary liability rationale to build a system that protects freedom of expression, and transparency should be built into it.  So when we talk about, for example, the Marco Civil ‑‑ when we were doing the regulation of the Marco Civil ‑‑ one of the things civil society was pressing for was transparency about requests for personal data from law enforcement agencies. 

So what we have in other countries is that, usually, companies can share information about these requests with a database, publish it on their own platform, or share it with the person whose content was removed.  Companies can do this or not, depending on the law.  What we built into the Marco Civil is that the authorities are required to produce annual reports.  So if you have an administrative agency pushing for removal of content, they should be accountable for that.  They should be publishing reports saying: how many requests did I make, what was the basis for them, did this lead to a good solution or not?  So we should think about this not only on the companies' side.

>> And as Paula was saying, the legal framework is critical.  If a company doesn't believe it is allowed to disclose the letters it receives, it won't do it.  So having a legal framework that allows for transparency is critical.  This has been an amazing panel, dominated by everyone but the white men on it, but I'm going to give Christian the final say.

>> CHRISTIAN BORGGREEN: It had better be good, then.  I just want to say that transparency is absolutely crucial.  I don't know if you got an answer before when you asked why it is private companies that should be deciding.  I think it's totally awkward that companies should be the ones deciding, but the pressure is being put on them, and this is what's happening.  Maybe it should fall to someone else who is a little more capable.  But if you are an intermediary ‑‑ maybe not a big company like Google, but a small start‑up ‑‑ and there's so much pressure that you might lose a percentage of your turnover if you don't uphold the law, then what's going to happen?  Every time you're in doubt, if something is even in a gray zone, you take it down, because you don't want to be infringing any laws.  And that means you have access to much less data.

>> DAPHNE KELLER: Thank you so much to the panelists.  Thanks to the audience.  Enjoy the rest of the conference.

(applause)

(Session was concluded at 4:30 p.m. CST)