IGF 2019 – Day 1 – Convention Hall I-D – OF19 Human Rights And Digital Platforms - RAW

The following are the outputs of the real-time captioning taken during the Fourteenth Annual Meeting of the Internet Governance Forum (IGF) in Berlin, Germany, from 25 to 29 November 2019. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 

***

 


>> MODERATOR: Good morning, ladies and gentlemen.  And welcome to this provocatively titled open forum on whether platforms and human rights are a contradiction in terms.

My name is Joe McNamee.  I have had the honor and privilege to work on working groups on the roles and responsibilities of internet intermediaries and on a draft recommendation on automated decision making.  The past few years have been very exciting from a data protection perspective, with the entry into force of Convention 108+ as well as the first years of operation of Europe's General Data Protection Regulation.

In this panel, we will look with esteemed colleagues at the expansion of data protection rules internationally and the roles and responsibilities of businesses in relation to privacy and data protection.

In particular, we will look at our direction of travel.  How close are we to a global framework that works in practice as well as on paper?  A global framework that ensures respect for the fundamental rights to privacy and data protection.  What is our destination and how close are we to it?

We need to be very conscious of the fact that without privacy we cannot have security.  Without privacy we cannot speak freely, anonymously or without fear of retribution.  Without the freedom to speak and move freely we cannot associate freely.  If we cannot speak, associate or move freely we cannot hold power to account.

If we cannot hold power to account our human rights are at the whim of authority.  Lack of privacy is therefore the antithesis of human rights and democracy.

We will start with a round of introductory comments and hopefully be able to open up to questions and answers after that.  Our speakers are Jan Kleijssen, who is Director of the Information Society and Action against Crime Directorate of the Council of Europe; Alexandria Walden, who is Global Human Rights and Free Expression Policy Counsel at Google; Fanny Hidvegi, who is European Policy Manager at Access Now; Rami Efrati, who is a senior cyber fellow at Tel Aviv University and former head of the civilian division of the Israel National Cyber Bureau in the Prime Minister's Office; and Florence Raynal, who is Deputy Director and Head of the Department of European and International Affairs at the CNIL.

I will keep as quiet as I can about a subject that I'm passionate about, and hand over the floor initially to Jan Kleijssen from the Council of Europe.

>> JAN KLEIJSSEN: Thank you very much.  Good morning, everyone.  In recent weeks, we have seen two very powerful statements, amongst others, on the issue that is the title of this session, human rights and digital platforms.  One came from a person that we would normally associate with these issues, namely the United Nations Special Rapporteur David Kaye, who came out with a report very strongly pointing out that respect for human rights on digital platforms leaves much to be desired.

The second statement came from someone we would not normally associate with these issues, a British actor by the name of Sacha Baron Cohen, who made a remarkable speech just a few days ago which has been widely circulated, and I think deservedly so, on the internet and social media.  And for those that haven't read David Kaye's report or haven't seen Sacha Baron Cohen's speech, I would encourage you to do so.

There was one phrase in the statement by Sacha Baron Cohen which struck me particularly, I must say.

I would describe it as debunking the myth that freedom of expression covers everything on social media.

And I think he rightly pointed out that those that deny or justify the Holocaust on social media are not offering an academic point of view.  They are preparing the next one.

On this note: I'm from the Council of Europe, which was founded to prevent the horrors of World War II from repeating themselves.  We celebrate 70 years this year.  And over the course of those 70 years we have adopted, for our 47 member states, now nearly 850 million Europeans, some 200 treaties, including the European Convention on Human Rights, which a number of you may be well aware of.  It is a binding and enforceable treaty, and it covers human rights whether they are violated online or in the real world.  It applies to both.  And referring to the booklet that we all received, the agenda for the 2020s: in one of the opening statements, Vint Cerf, with whom I usually agree but would like to disagree slightly here, points out that in the field of privacy and internet governance we need enforceable treaties, but that they are not yet there.

There I would slightly disagree.  The treaties do exist.  On cybercrime, there is the Budapest Convention, which originated in Europe but has gone far beyond Europe's borders, and last year we supported capacity building in the fight against cybercrime in more than 130 countries.

On data protection, there is Convention 108, which you mentioned.  Convention 108 is a binding international treaty, enforceable by the parties to the treaty through a committee of the parties called the T-PD, and nearly 70 countries cooperate.  There are nearly 60 parties, but more countries than parties already cooperate, preparing themselves for accession.  That is about half of the countries in the world that have data protection legislation at all.

So there is a binding international treaty.  And therefore I would take this opportunity to encourage all of you who have not yet done so, from countries that have not adhered to this treaty, which was modernized with input from civil society, academia and other international organizations, to consider cooperating with us or acceding to it.  I would like to close, in order not to abuse my speaking time, with a reflection.

Law enforcement and data protection.  When we think about those two issues, we usually consider the restraints and the conditions under which law enforcement may use data.  And this is, for instance, an issue that is being discussed at the moment in Strasbourg: how to have quicker access to evidence held in the cloud, because nowadays most cybercrimes go unpunished.  There is virtual impunity.  Not even one case in a million actually leads to a conviction.

Another issue regarding criminal law and data protection.  Our societies, and I speak from the perhaps privileged position here at the Council of Europe as someone who is responsible for freedom of expression and the fight against crime in our member states, our societies criminalize behavior that seriously harms individuals or society.

Is it therefore perhaps not time to start considering whether those that deliberately breach data protection regulations, that deliberately sell data or violate privacy provisions, should not also be held criminally responsible?  I leave you with this question and thank you very much for your attention.

>> MODERATOR: A very interesting final question.  I hope we come back to it in the discussion later.  Next we have Alexandria Walden from Google.

>> ALEXANDRIA WALDEN: Thank you.  Thank you for including us in the conversation today.

My expertise is in human rights.  I come from a background of civil and human rights and social justice issues, and I bring that work and that experience to what I do at the company.

And so while there are thousands of people at the company who look at these issues every day, I look at how we approach them across the business.

I wanted to back up a little bit and talk about how Google approaches human rights.  From our perspective, we believe in technology's power and potential to have a profound and positive impact on the world.  We are guided by internationally recognized human rights standards and committed to respecting the rights enumerated in the Universal Declaration of Human Rights and its implementing treaties.  An important part of that is the UN Guiding Principles on Business and Human Rights and the Global Network Initiative principles.

That informs the way in which we operationalize these commitments across our business.  In addition to actively harnessing the power of technology to respect and advance human rights and create opportunities for people across the globe, we are committed to responsible decision making around emerging technologies.

This approach includes important pieces in terms of the way we integrate these issues across the business.  One piece that I think is critically important to the way companies are addressing these issues, both in how they design products and in terms of how they engage with governments and contribute to thinking around policy, is executive commitment to human rights and engaging on these issues.

Another important piece is internal processes for conducting human rights due diligence and human rights impact assessments.

Lastly, it is important for companies, and specifically for Google, to do external engagement and consultation with experts around how we develop our policy positions, our products and their features.  And so if you take that as the foundation for how we approach these issues, I would like to hark back to what Jan said on the key issues in the realm of privacy and free expression.  We come to the table to engage with stakeholders around the way the problems are actually emerging and evolving, to ensure that what we are doing with our products is actually addressing the problems as our users are experiencing them, and as governments are experiencing them as well.

So I guess I will say in closing that I think it is important for us, as we talk about what companies are doing in this area and how companies are maintaining their commitment to human rights, to always tie that back to the UNGPs and to ensure we are having a conversation around the UNGPs that is evolving alongside the way that we are viewing these issues in the world.  Thanks.

>> MODERATOR: Thank you very much.  We will pass the floor straight to Rami Efrati from Tel Aviv university.

>> RAMI EFRATI: Shalom, good morning.  I'm Rami Efrati, coming from Israel.  I'm the self-nominated cyber ambassador of Israel.  And since I don't know how many of you have been to Israel, I would just like you to raise your hands if you have visited Israel because I'm going -- excellent.

So, speaking about privacy: whenever you come to Israel, the first thing you find, coming into any supermarket or any cinema, is somebody, a guard, looking at your basket or looking at your clothes or looking at you to find out whether you are a terrorist or not.

I would like to discuss quickly and briefly the main question, which is: what is the right to privacy, in data protection terms?  Is it valued when speaking of terrorism activities, and what is the right way to communicate with the digital platforms?  Just to make it very, very clear: in Israel, we have two main organizations dealing with cyber and privacy.  One is the Privacy Authority; the second one is the Israeli National Cyber Authority.  They go together.  Speaking about privacy cannot be done unless you are dealing with cyber as well.  And therefore the government of the State of Israel also decided to start with what is called a cyber law, because without a cyber law, taking care also of these privacy issues, we believe that you cannot work in the right way.  We see ourselves as a leading country both in cyber but also in privacy.

GDPR has taken a very important role in our life, but our life is totally different from most of yours.  We just heard from Jan Kleijssen what Sacha Baron Cohen said.  We can speak about it in terms of antisemitism and money laundering and pedophiles as well.

So what is the question and what is the way that we should deal with it?

The digital platforms take a very important role not only when you have to protect yourself but also if you are a terrorist.  Unfortunately, we have found that terrorists are using most of these digital platforms against privacy.

When the terrorist is using a platform, a digital platform, he knows very well it is open to the public.  And when it is open to the public, he understands that it is also difficult for the law enforcement agencies to deal with it.

What I would just like to highlight is: what are the tools that the government has to give to law enforcement agencies in order to deal with terrorism or antiterrorism when the main platform is a digital one?  And I will be more than happy to answer questions about this later.  Thank you.

>> MODERATOR: We have some very well-behaved panelists staying well within their time, so great.  That's not to put pressure on Fanny.  Fanny Hidvegi from Access Now.

>> FANNY HIDVEGI: Thank you very much.  I will behave, too.  Thank you very much for being here.  I'm the European Policy Manager of Access Now, a global human rights organization that works at the intersection of human rights and technology, engaging on privacy, artificial intelligence, cyber security and more.  This panel couldn't be more timely for us.

I'm based in Brussels, and one of our key topics in the past few years was the adoption of the General Data Protection Regulation, which my colleagues work on, so I'm really glad that the panelists are addressing that topic.

Back to the title of the session, a contradiction in terms: contractual terms are not enough to provide adequate prevention, mitigation and redress even for normal users of platforms and services like Facebook, for instance, much less in the event of misuse and abuse.

We need incentives and business models that respect human rights.  Companies have the responsibility both to know about the impact of their products and services on human rights, by conducting due diligence and working with outside stakeholders, and to demonstrate that they are taking meaningful measures to prevent and mitigate the adverse effects.

On the government side we often talk about the obligation of non-interference with fundamental rights.  But we have to mention the positive obligations as well: states need to create an environment for the full enjoyment of human rights.

This panel focuses mostly on the business models of platforms, but when we mention human rights and companies we must also account for different types of violations.  So I want to highlight that companies such as the NSO Group and Hacking Team make it possible for repressive regimes to target those who oppose them in order to stifle dissent.  The covert nature of targeted spyware makes it the tool of choice for authoritarians.  On the other hand, we see big companies taking actions, like the WhatsApp litigation brought in the state of California.  When we talk about the responsibilities and obligations at the moment, and maybe that ideal scenario that you asked for, we are failing on both ends.

As the session description rightly mentions, Cambridge Analytica created a momentum to mainstream the urgent need for the enforcement of privacy and data protection rules, or the adoption of comprehensive frameworks in areas where they don't exist.  In contrast to the revelations, it has not led to meaningful reforms yet.  It has translated into political talking points about addressing disinformation, mostly by self-regulatory measures, but no systematic reform strengthening safeguards against microtargeting.  To bring the European example of the way the revelations helped move the needle on the adoption of the GDPR: it was just last week that we pronounced the ePrivacy reform dead, or a zombie at best.  The European Union must follow through and complete the reform after the GDPR to protect citizens against online tracking and ensure the confidentiality of electronic communications.

I'm looking forward to discussing all of these topics, and how we will solve them, all at once in 60 minutes.  Thank you very much once again.

>> MODERATOR: Thank you very much.  And finally, we have Florence Raynal from the French national data protection authority, CNIL.

>> FLORENCE RAYNAL: Hello.  I'm Florence Raynal, working for the CNIL.  I'm honored to be with you today.  Just in case, let me recall that the CNIL is the French data protection authority.  We regulate data protection in France.  Our role is to advise companies but also public bodies on complying with the GDPR in France.  And we are also enforcing the (?)

Digital platforms are truly an interesting case study from the privacy point of view.  Indeed, in today's world, it is technically possible to collect enormous amounts of data, but there are privacy risks associated with this that we need to be conscious of.

In certain cases, new techniques are used, such as artificial intelligence, facial recognition and other automated systems, which must be carefully framed because they raise privacy challenges.

These can lead to blacklisting, discrimination and abusive decisions for people.

We also see the creation of big data reservoirs where companies can pick and choose, leading to the development of a huge data market without much control.  This needs to be done with a legal basis, in a transparent way, and with possibilities for people to control this collection and reuse of their data.

It also raises other privacy issues such as data retention and security.  Another issue goes back to the qualification of responsible parties, which is crucial in order to identify who is responsible for what.  The GDPR provides tools for people to better control their digital life.  It provides tools to exercise their rights: the right to object, the right to be informed, the right to erasure and the right to portability.  These are very important rights that also help to rebalance the asymmetrical relationship with companies.

The GDPR also provides for duties on companies in the way they process data, and I would like to mention here Article 22 on profiling, which is a very important provision with respect to data combination and the use of new technologies to create profiles.

As the GDPR gives a robust framework for these new practices, it also provides a way for digital platforms, and more generally public bodies and companies, to develop policies that correspond to users' expectations on their privacy, which in the end can also be good for the business, providing trust to the customers and, ultimately, a good business model.

With respect to all those types of processing that we see happening, we are issuing guidelines and tools to accompany business in complying with the GDPR.  For example, we have recently developed a tool that can really be seen as a success as a compliance tool.  And we are also doing some enforcement action.  Maybe you have seen some actions with respect to Facebook and Google regarding data combination and lack of transparency, but also other platforms such as BlaBlaCar and other websites where we found security, transparency and consent issues.

There is also an important factor linked to geography: platforms and internet players are not necessarily located in the EU, and data are transferred and stored abroad.  On this, the GDPR brings a very clear policy message with Article 3 and the territorial scope of the GDPR.  To summarize: if you do business in the EU, you must respect EU rules, either because the company is established in the EU or because the business is targeting the EU market.

And it is a very important provision which in a certain way puts EU and non-EU actors on an equal footing if they target European markets.

The EDPB, gathering the European data protection authorities, has issued further guidelines.  But we need to be more ambitious, because global issues in fact need global solutions.

In that line, we truly support Convention 108+ as a possible instrument to resolve conflicts and to offer a common solution.

As Jan said, it is the only binding instrument that exists today at regional level that is open to third countries, so it can really be seen as an international instrument, and it is the only regional instrument covering both the public and private sectors and also intelligence processing, which is very important as we have seen today.  It is also relevant to the adequacy decisions taken by the Commission.

It is well articulated with the GDPR, and it creates a great forum of cooperation between data protection authorities and governments at international level.

Going back to the title, a contradiction in terms: we think that we can have a kind of win-win situation where privacy is good for business because it is good for people.  Privacy should be seen as a chance, an opportunity to create trust and to improve the quality of services, because in the end it respects the expectations of users on privacy and fundamental rights.  Just to finish, we think we have a common interest in avoiding contradiction.

>> MODERATOR: Thank you very much.  One of the questions that we were asked to consider in preparing the panel was what should governments do?

And I think we had the full range of possibilities proposed by our five speakers.  Jan pointed to Convention 108 and the GDPR as strong pieces of international legislation, wished for continued take-up of Convention 108, and reflected on the need for criminal sanctions.  Fanny wanted the law enforced more effectively and enhanced with ePrivacy rules.

Florence pointed to the reinforcement of the GDPR with tools and guidelines.  Rami talked about reinforcing privacy by stopping criminals from abusing privacy online, and Alexandria pointed to non-governmental multistakeholder engagement to achieve our goals.

I would like to ask our panelists if they want to come back on any of the comments made by their fellow panelists before looking to see if there are questions from the audience.  Okay.  The floor is now open to the audience.  We start with questions right in front of me.  Introduce yourself, please.

>> AUDIENCE: Thank you.  Steve DelBianco with NetChoice.  The question is equally framed to the digital platforms represented here, Alexandria, and the governments, and if there were courts here I would love to understand their view, too.  When platforms adopt the UN declaration of human rights, how shall a platform balance two rights that are in conflict with each other?

And the example I would give is the right to be forgotten, which is an exercise of Article 12, the right to privacy, against Article 19, which says humans have the right to seek and receive information through any media.  I seek to know whether to lend money to an individual, but that individual is using the right to be forgotten to deny me the ability to know they have been bankrupt, or a doctor whose license is revoked, or a child care provider with a criminal conviction.  I'm trying to come up with an example and ask for help on how to balance human rights that are in conflict.  Thank you.

>> MODERATOR: I think I would be interested in Alexandria coming back on that, and in particular whether you think it is likely that Google would choose to impose a decision on the right to be forgotten in cases where somebody would be harmed because the requester didn't have a proportionate right to ask for the right to be forgotten in the first place.

Facebook -- I'm sorry, Google is required to deindex or delink content in situations where it would be unfair on the individual to have certain search results come up.  And a harm like the one the gentleman just described would not be compatible with that.

Do you see a challenge in that for you?

>> ALEXANDRIA WALDEN: Well, as with all things in human rights, there are oftentimes no simple answers or solutions, especially when rights are, or appear to be, in tension.

But I think that's a little bit of the beauty of the UN Guiding Principles on Business and Human Rights: they refer back to the government's duty to protect human rights, and they refer to a company's responsibility to respect them.  Ultimately, what that points out is that there are certainly actions that governments should take, and companies have to think about how they respect the law in the countries where they do business.  In addition, companies have to do their own balancing, their own human rights due diligence, to understand how their products are actually operating in the real world, and whether there are ways we should be thinking about how we design our products and the features they include, to ensure that the way users engage with them enables them to have choice and control, so that we can be rights-respecting on the company side.

But all of that requires, on the government side, strong rule of law, clarity, and respect for human rights, and it requires companies to do their part as well.  And then we can work together to deal with these issues.  I think it has been interesting to see what has happened in Europe around the right to be forgotten.  We have a long history of challenging that in court.  And when it became clear that it was going to be the law of the land for Europe, we respected the rule of law and complied, and created a mechanism by which we comply with the law and allow users to appeal directly to us.

I do think that where the issues are most challenging, it requires significant multistakeholder dialogue, with governments, companies and civil society all at the table.

>> MODERATOR: And I'm going to force myself not to comment because it is a subject that I'm very passionate about.  Do we have -- oh, we have -- we will take them in order then.

>> AUDIENCE: Good morning.  I'm from the House of Lords in the UK, where I sit as an independent peer.  I'm interested to hear the panel's views about the failure to uphold children's rights in the online situation, not least the fact that a child is anyone under the age of 18, yet the age of adulthood online is 13, based on a piece of old-fashioned law in the U.S.

And so I'd really like to know how you imagine that children's rights could be normatively observed by the platforms.  And I should declare an interest, just in that we are currently undertaking a general comment on the Convention on the Rights of the Child for the digital world.  Thank you.

>> MODERATOR: That's a very interesting question.  I look forward to the answers.

Two further questions behind you and further back.

>> AUDIENCE: Thank you.  Alexander from the Russian Federation.  I want to mention the problem of censorship on the big social networks such as Facebook and Google.  They perform extensive blocks or delete accounts where the owners are just trying to express their political views.

And how can we demand that Facebook, Google and Twitter publish the list of words that are forbidden to use on the platforms, and in general put an end to politically-based censorship?

>> MODERATOR: Thank you very much.  Next in line.

>> AUDIENCE: From the university of (?).  My question would be a good follow-up to the previous one.  I wanted to have your views on the recent case against Facebook at the European Court of Justice, according to which content found to be unlawful has to be removed, and even equivalent content.

And this on a worldwide scale.  So on the one side, that can be considered a big move forward in forcing platforms to adopt policies against hate speech and protect the reputation of politicians.  On the other side, there are obvious downsides if it is interpreted in a way that goes towards censorship, as the speaker before just mentioned.

If you allow me a second question: the Council of Europe was quite successful in developing guidelines on the liability of internet intermediaries.  My question would be: how is the process of implementation of these guidelines being monitored?  Is there any form of monitoring possible?  Thank you.

>> MODERATOR: Thank you.  I think we have about three sessions' worth of questions there.  I think we should answer those before coming back to another round.  I particularly like the last question.  Does anybody want to go first, or should we take it in order of speakers if nobody is jumping in?  Okay.  We will start with Jan then.

>> JAN KLEIJSSEN: I just wanted to respond to the question, which I think was excellent.  It is all a question of norms.  Would we like to forget the Holocaust, or let it go?  Or maybe we would like to delete the picture of a child who was abused.

Should we look at a terrorist who made his will on a digital platform and influenced hundreds of other people who are going to kill any one of us?  80 people were killed in Nice, France, by a truck driver who was influenced by this.  It is first of all a question of norms, which I know is not easy to talk about, and it is not easy to come up with an idea.

I believe that the most important way to deal with Facebook, Google and the others is to try to come up together with agreed norms, in order to arrive at a solution for how to do it.  Meaning: if you want to delete something, you have the right not to do it as well; let's develop the norms together with these organizations.

If you do it only by regulation, by law, or by somebody in any one country, it will never work.  We have to team up and work together.  I think this is the best way, at least from what I found out when I started to deal with these companies on cyber and other things.  Because when you are dealing with a terrorist, the same question also arises for the digital platform: how are we dealing with the content, and are we making sure it is a sanitized network?  This is what I can say on that.

>> MODERATOR: Thank you, Jan.

>> MODERATOR: And on the balancing of the rights?

>> JAN KLEIJSSEN: There are legal standards with clear case law, and in the end it will be for a court to decide if there is a dispute.  On children's rights, I fully agree there is much that needs to be done.  On censorship on social platforms and political propaganda: one answer could certainly be for governments to promote more public service internet, so that citizens would be able to rely on quality investigative journalism on the internet, supported by our governments.

On harmful content to be removed, the case decided by the ECJ: you raised the question of censorship, but if it is, as in this case, an international tribunal, that at least guarantees that the judgment is not just unwarranted censorship.  As to the necessity of removing certain content, I would refer to the speech of Sacha Baron Cohen.  The guidelines are being considered by the European Court of Human Rights, and I would expect there will be cases where this recommendation or these guidelines will be brought up as a sort of rule of law.

>> MODERATOR: Thank you.  Florence and then Fanny.

>> FLORENCE RAYNAL: Thank you.  I would like to react to the question related to children, because we think it is a very, very important topic.

In the GDPR there are specific provisions with respect to consent given by children to the processing of their personal data, linked to a certain age that needs to be defined at national level, to be sure there is a maturity aspect.  That is very important.

We have done huge work on children's rights, also at international level with the International Conference of Privacy Commissioners.  We have done, for example, some analysis of different legislation on the exercise of rights, and also built a kind of reference model for the education community, in a certain way train-the-trainer material, in order to help people in contact with kids make them aware of their rights.

It raises a lot of issues.  It raises issues with respect to the scope of the exercise of rights by kids and children: how far, you know, what kinds of rights could they precisely exercise; the role of authorities and whether or not they can handle complaints by kids, and at which age; and mechanisms to verify age and how companies can put that into practice.

And we definitely think that there is huge work done by other international organizations like the OECD, the Council of Europe and the UN, which we are trying to follow as much as possible.  And we hope that we will be able to influence the discussion at UN level in order to better frame the exercise of rights in the digital environment by children.

>> FANNY HIDVEGI: Let me respond to the Facebook case question.

And so I know it is quite well known, but maybe it is worthwhile to offer a little bit of context for the people who have not read the case, just so they know what it is about.  Because, as a starting point, it is not about hate speech but defamation.  It started when a Facebook user posted an article featuring the photo of a well-known Austrian politician and used slurs calling her a corrupt oaf, a lousy traitor and a member of a fascist party.  Throughout the litigation, in the end, Facebook was ordered to remove the post, but there was a huge legal question to decide: whether Facebook should remove identical content going forward, or also equivalent content going forward.

And that is the major legal question that is now being discussed.  The Court opened up the possibility of a general monitoring obligation and the use of automated tools in that context, which is the most problematic part of the decision.

And in the European context we will see how this is addressed in the potential reopening of the E-Commerce Directive, or the DSA, the new Digital Services Act.  The problem that we see is that this case might have really negative implications for freedom of expression, but also for the freedom to form an opinion, which is, by the way, an absolute right and has quite different legal implications.

Automated tools have been demonstrated not to work at picking up on context and at making the kind of human rights decisions that are already problematic for courts and people.

And it also might create this general monitoring tool that violates universal human rights.

>> MODERATOR: Thank you.  We have time for maybe two more quick questions, and we already have four.  Those two hands went up first.

>> AUDIENCE: I'm a member of the German parliament, internet policy speaker for the Left Party.  I have a two-fold question.  The first is: we have really big issues in Germany because of the lack of nationally responsible contacts for digital platforms.

I'm not talking about nice buildings; PR offices and the like do exist, but there is no address where you can deliver a court order or where a lawyer can send official letters.  They just don't take them.

And then they refer you to Ireland or even to the United States headquarters, and you never hear from them again.  We already have an issue with Twitter: in May, Twitter blocked a Green politician in the middle of political campaigning for a joke he made on Twitter.  He is still blocked, although two court rulings have already ordered Twitter to reopen the account; they just don't do it and they refuse to take the court letters.

So how do we deal with that, and is there an option for European legislation to force them to have a legally binding delivery address?

And the second is about, for example, digital violence against women on those platforms.  One issue we have is that those platforms don't take this seriously enough.  But the second issue is a lack of capability in law enforcement.  In all countries I know of, and definitely in Germany, it is also largely an unpunished crime, because you don't have police and justice officials who know what to do.  What could be done to help?  In Germany we are talking about creating a specialist police authority, where at least you have some trained people to deal with these issues.  Are there other ideas, and how is it dealt with in other countries?  Thank you.

>> AUDIENCE: Our question is about our concern that tech companies are developing policies and practices with states who are known to be violating the human rights and digital rights of the world's most vulnerable, including occupied peoples, and we want to know how companies like Google are working to ensure that the policies and practices they develop do not enable states to engage in illegal occupation and to commit further human rights abuses and war crimes.

>> MODERATOR: And finally, if you can, yep.

>> AUDIENCE: Thank you very much.  (?)  Deputy.  (Speaking non-English language)

>> MODERATOR: Thank you very much.  The last intervention, for those who have made the terrible life choice of not learning French, was a request for translations of the interventions of our excellent panelists, in order to help decision makers in Chad who are working on a legal framework; the speaker said that they are somewhat delayed in addressing these challenges.  Quickly, if we can have a final set of responses from the panel, then I will try to do a wrap-up of this very diverse discussion.

>> ALEXANDRIA WALDEN: Working our way back down from this end now.  So I will address a few of the questions that came up.

With respect to enabling states to commit war crimes, I think that gets back to what I was saying at the start with respect to the Universal Declaration and the UN Guiding Principles, and companies embedding them and figuring out how to operationalize them in ways that allow them to do due diligence across their business: in how they respond to law enforcement requests, but also in how they launch products and who they sell them to.

On the topic of digital violence against women, I can tell you that platforms do take this issue seriously.  At Google, we have a variety of policies and products that seek to address some of the ways these harms can manifest.  It is not a panacea, but our policy against hate on YouTube includes gender and gender identity as protected characteristics.  When someone is inciting hatred based on a protected characteristic, content is removed on that basis.

We also have a policy against harassment, which is focused on when content targets or threatens an individual.  And we have been clear that we are currently evaluating that policy to see if there is a need for revision, and to tighten it up.

And then lastly, I had some comments for the gentleman who raised censorship issues.  Companies are committed to freedom of expression in the way that it is articulated in Article 19.  There is a freedom of expression, and there are legitimate restrictions on that right.  We ensure that we follow the way the law is playing out with respect to freedom of expression where we are doing business, that we understand how the courts work and what the rule of law looks like in a given country, so we can understand how we operate there.

That is with regard to how we respect the law.  But we also have policies in place, and we have to ensure that our users understand those policies.  We have to be clear about them.  We have to have appeal mechanisms in place.  And then lastly, we need to be transparent about what we remove, both with respect to government requests to remove content and with respect to what we remove under our own policies.

>> FANNY HIDVEGI: Thank you.  As a wrap-up, I just want to highlight that there is an overarching demand for all of these rights implications, whether privacy or data protection, to be addressed in a systematic way, because it is a business model question, an underlying core question.  And whether it is competition and markets, various export control measures, or content governance, including moderation but also creation and design, we need to ensure that the companies follow human rights norms.

>> MODERATOR: In a predictable way.

>> FANNY HIDVEGI: Yes.

>> FLORENCE RAYNAL: Very quickly, because I know that we are late.  It will not be a full answer, because it will be just on the GDPR aspects of reputation, not necessarily on the removal of full content.

The GDPR organizes a coordinated manner of responding to infringements of the GDPR on EU territory, around a lead DPA that investigates the case in coordination with the other authorities.  This is called the one-stop-shop system.  Right now we are practicing it: for companies that have an establishment in the EU, we have a system of cooperation among ourselves in order to establish the violations and infringements and to sanction them.

We are putting that into practice right now.

>> JAN KLEIJSSEN: One sentence as a wrap-up.  We are certainly not short of standards, but we also certainly must do a lot better to ensure they are implemented.

>> MODERATOR: And thank you very much to the panel.  I personally find it kind of shocking that we are asking quite fundamental questions in 2019.  I think that the first and the last of the final questions are at the core of a lot of this: when are they doing too much, and when are they not doing enough?

The Council of Europe recommendation on the roles and responsibilities of internet intermediaries is a very important document that should be the first step in trying to find answers to this.  It is not acceptable that legal content is removed; it is not acceptable that a legitimate parliamentarian is taken offline; and we need to dig into the basic principle of law that restrictions have to be predictable.  At the very least, why are we talking in 2019 about unpredictable decision-making?

It is bewildering.  I think we can leave this room with at least this: if you know you have a problem, you can start finding a solution.  If we recognize this as a problem that needs to be addressed, and build on the Council of Europe's very good work in this area, then we are finally heading, maybe, in the right direction.  Thank you very much to all of the questioners.  Apologies to the questioners who didn't get to ask their questions.  Thank you to a very good panel with very good insights.  Thank you very much, and see you soon.

 
