IGF 2016 - Day 1 - Room 7 - WS168: IMPLEMENTING HUMAN RIGHTS STANDARDS TO THE ICT SECTOR

 

The following are the outputs of the real-time captioning taken during the Eleventh Annual Meeting of the Internet Governance Forum (IGF) in Jalisco, Mexico, from 5 to 9 December 2016. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 

***

>> LUCA BELLI:  Okay.  We will start in two minutes so the technical assistants can sort out logistical details.  Thank you.  Also, a housekeeping note: we do not have translation into Spanish.  So for those who were hoping for translation into Spanish, I'm sorry.  Can we start?  Yes.  Excellent.

So welcome, everyone, to this workshop on implementing human rights standards in the ICT sector.  We are here to discuss this very relevant issue because at this time we have reached a level of understanding, maybe on some things a consensus, and also a critical amount of evidence produced by a lot of very good initiatives and organizations, so we have more material to really tackle this problem.  We all know that from an international law perspective, duty bearers have an obligation to protect human rights, but we also know, thanks to the UN Guiding Principles on Business and Human Rights, that every business entity has a responsibility to respect human rights.  They also have to provide effective remedies. 

So we have here today an amazing set of panelists to analyze this with us, not only to go deep into the details of the question, but also to provide some concrete evidence of what is not going on properly and what could be done to make things better.  So we will start with Amos, who is representing Professor David Kaye, who could not be here because he had to leave and has agreed that Amos would substitute for him.  He is the legal adviser on the mandate. 

Then we will continue with Joao Brant, who was previously working with the Brazilian ministry and the observatory on regulation, media and convergence. 

Then we will continue with Katie Shay, who is legal counsel at Yahoo. 

Jamila Venturini is working with the Center for Technology and Society. 

Peter Micek is global policy and legal counsel at Access Now. 

And last but not least, Rebecca MacKinnon.  So to start the meeting, I would like to ask Amos to give us a little bit of information on the work that the UN is doing on human rights, and particularly on freedom of expression.  What are the main challenges that you are currently facing in the work that you are developing?  Thank you, Amos.

>> AMOS:  Thank you so much, Luca, also for organizing this panel.  I will give you a brief overview of the work that we are doing and that we are about to do in the next couple of months, and then maybe also identify some of the challenges that we are seeing in the process of working on these issues.  Since August of last year, the mandate has begun looking into state regulation, its impact on freedom of expression, and the responsibility to respect human rights and freedom of expression that springs from this environment.  We think of this as a really huge topic.  What we wanted to do initially was scope the project, map out the legal frameworks that might govern some of the gaps that we see, and set a work plan for moving forward, and really bring this set of issues before the Human Rights Council and U.N. member states, which isn't necessarily something on their radar.  So we presented this report to the Human Rights Council in June 2016. 

So the next phase of the project will be focusing on telecommunications access: service providers and ISPs, as well as any other entity that is engaged in facilitating and operating the digital communications infrastructure, such as equipment providers, exchange points, and submarine cable providers.  This is where you really get into the meat of things.  Again, our findings and recommendations will be presented to the Human Rights Council in June 2017.  We just had a call for submissions, and we still welcome them at this stage. 

So, take the question of government access to customer information.  I do think we see that several companies have developed a fairly well-established set of due diligence strategies, including, for example, checking whether requests are based on validly enacted law, and whether they are in the proper format and come from the appropriate requesting authorities.  And I think we see a fairly well-established set of tensions: the need to respect customers' privacy and freedom of expression on one hand, and on the other, tensions when it comes to employee safety and the safety of telecommunications infrastructure and investments in a particular country.  I think the calculus changes when we have other factors.  When we think about medium and small enterprises in the telco and ISP space, they might not necessarily have the resources to deal with more than a certain number of requests, and they might not be connected to the CSR and human rights community, so they might not understand that dealing with requests might trigger human rights considerations.  If we shift the issue slightly from access to customer information to the provision of equipment that might enable and facilitate direct government access to networks, my sense is that the human rights conversation is not quite as far along.  It is still critical to convey the baseline importance of due diligence processes and the need for human rights assessments, and it is also not clear what the responsibilities and processes might look like when it comes to the sale or use of equipment, and when these companies receive requests for modification of infrastructure equipment in order to enable future surveillance demands.

Finally, I think in certain geographical contexts, regardless of the issue that's at stake, companies are so constrained by the pressures that states put on employees and infrastructure that looking for leverage might be harder, or at least might have to be set up earlier, at the stage of negotiating contracts and licenses.  And that might be where the human rights responsibility lies there.  So I think these are the various levels and phases of issues that we need to be thinking about, and that's kind of what I see to be the challenges and questions we are dealing with.

>> LUCA BELLI:  Thanks for highlighting two very important points.  The first one is that basically all we do online is intermediated, and this ubiquitous intermediation means there is an extremely wide range of intermediaries that in some way facilitate communications and also have a great impact on, and potentially control over, our communications.  On this point, I would also like to ask: what is the impact of some of those initiatives that we can concretely assess at this time?

>> REBECCA:  Thanks very much, Luca.  Ranking Digital Rights is a project that benchmarks Internet, telecommunications and other ICT sector companies on core commitments, policies and practices that affect users' freedom of expression and privacy.  We've passed around some information about the project, and my colleague Alan, please raise your hand, has some Spanish information as well, and also some Spanish copies of the report and of the summary.  You can get the Spanish-language material from her later.  As for our findings: last year we evaluated 16 companies, and the list of all the companies is in the material we passed around and on the website, so I won't do a recap here.  We evaluated eight Internet companies from around the world, not just Google, Facebook, Microsoft and Yahoo; we had a Chinese company, a Korean company and a Russian company, and we looked at a range of telecommunications companies that are headquartered in a range of different jurisdictions. 

So we got a good global spread.  What we found, in brief, last year was that across the board, companies are doing a very poor job of informing users about how their user information is collected and shared, kind of what's happening to it.  There are some positives, in that there's a group of companies in the index that have been doing what we call transparency reporting, reporting not only on what their practices are for handling requests, either from governments or other parties, to restrict content, shut down networks, block content or hand over user information.  But there are also a lot of companies that don't report on these things at all.  So there's a big gap, and in many instances, while there are some legal restrictions on some things that companies can disclose, across the board there are many things that companies could be disclosing that they're not disclosing, and there's no legal impediment to their doing so.  There are also some regulatory ambiguities that governments should clarify so that companies can maximize their transparency about how they're handling user information and access to information. 

Another finding had to do with the private regulation of content.  Companies have terms of service, and you'll talk about this more.  They set private rules for what you can or cannot do on their platforms that may or may not relate to what the law requires, or have varying relationships to what the law requires.  And there's no transparency about the practices for enforcement: what's being taken down, the volume and nature of the content being taken down.  We're going to be releasing a second index in March.  You'll be able to see to what extent there's been any improvement in any of these areas, but there's certainly going to still be a lot more room for improvement, and in March we're also going to be adding a few companies to the index.  So Apple will be added, Samsung, some other companies as well.  So stay tuned for that. 

Regarding the GNI: we have a section in our 2015 index that's called Commitment, which we're renaming Governance.  It has to do with accountability around what companies' practices and policies are.  So does the company have a public commitment?  Is it conducting human rights impact assessments?  Does it have internal whistleblowing mechanisms?  Is it doing internal training?  Does it have grievance mechanisms?  Questions like this. 

What we found is that companies that are members of the GNI, and companies that are members of the Telecommunications Industry Dialogue, many of which are currently observer members of GNI, perform strikingly better in the governance section of the index than companies that are not members of GNI.  One further comment on GNI membership, because I know there's a lot of discussion.  What impact does it have, the fact that they're carrying out impact assessments to help them make decisions and include these factors in their decision making about going into new markets and rolling out new products, the fact that there's board-level oversight over these issues, that the board of directors actually makes it clear that managers are accountable on these issues: to what extent is that making a difference to people on the ground?  In a way, it is hard to quantify, because it is like asking how many accidents the air traffic control system prevented. 

So how much worse would the Internet be if Google, Facebook, Yahoo, Microsoft and a growing number of other companies hadn't made these commitments and weren't putting these practices into place?  It is hard to say, but my sense, also from having been in the GNI, where I was on the board for several years and continue to be a participant, is that despite all of our complaints, which are many and justified, things would be a lot worse if we hadn't had this system where companies are being held accountable for whether or not they are implementing their commitments, whether or not they have a system in place, and where you have companies that have put processes in place. 

For instance, a process for handling demands, and they made that process clear.  When GNI started, most of these companies didn't even have a company-wide process and policy for how managers across the world should handle government requests.  And most of the companies were not even tracking government requests across their companies globally.  As part of these commitments, they've had to do that, and they've had to really understand what the impact is.  And I think that's really huge.  That's really significant, that that is now happening at least among some companies.  It needs to happen more.  It needs to be expanded to terms of service enforcement and other things.  There's a lot of bad news in the world, but I do feel that this is one of the things that has made the Internet a little bit less bad than it might otherwise be.  So, I know that sounds funny, but yeah.  Or maybe a lot less bad.  It's hard to know.  But that's my kind of observation on that as well.

>> LUCA BELLI:  Thank you, Rebecca, for this injection of optimism. 

And to keep on discussing, going into some deeper details of the issue, I would like to ask Joao, who has dealt with specific cases in Brazil: what is the impact of intermediaries on cultural rights?

>> JOAO BRANT:  Thank you for the invitation, and thank you to the audience for being here.  I would say, firstly, that the panelists that came before me have set up the framework.  We're talking about intermediaries with big power and big responsibility, and rather than saying, as we were saying five years ago, okay, we have to limit intermediary liability, which we do have to do when they are dealing with third-party content, we have to bring to the table their responsibility for the content and the terms of service: how they rule, how they deal with state and government requests, how they deal with users' content under their own terms, et cetera.  It definitely impacts human rights and cultural rights.  I would like to mention two or three things on that.  Firstly, we have to strike a balance.  Intermediaries have to be liable when the responsibility is theirs, but have to be protected when the responsibility is not theirs.  This big power, this big gatekeeper that we have now is, of course, under pressure from everyone: states, other private sector actors, users, et cetera.  We faced an important situation in Brazil.  I would like Ana to put the image on the screen, please. 

The Ministry of Culture had a page censored by Facebook when we posted a historical picture of an indigenous couple from 1909; this was the picture.  More than a specific case, it was important for us.  It was part of a national exposition of historical pictures, but for us it was a discussion on cultural diversity that is really important in this case.  We were saying that intermediaries like Facebook, but like other companies sometimes, are defining global standards, common global standards based on their own assessment of what a global code of conduct has to be, and that has the effect of uniformizing, can I say that in English?  And if I can't, I think you can understand what I mean: defining uniform standards for the whole world.  That's absolutely the contrary of what we're discussing at UNESCO with the Convention on cultural diversity.  The idea that you have to have common global standards that uniformize the global understanding of culture is really the opposite of cultural diversity.  This is a big problem.  We also had the discussion with them to say: we don't want to refine your global standards to make them more uniform. 

We want to discuss the power that these intermediaries wield because of their overwhelming market power, if I can say that.  It is not only a significant market power but an overwhelming market power, and because of the place they occupy, the idea of human rights has to be taken not only in a vertical way but also in a horizontal way, applying, of course, to the companies people deal with.  They play a public role; whether they want it or not, they play a public role.  I don't mean that we have to apply the regulation rules that we've been using in the 20th century, but we have to discuss how to face the situation, and, just to finish and wrap up this first intervention, we have to understand some specific issues for developing countries. 

If we talk, for instance, about cultural (inaudible), then we have to talk about how the economy of culture is supported.  For countries like many of the Latin American countries and many Asian countries that are big producers of culture and big consumers of culture, part of our economy of culture is actually being taken away by these intermediaries.  I am saying this just to finish.  Of course, these intermediaries are a new reality.  They are part of promoting freedom of expression and freedom of culture, and we understand that side.  We just have to strike a balance and understand how this can be a way to promote and have a great cultural diversity in 5, 10, 20 years' time.  Thank you.

>> LUCA BELLI:  I want to get back to you on the question of what governments could do to tackle this, but probably in the second segment of this workshop.  First, I would like to build on what you were very wisely saying, which is that on one hand intermediaries should not be liable for what other people do, while on the other hand they have a responsibility to respect human rights.  A second point that is emerging is that in their role of private regulators, they set rules, standards and terms of service that precisely define how people can enjoy their human rights within their spaces, their platforms and their networks.  We have been doing a study on terms of service and human rights in cooperation with the Council of Europe and its department for the information society.  He is here, and maybe he can say something later.  Then we have Jamila.  This study has been evaluating some very core elements and is one of the few studies that provide concrete evidence on how specific platform intermediaries respect human rights.  The purpose was not to have a ranking, but to highlight best practices and worst practices. 

I think Jamila will have some very important and relevant data to share with us.

>> JAMILA VENTURINI:  Thank you, Luca.  Thank you all for coming.  Good morning.  Yes, I'm very happy to talk about this after the presentations that came before, because they kind of set the scene for what we have been doing.  As Luca said, we analyzed the terms of service of 50 online platforms.  We have this intermediation phenomenon across all Internet layers, if you could say it like that; we were looking at the content layer.  The context of what we observed is more and more concentration among platforms.  So terms of service nowadays basically define the rules and conditions, as Luca was saying, for the exercise of the right to freedom of expression, the ability to communicate online and, on several occasions, access to information.  It is important to highlight, just to follow up on what Joao was saying before, that in several countries, developing countries mainly, access to information and the ability to communicate are highly mediated through these platforms.  Right?  Especially if you consider that access to the Internet is more and more mobile Internet on more and more mobile devices. 

So the first challenge we saw when we analyzed terms of service is identifying which are the binding terms, that is, understanding which documents bind users and platforms.  This was difficult either due to the large number of documents that are cited in the main terms of service, or due to the non-presentation of the documents in a clear way when the user has to create an account, or on the first access by the user.  Once we identified them, the second challenge is the language, and I'm not talking just about the legal language that is common to several types of contracts, but also computer science language.  Especially when you look at the privacy policies, you can see several technical terms without further explanation. 

Even when there is explanation, you can't really understand the impacts that these technologies can have on your rights.  When it comes to freedom of expression, and this connects a little bit with what Joao was saying, we didn't analyze the so-called community guidelines.  We didn't analyze which content was allowed or not, but how platforms deal with this content.  There is little commitment by these platforms to guaranteeing justification, notice and the right to be heard in such cases.  More than that, most platforms explicitly state they can terminate users' accounts without any notification or possibility to challenge the decision. 

When it comes to privacy policies, they are much longer and much more detailed; however, the terms are usually broad enough to ensure that companies can perform various uses of user data without having to require a new consent or having to change the terms.  Usually they require consent to track users' activities on other websites; most of them do that, I would say more than 80%, and you will find the precise data in the book we are sharing with you afterwards.  Mostly, they also request consent to allow third parties to track users' activities on their website, and they do that usually talking about pixel tags or other types of technologies that we had to find out about while we were doing the research.  They may also share data with third parties without detailing who these third parties are, what they will do with this data and under which terms it will be processed.  Even when it comes to government requests for access to users' data, only 10% of the platforms analyzed say they only share data following legal due process.  That means 5 platforms out of 50 explicitly make sure there is due process before handing over users' data. 

I'm glad to hear, as Rebecca was saying before, that there are other levels of commitment, but as we were saying, it is important that this is also reflected in the terms of service, in the specific legal terms that they present to users.  Right?  More than establishing the conditions and rules for interacting online, these platforms also establish rules on how disputes will be solved.  So another aspect we analyzed was due process, and we observed that users may never know about changes in their terms of service.  There are a lot of contradictions in these types of clauses: one part says they will notify users about changes, and another part says they won't.  So it's hard to have any certainty in this regard.  Besides that, legal disputes are subject to several limitations. 

For instance, users may never be able to take their disputes to courts.  There are limitations on the presentation of class actions.  There are obligations to use arbitration and alternative dispute resolution mechanisms.  Most platforms impose a particular jurisdiction for dispute resolution, and all this makes it more difficult to solve these disputes.  So this, basically, is the scenario we found in this research.  We are aware that there are several challenges in changing this context, but we also believe that best practices have already been implemented, and that these terms should respect human rights, as several international bodies, including the special rapporteurs on freedom of expression, have already stated several times.

>> LUCA BELLI:  Thank you, Jamila. 

I think due process is really at the heart of this.  It is a fundamental right, and the study analyzed not only freedom of expression and privacy, which are what usually arises, but due process as well.  When you know that only 30% of platforms will notify users when the terms of service are changed, that is something quite scary from a due process perspective.  And the fact that 26% of platforms include in their terms of service a waiver of class actions is something not only scary, but that would be illegal in the majority of jurisdictions.  Before we pass to the second segment, I would like to open the floor for comments and debates.  I propose to take three; although there are a lot of people, only three can have the pleasure of raising comments.  Raise your hand.  1, 2 and 3.

>> AUDIENCE:  Good afternoon.  I work for Article 19's digital team.  Something that was very evident is that the focus when we talk about human rights is on the content layer.  And David Kaye was clear in his report that we also need to look beyond that, look at technical aspects, et cetera.  One of the questions being raised here is: how do we do that?  Article 19 is pioneering some of the work around human rights at the technical layer of the Internet.  Most specifically, we have been working in the Internet Engineering Task Force and the Internet Corporation for Assigned Names and Numbers.  So I figured I would share some of the work we have been doing as, sort of, best practices or first practices, to have a look at what creative solutions exist. 

So, for instance, in the Internet Engineering Task Force, we have a specific group called the human rights protocol considerations group.  We are trying to develop human rights considerations for engineers.  It is pretty much a guideline that makes engineers think through the technology they build and make sure they document what kind of potential impact it might have on human rights.  It seems like quite a basic step, but it is necessary, because right now we don't have that kind of documentation; we don't necessarily understand what the human rights impact is.  Within the Internet Corporation for Assigned Names and Numbers, we have been heavily involved in the development of the bylaws, and specifically ICANN's new commitment to respect human rights, so that it actually comes alive. 

One of the things that we have very much seen is that we need new and creative approaches, not only leaning on existing legal principles.  It is not that engineers are unwilling to engage in these kinds of discussions, but it is a difficult question for them to know where to start.  So some of the things we have been focusing on is that there are mechanisms that do this, and we need to start talking about human rights and to start looking at the principles.  And these are very, very concrete things.  I hope that by sharing these with you, we will also take the discussion beyond the content layer, because if we don't look at how this technology is built, we are missing a huge part of where it might impact our rights.

>> LUCA BELLI:  Thanks for highlighting this.  I think there was a gentleman there.  If we have an open mic, maybe it can be useful here.  Do you have a roving mic?

>> MUGAMBI:  Thank you.  My name is Mugambi, from Kenya.  I am curious about the research that's been done.  When we talk about human rights and business, we kind of stop there.  So is there a process where, perhaps through the Global Network Initiative, you look at the business case for having human rights in the ICT sector?  I can give an example of the mechanism: you work with corporations, and there's a business case to this; there is a sustainability issue.  If you don't look at privacy issues, if you don't look at freedom of speech and freedom of expression, then the business itself is at a certain risk.  One of the biggest risks is you get a lot of (inaudible), and this litigation impacts the profits that the institutions make.  So do you help these corporations look at what the business case is for having human rights in the ICT sector, as something that brings profitability and sustainability? 

Thank you.

>> LUCA BELLI:  Yes.

>> AUDIENCE:  Thank you, Luca.  You introduced the work on terms of service and human rights; in fact, this was a direct result of a memorandum of understanding between the Getulio Vargas Foundation and the Council of Europe, following the publication of our guide to human rights for Internet users.  This is only one aspect of the work that the Council of Europe is doing in order to promote human rights online.  I won't go into much detail, but if you visit the Council of Europe website, you will find elements such as the guide to human rights for Internet users, work on the free flow of data, and work on network neutrality and human rights.  And I think it's extremely important that we do this in such a multi-stakeholder environment, but also that we work directly with business.  We have started to set up a platform, you could call it, with business, to discuss directly with them the human rights implications of their policies. 

At this IGF, I noted down three key words.  One is, of course, sustainability, and one is ethical reflection.  In the work we're developing, that goes to the infrastructure level as well.  I think that's quite important. 

And thirdly, there is trust.  How can we ensure that this trust is maintained?  Therefore, I think it is important to also look at the constituents of the Internet, going from logical infrastructure and subscriber information to anything to do with content information.  All of these are very specific.  Working on the bylaws within ICANN, working on the generic top-level domain names, all of this is crucial.  At the same time, that's what our Internet freedom recommendation concentrates on: the whole aspect of transparency, not only of the companies, and thanks to Ranking Digital Rights for what they're doing with regard to companies. 

It also goes to governance.  Governments also have to be transparent in what they're doing with regard to the Internet, and consistent.  Right now there are a number of preoccupations, I would say, that we have with government intervention, not only in the traditionally problematic governments, but specifically also due to developments that we know in anti-terrorism legislation: the impact of governments all over the world on human rights issues on the Internet.  Thank you.

>> LUCA BELLI:  Thank you.  This is a very crucial issue.  I would like to continue with the second segment and give the floor to Katie Shay so she can give us some elements on the private sector and what it is doing.  Go ahead with your perspective on this.

>> KATIE SHAY:  Thank you very much.  I am legal counsel for business and human rights at Yahoo.  I am here to talk about how Yahoo is taking its commitments and implementing them internally.

I want to start off with a little bit of history about Yahoo and how we came to grips with the need to make commitments and uphold them.

So, Yahoo was one of the early Internet companies, and in the early 2000s it was a pioneer.  The way Yahoo went about it, the theory was that the best way to create a global company was to rely on local infrastructure.  So Yahoo created local companies, local entities with local personnel.  We translated content into the local language.  We also incorporated the companies under local law.  So that meant that we were subject to local law in the countries where we were operating.  And that created challenges.  Sometimes there were conflicts of law in the countries where we were working.  Sometimes we were subject to laws that may not have been consistent with international human rights standards.  And there were certain points where that created issues for the company, and opportunities for the company to really learn some lessons. 

So there's a very public example that some of you may be familiar with.  A journalist named Shi Tao was imprisoned because of a valid legal request from the Chinese government for information about this user.  And after this came to light, it really impressed upon Yahoo the fact that our business increasingly impacts freedom of expression and privacy, potentially in negative ways, but also that there are ways to promote privacy and freedom of expression where we work.  So through these hard lessons, Yahoo had a choice of what to do next.  And the company decided to engage with those issues, to make commitments and to invest resources in implementing commitments to human rights.  The most tangible step Yahoo took was to create a dedicated program, the business and human rights program at Yahoo, and it is staffed by two full-time human rights attorneys. 

Some of you may have met them.  We also co‑founded the Global Network Initiative, and I am happy to talk more about GNI and how we think about our membership in the organization.  We co‑founded it with Google and Microsoft, and now LinkedIn and others are members, and many of the Industry Dialogue companies are now observers and may join in the coming year.  It's a growing organization.  I will leave it there for now, but I am happy to dive in on what the GNI means.  So through our business and human rights program, we focus mainly on privacy and freedom of expression because that's what we determined to be the greatest area of human rights impact for the company.  That's where we face the largest risk.  It's also where we have the greatest opportunities to promote those rights. 

When we think about privacy, we are thinking mainly about privacy vis‑a‑vis governments.  We are thinking about law enforcement requests for user information and other ways that governments try to obtain data about users.  And on the freedom of expression side, we focus mainly on government censorship and how governments interact with users and with our company on our users' rights to free expression.  I just want to mention that in this area, we're noticing growing trends, and I think Peter will speak more to this as well, but we're seeing governments crack down on privacy and freedom of expression.  This has always been an issue, but we're seeing new challenges like internet shutdowns, and attempts to take local or regional concepts and apply them internationally, things like the right to be forgotten in Europe that they try to apply globally.  Arrests of journalists are on the rise.  There are numerous issues.  We are seeing that governments, which have a duty to protect human rights, are instead creating new problems and problematic trends. 

So as a company, that just raises more awareness for us and raises the importance of these issues for us internally.  The work is becoming increasingly complex.  I mentioned we have a dedicated team.  Two people may not seem like a lot, but it is more than some companies have, and we try to increase our capacity by working with people across the company that we've identified as key partners.  We call that our cross‑functional or virtual team.  So we've done an inventory of the company and looked at what the key issues for our company are and who our partners in the company are, both on the legal side and on the business side. 

We have identified key people to work with us in promoting these issues.  So we have trained them in human rights.  We've trained them in the UDHR, the ICCPR, the UN Guiding Principles on Business and Human Rights and, of course, we're spoiled in the ICT sector and we have sector‑specific instruments.  So we talk a lot about these things internally, and we translate what the concepts mean for our counterparts at the company who may not be steeped in human rights law, and talk about what that means for their role and what they should be looking for.  We create processes for them to flag issues up to us and to escalate to us for a human rights review.  We also sit in the global public policy team.  So, we work with our government lobbyists around the world to try to promote good laws or push back against problematic laws.  So it is not just a focus on the company getting it right, but on promoting a better working environment for our company and a better online environment for our users.

Our internal processes touch on a variety of issues.  We've talked a little bit about transparency reporting and government requests, but we also look at new products.  If Yahoo is creating a new product, or we are modifying a product or launching a product in a new market or in a new language, we consider what the human rights implications of that might be.  We also look at mergers and acquisitions.  Typically when we purchase a company, that company becomes part of Yahoo and operates under our policies, but we do look at the company before it is integrated to determine where there might be risks and how we should address those.  Tumblr is run as a separate company, but we work very closely with their trust and safety staff and their response team so that they are aware of some of the issues we're thinking about through GNI and the broader business and human rights space.  So we worked with them post‑acquisition, but since they're a separate company, we have made a commitment to continuing our relationship with them and advising them as well.  We look at things like data storage.  The list goes on.  I won't give you the whole laundry list, but just to give you a sense of all the different areas that we're looking at. 

I mentioned that this work is becoming increasingly complex.  This is not to say that having a dedicated team like we do and developing these processes means that you're going to get it right every time, but it really positions us, we think, better to make good decisions as a company and to be considering the right questions when we're making decisions.  So I advocate this approach for other companies as well, companies that are thinking about how to approach human rights.  In addition to our internal work, we work externally with the Global Network Initiative.  Rebecca and her team at Ranking Digital Rights, we really appreciate the work that you're doing to promote good practices.  Thank you.  And groups like Access Now as well.  The external work, I think, is really important to us because we're able to take back what we're hearing from groups that are operating on the ground or that are working more directly with our users, to hear what the concerns are and what some of the emerging trends are, and to bring that back into our considerations, and vice versa.  GNI is very useful for that because of the multi‑stakeholder component there.  I am out of time.  Sorry.

>> LUCA BELLI:  So I think we can close this second segment with Peter Micek's reflections on civil society's concerns with regard to human rights and best practices, and also on what Access Now is doing.  You know that Access Now is a very active organization doing amazing work.  Please tell us what is going on.

>> PETER MICEK:  Yes.  I want to take a second to recognize that we're in a full room and that there is an overwhelming interest in asserting the role of human rights online.  I think this sort of convening is really important to recognize, because not all convenings look like this; from the conference in February to others, there are many spaces that do not reflect the diversity of the Internet's users, which is the diversity of the world.  As far as convenings go, yes, Access Now has its own, RightsCon, which many of you have submitted to, in March.  It is not meant to be just another place to talk shop.  It's meant to extend human rights and digital rights. 

And that is our mission: to defend and extend the digital rights of users at risk around the world.  I think we understand defending human rights; a human rights defender is someone we know.  But what do we mean by extend?  For me, that means three things.  It is providing guidance to companies who are doing their due diligence and who are working to comply with the UN Guiding Principles.  It is essentially educating companies on how they're impacting human and digital rights.  We hear about this hidden layer, these unknown names who are laying the tubes and running the switching points, and the ISPs, all these layers we do not know as household names, but which do impact us on a daily basis and don't necessarily know that.  Then, at the high level, we're creating norms, prescriptive norms, around internet shutdowns.  That's a problem that we've seen arising like mushrooms in a dark forest.  They're popping up around the world on almost every continent.  Where they're not occurring, they're threatened.  We encountered over 50 in 2016 alone.  These include just‑in‑time blocking and throttling.  It is not necessarily a new measure, but it is the scope and scale that are unprecedented, and it is increasing. 

So how do we extend our norms of keeping the internet on, of resilience and openness, to the people making the decisions to shut down networks?  Well, we're starting at the top.  We're going straight to the U.N., advocating at the Human Rights Council on freedom of expression.  We're going to the GSMA, the world's biggest mobile industry association, and they issued a wonderful statement on service restriction orders.  If there is someone here from a member company, I would love to hear how it is being implemented.  These are norms at the international level, but we heard Katie talk about the local versus the global level as well.  There are global standards being imposed on local cultural situations. 

So, we're having difficulty at the local level, and that's the third place where we extend digital rights: through users.  It is through our digital security help line.  We have a clinic set up in the village here, and that's a physical manifestation of our work.  It is a digital security network where we field complaints, concerns and questions, and provide really rapid response to scenarios from users at risk around the world on a 24/7 basis.  We also hear from coalition members.  Many of you in this room, and a 100‑plus organization network, are pledging to keep the Internet on.  So we've got this coalition, but again, from the local to the global, the solutions are going to happen locally.  What we can do as an international organization is enable those local partners to find those solutions and make progress in a multi‑stakeholder fashion.  We need all of us to go back home and not wait for the saviors to come. 

Talk to your own companies and CEOs.  We heard some people talking about the silos that can be created within companies, but it is also talking to your presidents, and maybe new presidents, and educating them a bit, and talking to regulators too.  I think the regulators are the intermediaries, and the governments should not be immune.  They should be very much engaging in this discussion.  Where do they convene?  I don't know, but somebody in the room does, and they should be found.  So yes, that's extending norms.  One more thing about the challenges.  What I heard at IGF is a lot of challenges around local identity requirements.  SIM card registration is very local.  Where do you get your SIM card?  At your local shop.  It is national ID cards creating honeypots of data unnecessarily, against GSMA advice.  Sales to law enforcement are another area, where companies are proactively creating capacity to surveil users and selling that for profit.  You want to talk about business cases?  Talk about where the business operations teams of your companies are actually making their own business case to law enforcement for unlawful surveillance or unlawful restrictions on content.  That is occurring. 

We have seen a couple of major cases, even in the U.S. lately.  Companies are storing data much longer than needed for ordinary purposes and creating a new revenue stream, which is selling access to that data to law enforcement.  It puts users directly at risk.  They're paying for their own surveillance, because local police are paying the company to access this data.  There's a lot of work to be done. 

I think I'll leave it there.

>> LUCA BELLI:  We have a lot of work to do here.  I would like the last words from AMOS on what we will do next, but very quickly, so we can also have a round of interventions from the floor.

>> AMOS:  Sure.  I think I laid out what our next steps are with regard to the June 26th, June 27th report on the pressures of state regulation on telcos and related entities, as well as the scope of their corporate responsibility.  I want to talk about one of the through lines of the discussion, which is really transparency.  It is not only needed on the part of companies, where there is a lot of transparency work around reporting, but also around human rights assessments.  How much weight are they given?  What influence do they have on business decisions?  What methodologies do they use on legal obligations and the scope of responsibility?  And not just companies, but also states.  I think, and this is in our report, that states are not just required to adopt and implement laws and regulations, but also not to get in the way of transparency by other important actors.

Finally, connecting to the earlier point on the IEEE and the IETF: for those organizations that are engaged in Internet governance, meaningful public access to their policies, as well as to the decision‑making processes within these organizations, is very critical to actually seeing real human rights reform.

Finally, I want to touch on the gentleman's point about the sustainability case.  This is where I emphasize that I don't speak on behalf of the mandate or David.  This is not a fully formed idea, but there's a strong business case, and we might want to go even further, to frame the case as coming from social and economic rights.  To the extent that access to the Internet means access to basic information about when the water and electricity are going to be shut down, access to information on the Internet matters to whether you are able to transact money online.  This goes to the very heart of whether people are guaranteed the right to water, the right to housing and social welfare, and those obligations are enshrined in very important covenants.  I would say there is a link between the right to freedom of expression and this sweep of social and economic rights.  It is something we should go forward and expand on.

>> LUCA BELLI:  Patrick, Rebecca.  Yes?  Patrick, please.

>> PATRICK:  Yeah.  I wanted to react on the filtering, blocking and takedown of internet content, because it may be of interest to you that we have done a comparative study in Europe of filtering, blocking and takedown.  It is available online and gives a description of the practices and policies in 47 states on this issue.  There is also a comparative analysis, and that can be quite important.  Also, to reiterate what AMOS said, it is crucial to look at social rights.  I think there will be something on Thursday on that.

>> LUCA BELLI:  Thanks for being fast.  Rebecca, I hope the comments will be quite fast.

>> REBECCA:  I wanted to pick up on what the gentleman was saying about the business case.  We're seeing a number of companies, particularly the European telcos, reporting on these issues in their sustainability reports, because this is in many ways a broader sustainability issue that companies need to address.  There is also a very interesting development in Europe with legal requirements for companies to report on what are called non‑financial issues, such as supply chain issues or conflict minerals, and for companies in the ICT sector that means key disclosures on those factors.  You mentioned the Global Reporting Initiative.  Their reporting indicators on our issues are not very well developed. 

So we should all encourage them to update, and I think many of the projects in the room have lots of material they could draw upon.  But I think what's going to be quite helpful for the companies, as they figure out how to report on these issues, is if we provide them with a framework for how to disclose in a meaningful way.  The good news is that companies are increasingly seeing this as part of their business interest.  There are investors within the GNI, and the reason is that they're increasingly concerned about the impact on companies' long‑term performance.  They're there not just for, you know, ethical reasons, but also for business reasons.  So, this is increasingly a business issue and an issue around risk and reputation and so on.  I think that's a very good development.

>> LUCA BELLI:  Gentleman here.

>> AUDIENCE:  Very fast.  We had a very useful discussion, but the reality is we have people in prison as we speak now because of a tweet.  It is a long list, but we could just name two.  Simply a tweet.  The other issue is that governments are spending a lot of money to buy the most sophisticated software to monitor citizens online.  There's the issue of how to prevent western companies from doing deals with our governments.  That is very important for us, because we really have examples and cases, and there's no time to go through the cases.  This issue is also very important.  I just want to see a discussion, maybe not now but at another opportunity, of how you can protect people who are using the Internet to protect human rights.  Is there a way we can give them some protection, to enhance their protection?  We are facing all kinds of harassment, including prison, torture and targeting of our income, everything.  We have published a lot of reports on this.  The international community, including the United Nations system: how can they protect people online in our region and other regions?  Thank you.

>> LUCA BELLI:  Thank you for bringing us to the reality on the ground.  Can we have a mic there?

>> AUDIENCE: Okay.  Global Voices.  I wanted to pick up a little bit.  Rebecca, you talked about investors, and I'm curious about the edge where the new startups and venture capital are, a lot of it on the west coast of the U.S.  Where are groups like the GNI in terms of trying to get this kind of thinking to them?

>> LUCA BELLI:  One other comment here.

>> EDMOND:  Edmond from Internet Society Hong Kong.  At ICANN and the IETF, we have seen how things are decided and who gets to decide for whom.  It might be interesting to look at digital rights there, and I'm not sure whether that's included, but the transparency of those organizations should, in effect, call for some concern on human rights impact.  The other comment I want to make is on internet shutdowns.  Governments are thinking about it, and the next time it comes around, to stop a protest, it might stop more than that.  You get into issues of life dependencies and other issues.  So an internet shutdown is not as simple as trying to shut down a popular protest or something.  So, it is those two comments.

>> LUCA BELLI:  The other gentleman there.

>> AUDIENCE:  Yeah.  Speaking from the view of the common user: in my opinion, the first easy task is to teach users how the internet can hurt them.  Normally people think that a human rights violation is only a violent act.  From the comfort of our chairs, behind a nice keyboard, it is difficult to see the potential damage.  Companies obviously must be ethical, but if they are not, users must be ready to protect themselves.  Activists and governments should give good guidance.  And by paying attention to reports from users, they can stop the mistakes.  Nations should have user defenders to help with those problems.  They will teach the users, and those users will teach other users.  Thanks.

>> LUCA BELLI:  This highlights the importance of educating and empowering users.  The lady there and then the gentleman there.  We'll take just the last three.  Very fast comments.  The gentleman there.

>> PAUL: Yes.  My name is Paul.  I am with Telia Company in Sweden.  As Peter said, there's a lot of work ongoing, and we do not always succeed.  Some steps we have taken are in the area of transparency, where we publish statistics on the number of requests and statistics on major events.  We have published human rights impact assessments that BSR has conducted.  We are also transparent on direct access laws in our markets.  What I wanted to say on the discussion on investors is that it is mainly the investors that are interested in the transparency work.  There are also peer companies that are interested and have not yet started to publish transparency reports.

>> LUCA BELLI:  Thanks.

>> I wanted to get a comment from the panel on the announcement that was made last night on the removal of terrorist and extremist content.  It might be too late in the panel, but I think this is the most obvious human‑rights‑impacting announcement that we have gotten from tech companies in a long time.  So it raises many issues, from criteria to redress.  It would be good to hear.

>> LUCA BELLI:  Now we are closing the panel.  On Friday, we have a session where responsibility with regard to platforms will be one of the key issues debated.

>> We have to look at the role of states, national states, in defending human rights.  We tend to have a defensive stance with regard to states, but states may play an important role in defending human rights, with legislation that establishes guarantees for users and for the companies in their countries.  So, part of the situation with Facebook in Brazil is because we had established that platforms were not liable for third‑party content; they couldn't be sued for that.  So when we say this, that's because we have a civil rights framework for the internet, and that's because we have some protective legislation.  We are fighting in Brazil to defend the good legislation against the bad legislation.  I think we should develop some best practices for national states that want to defend human rights. 

Two very quick comments.  One, freedom of expression should be taken in a broader perspective, in order to include access to information and to discuss diversity.  And second, cultural identity.  When an indigenous person has to dress as a white man to appear on the Internet, we are denying a digital identity to a part of our population.  For us, this is very important and emblematic of the idea.  We should consider cultural identity and take it as a freedom of expression issue.  Thank you.

>> LUCA BELLI:  Final remark to conclude.

>> Thanks.  I agree that VCs and private equity are shielded from a lot of transparency requirements and public disclosures.  They do want to respect human rights; it is up to us to tell them how to do that and how to design their platforms to be more transparent and open.  But yes, as far as getting to those VCs, I think established Silicon Valley companies and their leadership need to reach out, or perhaps create their own equity pools that are used to support rights‑respecting companies.  The question on shutdowns is very well taken. 

Unfortunately, we have already passed the threshold where lives are being lost due to government shutdowns.  In one case, a woman who was pregnant was unable to contact her doctor and lost her child.  That is one anecdote, and we have to assume it is repeated elsewhere.  So lives are being lost.  It is on all of us: with every shutdown we see, the threat to digital rights grows, and companies need to be transparent about the orders they're receiving.  Your legal teams, your privacy and policy teams, your business operations and development teams need to push back against governments and talk more directly to regulators about shutdowns.

>> Quickly, in response to Brett's question.  The new practice that the companies announced is that they're going to share information about extremist images and create what's called a hash, so they can automatically detect these things.  My understanding is there would be a human review before anything gets taken down.  My understanding is also that they haven't actually developed the system yet.  So there's an opportunity here, I think, in a number of ways.  One is that we collectively need to push for maximum transparency and accountability as the system is developed.  There needs to be some independent review of what images get into the system and how the human review and decision making take place.  So I think there's an opportunity to work with the companies and really push for maximum transparency.  I am sympathetic to the fact that companies have been under enormous pressure to do something.  So given that they have done this, this is their response to the pressure.  Let's help them figure out how to do it in the most rights‑respecting, responsible and accountable way possible.
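To make the hash‑sharing mechanism described above concrete, here is a minimal illustrative sketch.  It is an assumption of how such a system could work, not the companies' actual implementation (which, for robustness against slightly altered images, would likely use perceptual rather than cryptographic hashing); the function names and sample byte strings are hypothetical:

```python
import hashlib

# Hypothetical shared industry database: hashes of known extremist images.
# Plain SHA-256 is used here for illustration; it only matches
# byte-identical files, unlike the perceptual hashes real systems need.
shared_hash_db = set()

def add_to_db(image_bytes: bytes) -> None:
    """Register a known image's hash in the shared database."""
    shared_hash_db.add(hashlib.sha256(image_bytes).hexdigest())

def flag_for_human_review(image_bytes: bytes) -> bool:
    """Return True if an upload matches a known hash.

    Per the discussion above, a match does not trigger automatic
    removal; it only queues the image for human review.
    """
    return hashlib.sha256(image_bytes).hexdigest() in shared_hash_db

add_to_db(b"known-extremist-image-bytes")
print(flag_for_human_review(b"known-extremist-image-bytes"))  # True
print(flag_for_human_review(b"ordinary-photo-bytes"))         # False
```

The design point the panel raises maps directly onto this sketch: transparency and independent review would concern what goes into `shared_hash_db`, while the human review step governs what happens after `flag_for_human_review` returns True.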

>> Just a couple of comments.  I see Ellery left, but I did want to respond to her question about startups and venture capitalists and how we can get smaller companies thinking about this.  The user comes before anything else, and so we have invited startups to come and talk with our company and other GNI companies about how they can think about implementing human rights, so that when they're looking toward an acquisition, they have good practices from the outset, before they reach a point where it becomes very expensive or where the company has developed in a negative way.  On Brett's question, I want to say that Yahoo has not signed on to that agreement.  We have policies prohibiting terrorist content on our platforms, and we work to address that very important issue, but at this stage, we haven't signed on.

>> Just a very quick final comment.  We have heard several concerns here, and I would like to add another one.  What would be the role of these terms of service and these private contracts in the world of the Internet of things and smart cities?  What about interactions between public and private entities?  We have the use of private platforms to offer public services, et cetera.  So there are a lot of interesting things.  I think the question remains: what do we do with the data and information we have here?  I guess we have several ways forward.  We haven't seen much litigation in Brazil regarding terms of service, but there are interesting initiatives arising.  We know there are some initiatives in Europe also.  Maybe that can be a way, through specific legislation that we didn't analyze in the first phase of the project.  We have a possibility to look at how these terms interact with local legislation, and also at the gaps that exist.  The gaps still exist when we talk about data protection and protection of freedom of expression.  That's my final comment.

>> So the mandate sends regular communications about violations of freedom of expression.  One of the emerging areas is legislation: commenting on legislative proposals to raise our concerns and to give a human rights analysis of these proposals, which are published almost immediately.  So to the extent you are aware of proposals and want to make sure they're more human rights oriented, please come and see me.  I will be happy to speak with you.

>> LUCA BELLI:  Thank you for the feedback, comments and discussion.  We have some copies of the report on terms of service and human rights here.  If you don't manage to get a copy, because they are limited in number, we will also distribute them at the platform responsibility session on Friday.  You can download it online.  We will share it under a Creative Commons license starting from next week.  So if you want a copy, come here in an orderly fashion.

>> Another announcement.  My colleague Alana has Spanish language materials for anybody who is interested.