2015 11 12 WS 155 Encryption and Anonymity: Rights and Risks Workshop Room 1 FINISHED
 Welcome to the United Nations | Department of Economic and Social Affairs

The following are the outputs of the real-time captioning taken during the Tenth Annual Meeting of the Internet Governance Forum (IGF) in João Pessoa, Brazil, from 10 to 13 November 2015. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 


>> GABRIELLE GUILLEMIN: Hi everyone. Thanks very much for coming. We are going to start in a few minutes, but please feel free to join the table. There are lots of seats around. So we will start in a few minutes. Please join us.

All right. Let's start. Welcome everyone, and thanks very much for joining us. My name is Gabrielle Guillemin and I will be moderating the session. We are very fortunate today because we are joined by a panel of brilliant speakers and a brilliant audience, who will share their insights on this very important topic. I will be introducing them as we go around the table.

Encryption and anonymity allow individuals to express themselves, and they are especially important in those countries where Freedom of Expression is heavily censored, allowing users to join in all sorts of discussions that they might otherwise avoid. Now, in an online world where we all leave a digital trail and mass surveillance is rampant, encryption is essential to ensuring the security of information, the integrity of communications and the right to privacy online, and it is a vital tool for the protection of Freedom of Expression on the Internet against censorship. However, what we see in countries such as Iran is monitored name registration. And if we look West we see, for instance, at the beginning of this year the UK Prime Minister David Cameron talking about encryption, asking whether the Government was going to allow a means of communication where it is simply not possible to listen in to individuals' communications, and his answer was no.

So the protection of encrypted and anonymous speech is under threat. The UN Special Rapporteur on Freedom of Expression, David Kaye, who we are very lucky to have with us now, recognized this threat early on and presented a report to the Human Rights Council in June 2015 which highlighted the need for greater protection for encryption and anonymity. David will tell us more about his report in a minute.

As we go around the table to get our speakers to speak for two minutes, I would like everyone around the table and in the audience to think about how we move forward in the face of a legal environment which often undermines our ability to protect our rights, both privacy and Freedom of Expression. What future advocacy do we need to think about to protect our rights.

So now I'm going to turn to David and ask him if he can tell us more about his report to the UN Human Rights Council, how it was received. And David, what do you think the challenges are?

>> DAVID KAYE: Great. Thanks, Gabrielle. I should thank you and Article 19 for all the support, pulling people together for the project and providing input, and just generally for our collaboration. I wanted to say a couple of things. I actually don't want to speak for very long, although when I say that I end up speaking for too long, so I should be careful. I am actually interested in hearing as much as possible from those on the panel and from those of you who are really seeking to implement encryption, because I'm curious about the kinds of challenges that you face. Obviously we did a report that focuses on the legal level and particularly on the normative level of how Article 19 of the Universal Declaration of Human Rights and the International Covenant on Civil and Political Rights protects the space we need in order to exercise Freedom of Opinion and Freedom of Expression.

I want to say something briefly first about the process, and then maybe a couple of take‑aways that I would highlight. I'm not assuming that everyone has read the report, but I just want to pull out a couple of elements from it that I think are things that, either as advocates or scholars or legislators, it might be worth thinking about in terms of moving forward normatively. And then I will say a couple of words about the Human Rights Council Member States' response.

The first point is about process, and I want to mention process because the report would really not have happened if we didn't have the input from Civil Society and from Governments, but really especially from Civil Society. So as we try to do with all of our reports to the Human Rights Council and to the General Assembly, we did a public call for submissions. We received somewhere around 30 Civil Society submissions. They are all on the OHCHR website, or I have them on a page at my University; I would recommend that you go to the latter because it is https. Because OHCHR, their site is not https, which is kind of hard to believe when you think about the kinds of people who really need access to information about the Human Rights mechanisms in the UN. That's my little ‑‑ it shouldn't be a pet peeve. It is a real peeve.

So what I want to encourage is one for people to go look at those Civil Society inputs. They are really excellent. And they go deeper in to certain issues and to normative trends than I was able to do in the space of a report on encryption and anonymity. The other thing is the Government's submissions are pretty interesting as well. And there are about 20 of them and I think, you know, for those ‑‑ you can look and if there is a country of interest to you I think it is worth taking a look at those. So the process was important. And for those thinking about engaging in the kinds of reporting that we do going forward I really want to encourage you to submit your comments.

So just really a couple of take‑aways that I want to highlight, because I think they are areas where there could be more normative development. There is a lot of room for normative development in at least three spaces. One is generally the acceptance of encryption and what encryption means as a matter of Freedom of Expression around the world. I think that we have seen over the last six to nine months, at least in certain capitals, and I would emphasize there the United States, that the debate has moved in a promising direction. And I think this has been in large measure due to the engagement of Civil Society, NGOs, and of the technical community. The report that the technical community came out with in June or July, led by Danny Weitzner of MIT, which highlighted the importance of encryption to the economy and to other values, was important. So one sort of take‑away/implementation would be this: to the extent that when you go back to your home countries, or to the international or Intergovernmental spaces that you work in, you can really bring attention from the technology side to the value and the uses of encryption, I think that is really quite important.

The second issue is on the issue of anonymity generally. I said this yesterday in a panel, and I don't want to belabor it here, but I think that anonymity is under threats and subject to doubts that encryption isn't. Both are subject to challenges, but I think that anonymity to a certain extent faces a problem of being understood as, in a way, its own kind of threat to Freedom of Expression. Anonymity can be used as a shield for harassers. And there seems to be a sense that if you can't stand up with your opinion under your own name, then somehow your opinion is worth something less. So to the extent that we can highlight the ways in which anonymity is valuable, while recognizing that there can be threats that it poses as well, I think that's important.

The third take‑away is the issue of the right to maintain an opinion. This is an area of Human Rights law that is fairly underdeveloped. There is very little jurisprudence around it, and this might be a subject for discussion. I do think there are ways in which maintaining and holding opinions in a digital age can be a little bit different than in an analog or a physical space, and I think encryption plays a role in protecting our opinions insofar as we keep our ‑‑ whether it is journals or just our search histories or whatever it might be. So there is room for some normative development and work, either at the legislative level or, probably more so, at the litigation or Human Rights mechanism level.

What I would really encourage is for people to try to find a way to further develop the issue of maintaining an opinion in a digital age.

So the last thing I would say, very quickly: I presented to the Human Rights Council in June. Some of you were there, I know. And the reactions ‑‑ first of all, thanks to Civil Society for standing up and supporting the report on both encryption and anonymity grounds. The interactive dialogue was very interesting because you could divide up Member States into three different categories. On the one hand you have those strongly opposed to encryption and very vocal about it, some saying that encryption and anonymity, especially encryption, is an economic issue and has nothing to do with Freedom of Expression. That is one camp. I can only name names on the strong proponents' side. There are the strong proponents, those who raised their voices, including Germany and Brazil and many others. They were particularly supportive of the idea of protecting encryption and anonymity, although the footnote is that I wouldn't say Brazil is supportive on the anonymity front. There is very strong support among a core group.

Then there is a middle group: states that really understand the importance of encryption, and even support it as a matter of foreign policy, supporting activities worldwide. There are actually several countries that support encryption and anonymity tools but that are under some kind of domestic ‑‑ I wouldn't say threat, but a challenge, to the extent that arguments around law enforcement and intelligence take hold, as in the U.S. debates.

This would be my closing point: engage in those kinds of debates to ensure that the default around these tools is that they are available to individuals, to Civil Society Organizations, to corporate actors and to others, and that the Government really should be the actor that has to bear the burden of proving the necessity of a restriction in any particular circumstance, and that the restriction actually advances a legitimate state interest. To the extent that you can do that in your own work, whether it is research or advocacy or whatnot, I think it would be quite valuable in making a report that is designed for Geneva intelligible and translatable at domestic levels.

>> GABRIELLE GUILLEMIN: Thanks very much, David. Now we have Juan Diego Castaneda who is going to share with us his insights from Colombia on the issue of encryption and anonymity.

>> JUAN DIEGO CASTANEDA: Thank you very much. We have laws on encryption in Colombia, but they were not well known; we only recently discovered this provision, which dates from 1993. It is about securing communications in a framework of national security, given the armed groups and security problems in Colombia. The provision says that users of communications devices that use the electromagnetic spectrum may not send encrypted messages or messages in unintelligible language. So that's what we have there.

But the thing about this law is that it was never debated; it has been renewed every four years or so, and nobody has had the debate about the importance of these kinds of laws or about encryption. We asked the authorities about these laws ‑‑ the Ministry of ICT, the police and the Ministry of Defense. They are not actively trying to forbid encryption. The Government isn't actively trying to promote encryption for the use of citizens either, but at least they say they are not targeting people ‑‑ they are not trying to stop people from using encryption. So we have some context for when encryption may be used by people. But the main problem now, given that we don't have any major legal barriers to the use of encryption, is that people may not be so aware of the importance of encryption. So it is different from what we've seen in other countries, where people first have to win the right to encrypt their devices and their communications. We talk a lot with journalists, Human Rights defenders and other people, and they assume they are being watched and they don't care too much about that.

So our biggest barrier is to get people to actively use encryption and to fight for that, in a way that, even though it is not forbidden, they make it a routine to use encryption. That's the biggest barrier to having people use it. And in Colombia we have also had big scandals around security and hacking. And I ask, how secure or how useful is encryption when people use just a few platforms like Facebook or WhatsApp? Those are kind of the main uses of the Internet in Colombia.

So what happens when people just use some platforms and they don't actively use other platforms? And they also assume they are being watched, and they assume it is not too bad to be watched. So that's our main fight: to raise people's consciousness about the security of communications, to try to unearth these laws that go unnoticed in the public debate, and to try to make it clear that there shouldn't be any restriction on encryption. So I think that's what we work on. And, well, that's it, I think.

>> GABRIELLE GUILLEMIN: Thanks very much, Juan Diego. You said that the law had not been discussed in Colombia. So I would be curious to hear from Chris Marsden, who is a professor at Sussex University, what he thinks of the debate in the United Kingdom now that the Investigatory Powers Bill has been published. Chris.

>> CHRIS MARSDEN: So, I promised ‑‑ by the way, I was expecting people to start throwing things at me. I am only telling you what is going on in the UK; I am not responsible. I promised I would divide this up into the good, the bad and the ugly.

Let's start with the good. This will be the shortest part. We are having the debate in the UK. The current Investigatory Powers Bill is going in front of Joint Parliamentary Committee scrutiny. It is the response to a previous attempt to introduce a piece of legislation under the last Government, which was actually vetoed by the junior coalition partner. Since this May we have had a majority Conservative Government, and the bill has now been reintroduced in a different form. You can see what the junior partner thought of it if you look up Nick Clegg online. We are having a debate, and that's a very good thing. And it is a 300‑page bill, with lots of explanatory memoranda as well. It is almost Patriot Act length, someone was commenting to me earlier; think of it as the UK Patriot Act, 14 years later. I should mention the other element, which I think is very important: the joint scrutiny committee, which considers the draft bill before it is actually introduced as a bill into Parliament, unfortunately doesn't seem to have taken advantage of the expertise that was available from the scrutiny committee that considered the previous, failed bill from three years ago. There are no members of that previous Committee on the new Committee, which is, to say the least, a shame.

And, for instance, the Intelligence and Security Committee of Parliament, which is now chaired by the former attorney general of the country, is actually conducting its own shadow scrutiny investigation. So we are shining a light into dark corners. That's the good.

The bad ‑‑ and I could go through a very long list, but we've only got five minutes, so I should keep it relatively short. There is no effective judicial review in the way that people would think of judicial review in the rest of the world. As things stand, judges will have the ability to examine warrants for their reasonableness, but not to make a factual check on what the warrant contains. That's not full judicial review as it were; it is maybe more a problem of judicial oversight. But I think that's probably the major bad. There were many others. One of them, as you may well know and as the chair stated in the introduction, is that the Prime Minister has said he doesn't want there to be end‑to‑end encryption which doesn't have a back door for the security agencies. There are problems with that for the British economy: the UK has a very strong IT industry, and if you start interfering, as it were, with strong encryption products, there are a lot of companies that will be upset by what is still a draft power under scrutiny that may or may not be introduced.

Tim Cook of Apple has been outraged at what has been suggested on encryption. There will be problems for cloud providers. And it has been suggested that this may be a major issue for, as it were, UK PLC ‑‑ the UK economy as well. I realize this is a rights‑based discussion we are having, but Governments respond very well to that kind of thing.

I have one minute, so let me quickly say: you know the new James Bond film has come out. On the bad side, I should tell you that in Britain we think the security agencies are a combination of Enigma code breakers and Austin Powers, if not James Bond. And there has been quite a substantial publicity push ‑‑ I repeat that, a publicity push ‑‑ by the security agencies around the film and the publication of the draft bill. Do not expect the British public, as we always call them the great British public, to push back hard against this bill. Activists are outraged and members of Parliament are outraged. Sys admins are outraged, even at a local level. But do not expect the general public to be outraged. They like James Bond and they think he is a tremendous fellow.

On the ugly: the data retention cost for a year will be extremely high. There was a meeting of a Committee of Parliament on Tuesday. The Government predicted the data retention cost would be somewhere in the order of $300 million ‑‑ using the universal currency; we have a currency of our own, but no one talks about it any more. The actual cost is exponentially higher, at least according to the ISPs. And this will affect the one thing that the British people do care about: their broadband connections ‑‑ the price. That is the one possibility of there being a more general outcry about the bill. The other thing to say is that the data retention element of the bill will only retain metadata and not content. Two issues with that. Metadata is content: if you have access to all of someone's metadata, you can make a very, very good approximation of what they are doing. Content, of course, is relatively useless to the security services unless you are very targeted, because it is extremely expensive to analyze. But the other part is that the ISPs think it is extremely difficult, on a limited budget, to separate metadata from content. That will be an interesting challenge if it were to become law.

For those in this audience looking at the primary materials, note the gagging clauses. One is a kind of Snowdenian gagging clause and the other two are blanket clauses: anything the intelligence agencies do that is not absolutely outrageous will be legal.

And one final thing to say: the UK Government will have to declare that this is consistent with both the European Convention on Human Rights and its obligations as a member of the European Union, under which the Convention is incorporated into European Union law. But as to whether we will be members of either or both by the time this bill comes into effect ‑‑ watch this space.

>> GABRIELLE GUILLEMIN: Thanks very much, Chris. We have Elvana Thaci from the Council of Europe. What we are wondering is, when you have Member States such as the United Kingdom looking to adopt the kind of legislation that Chris was talking about, what sort of work can the Council of Europe do in the face of that?

>> ELVANA THACI: Thank you. I am not going to express any position on the UK legislation. And I'm going to approach your question from the perspective of what we in the Council of Europe have done on the question of anonymity from a Freedom of Expression perspective.

And I will try to present the positions of two bodies of the Council of Europe: first, the European Court of Human Rights, and second, the Committee of Ministers. These are organs of the Council of Europe that may have different positions and views on anonymity. The European Court of Human Rights has recognized in many judgments, Internet‑related judgments, the value of anonymity for Freedom of Expression. The court has said that anonymity helps individuals to avoid reprisal and unwanted attention and that it is capable of promoting the free flow of ideas. At the same time, the court says that this value is not absolute and needs to be balanced against other legitimate interests of society.

And I'm going to focus on two key cases of the European Court of Human Rights where the court dealt directly and indirectly with the question of anonymity. First, K.U. versus Finland. This concerned a malicious representation of a sexual nature against a minor. The court confirmed and explicitly stated in its judgment that Freedom of Expression and confidentiality of communications must be respected. However, these guarantees are not absolute and must yield to other legitimate interests. Most interestingly, the court found in this case that the regulatory framework of the respondent state ‑‑ Finland in that case ‑‑ had not provided for the possibility of ordering Internet service providers to divulge the subscriber information that was necessary for bringing the criminal offender to justice.

The question of the legal possibility for public authorities to get subscriber information from Internet service providers is the subject of two cases pending before the court: Benedik versus Slovenia and Ringler versus Austria. We will wait for the court's judgments in those two cases.

Another case is the recent judgment in Delfi versus Estonia. The court here was called to examine a complaint concerning the liability of a company, an Internet news portal, for offensive comments posted by its users on its website. The court held that the commercial operator of the news portal was liable for these comments, whether from identified users or from anonymous users, in view of the unlawful nature of the comments that were posted. The court clearly stated in this case that the duty of the Internet portal was not to prevent the posting of these comments; there was no obligation on the news portal to prevent them, but there was an obligation to remove the comments, which in that case the court found the Internet portal had not done quickly enough once it had taken notice of the unlawfulness of the comments. So this choice that the court made between an obligation to prevent and an obligation to remove suggests that the court may in the future not favor an intermediary liability regime which prevents anonymous speech.

The principle of anonymity has also been upheld by the Committee of Ministers of the Council of Europe in two documents. The first is the 2003 Declaration on freedom of communication on the Internet. There again we have the same line: anonymity is very important for Freedom of Expression; however, it can be lifted by law enforcement authorities in the context of their work, and in those cases there need to be safeguards in place for lifting anonymity. We find the same approach in another document of the Committee of Ministers, the 2014 recommendation on the Guide to human rights for Internet users, which informs Internet users that anyone can choose to be anonymous on the Internet by using, for example ‑‑ but users should be informed that this anonymity can be lifted should this be necessary for law enforcement purposes.

So the perspective I would like to throw into this discussion is this balanced approach that the European Court of Human Rights and the Committee of Ministers have taken: guaranteeing anonymity as a value for Freedom of Expression, but having due regard to other fundamental rights and freedoms. Thank you. I'm sorry, I will have to leave very soon to go to another event. Thanks.

>> GABRIELLE GUILLEMIN: Thanks very much, Elvana. Now we move from Europe to Asia ‑‑ South Asia, more specifically. We have Pranesh Prakash. There was a recent attempt at a national encryption policy in India. What can you tell us about that?

>> PRANESH PRAKASH: Thanks. For now I will focus on the dry legal details about this policy and about the laws relating to encryption and anonymity in India, and I will elaborate on my own personal reflections during Q and A. Before getting to the national encryption policy ‑‑ which was a fiasco; there is no better word to describe what happened recently ‑‑ I will briefly lay out the laws that actually govern encryption in India. First, there is the Information Technology Act, which does assert extraterritorial applicability. Section 69 provides the Government the power to force citizens and intermediaries to decrypt information, with a disproportionate jail term of up to seven years for not complying. I have argued in the past that, given that it doesn't have any reservations, it goes against our right against self‑(inaudible). More specifically for the encryption policy, there is Section 84A, which provides the Government the right to regulate modes and methods of encryption, and it applies to all encryption, I quote, "for secure use of the electronic medium and for promotion of e‑governance and e‑commerce". I will repeat that: "for the secure use of the electronic medium and promotion of e‑governance and e‑commerce". Now, rules were drafted under this alongside the national encryption policy, and I will get to that in a bit.

There are a couple of provisions outside of the IT Act, in the Indian Penal Code, where different kinds of concealing of designs to commit certain kinds of offenses are actually made criminal acts in themselves. Apart from this, the Reserve Bank of India has guidelines relating to information security which mandate the usage of encryption for online banking.

And lastly, there are a number of different kinds of telecom licenses which prohibit bulk encryption. These are the licenses under which the telecom operators act, so in that sense they aren't general laws, but because these operators provide communications services to everyone else, they impact everyone. And these prohibit bulk encryption. Now, what's funny, what's interesting, is that while prohibiting bulk encryption, the license also states that the licensees, the telecom operators, shall have the responsibility to ensure protection of privacy of communication and to ensure that unauthorized interception of messages does not take place. So at the same time there is the need to protect privacy and confidentiality, but the most widely used method of doing so, which is traffic‑level bulk encryption, may not be used. How exactly telecom operators are to do this is left up to them, and I don't know how they would.

Coming ‑‑ moving on now to the actual draft national encryption policy. I found out about it one morning, and as I was reading through it, for the first one and a half pages it seemed great. It seemed like the Government wanted to put in place a very progressive policy that would encourage the use of encryption within the Government. And this, I may remind you, is a Government where, if you check whether the server behind the national security advisor's e‑mail address provides for encryption ‑‑ it doesn't. If you send an e‑mail to the national security advisor, there is no traffic‑level encryption, because their server doesn't support it. I thought this policy would change all of that, except it turns out the Government had other ideas in mind. In fact, the policy also ended up restricting things ‑‑ or attempted to, since the policy is no longer in play; it was a draft policy and the draft has been withdrawn because it was so horrendously bad. It restricted what algorithms and key sizes may be used in India, namely AES, 3DES and RC4. And RC4, let me remind those amongst you who follow this, should not be recommended at all; yet it was one of the three algorithms the policy at that time would have allowed. There were registration requirements for encryption products and services: the Government would notify the list of registered encryption products, and only those could be used. So only registered encryption products and services could be used. It had restrictions on what kinds of encryption may be exported out of India, as well as ‑‑ the bit that proved the most controversial in the media, at least in India ‑‑ a requirement for all business‑to‑business, business‑to‑consumer or citizen‑to‑citizen communications that the plain text of any encrypted information must be stored for 90 days. Okay?
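As an aside on why RC4 should not be recommended: RC4 is a tiny stream cipher, which is part of why it lingered in standards long after serious keystream biases were found. The sketch below is an editorial illustration (not from the Indian policy text) showing the entire cipher, checked against a widely published test vector; it is broken and must not be used in real systems.

```python
def rc4(key: bytes, data: bytes) -> bytes:
    """Encrypt/decrypt with RC4 (symmetric). Illustration only: RC4 is broken."""
    # Key-scheduling algorithm (KSA): permute the state array S under the key.
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # Pseudo-random generation algorithm (PRGA): XOR the keystream into the data.
    out = bytearray()
    i = j = 0
    for byte in data:
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(byte ^ S[(S[i] + S[j]) % 256])
    return bytes(out)

# Widely published test vector: RC4 with key "Key" applied to "Plaintext".
print(rc4(b"Key", b"Plaintext").hex())  # bbf316e8d940af0ad3
```

The simplicity is the point: the first bytes of the keystream are statistically biased, which is why RC4 was eventually prohibited in TLS entirely (RFC 7465), making its inclusion in a 2015 draft policy striking.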

So it essentially obviates the need ‑‑ not the need, it obviates any benefit you get from encrypting data. And it is unclear whether this would not also apply to passwords ‑‑ to everything. Essentially, for anything that is encrypted information, the plain text must have been stored for 90 days. And if it had been in place and been followed by Ashley Madison, well, you would have everyone's passwords as well, apart from having details about them. So I will end it here for now, and I will speak more about the effects of this later.

>> GABRIELLE GUILLEMIN: Thanks very much. Now we have with us Ted from Google. And what's interesting here is that recently we have heard from various Governments, from the U.S. to the UK, that one thing that was very worrying was that the Internet was going dark because of the use of encryption. For instance, the head of MI5 in the UK, Andrew Parker, said that dealing with the net going dark required the cooperation of the companies who run and provide the services over the Internet that we all use.

So what I would like to ask you is: what's Google's response to that? Have you designed products so that you effectively are not in a position to comply with law enforcement or other requests for encryption keys, and if not, why not?

>> TED HARDIE: So before I start let me just say you people have depressed me utterly. I cannot express how much happier I was when I came in to the meeting than I am now. And if the Internet is going dark, I would say Google's response amounts to, sorry, selling blackout curtains and tacking them up as fast as we possibly can. We are strong believers in the need for strong encryption for all of our users, both for data in transport and data at rest.

So I am going to try and pitch this at a fairly high level. If you would like technical details on this, find me. For data in transport, this is one of the things underlying the https protocol, which is the TLS‑protected version of the http protocol. Google has enabled that on all of its services, and it is the default on all of our services. That was a deliberate decision, and it came after a great deal of discussion about whether it was the right one. We believe we are in the right place now. It is the tip of the iceberg, though, because in addition to that Google has been pushing a couple of different things that increase the extent to which this is the default. One, we took a protocol that was developed internally at Google called SPDY, which was encrypted by default, and we made it the basis of HTTP/2. The standard does, in fact, have an optional part that allows you to have an unencrypted version of it. We do not implement that part of the standard.

We have chosen not to have Chrome, or any other application of ours that uses HTTP/2, offer any unencrypted version. We believe that protects both the people who need it and, by creating a default of encryption, helps make sure that the use of encryption cannot be used to target individuals.

Beyond that we have a project called QUIC, which is developing a new transport protocol for the Internet, also encrypted by default. And the mechanisms we developed in it for zero round‑trip time, for very quick setup of encryption, we are contributing to the IETF TLS Working Group, so they will be used in the next version of TLS, the 1.3 standard. For data at rest: on your Marshmallow, I hate our product names sometimes, your Android Smartphone running 6.0 or Marshmallow, everything will be encrypted by default. You would have to work very hard to turn it off. There is an exception if you have upgraded to Marshmallow from a phone that is not capable of doing that, but in general our Android compatibility requirements require anything from Marshmallow onward to be encrypted. In our data centers, data will be encrypted between the time it comes across the network and the time it is written to disk. There is a wonderful two‑pager on Google Cloud that describes the security processes, and it gives good insight into Google in general, because most of what we offer in Google Cloud is, in fact, just taking what we are already using inside Google and making it available to everyone.
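The "encrypted by default, no unencrypted fallback" posture described above can be mirrored on the client side. A small sketch using Python's standard ssl module (my own illustration, not Google code) shows what a TLS-only client configuration looks like:

```python
import ssl

# A context built this way has no plaintext mode at all: certificates
# are verified against the system trust store, hostnames are checked,
# and legacy protocol versions are refused outright.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # rule out SSLv3/TLS 1.0/1.1

assert ctx.check_hostname                     # server identity is verified
assert ctx.verify_mode == ssl.CERT_REQUIRED   # unauthenticated peers rejected

# Any socket wrapped through this context either yields an encrypted,
# authenticated channel or fails loudly; there is no unencrypted fallback.
print("TLS-only client context ready")
```

The design point is the same one Ted makes: when the encrypted path is the only path the software implements, a downgrade to plaintext is simply not negotiable.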

The last thing I will talk about is actually not so much a technology but a policy, and that was the real names policy, which as many of you know was a mistake. Google has apologized for it, so I am here to apologize one more time. We have backed off from the real names policy, both in terms of Google+ itself and in terms of services linking to Google+, which had a real name requirement. Yesterday we also announced something called About Me. So if you have a Google account you can go to it and it will help you work out what you are sharing with other people based on what Google knows about you. Hey, you told us your gender; do you want us to let other people know your gender, yes or no? It even lets you know whether people can see your birth date through their calendar. Someone pointed out yesterday the difficulties of creating a pseudonymous account with Google. We are very committed to making sure our users stay safe.

>> GABRIELLE GUILLEMIN: Thanks very much. This was Ted Hardie from Google. Now we have Alexandrine Pirlot de Corbion from Privacy International. Privacy International has been deeply involved in all the debates on anonymity and encryption. So, Alexandrine Pirlot de Corbion, what does PI think of the way forward? What should we do in the face of all the challenges that we face?

>> ALEXANDRINE PIRLOT DE CORBION: I am going to try and keep it short so that we can have enough time for questions. But there are three main points I wanted to raise, to wrap up and reflect on what's already been shared by the other panelists. One thing we have tried to do at PI, in the work we are doing with our global partners, is to take a multi‑disciplinary approach: to look at the legal and policy side and also to look at the tech side of this debate.

So on the legal and policy side, in the discussions around anonymity and encryption, I think it is important to re‑emphasize that encryption is only one layer of anonymity, of an individual's ability to engage anonymously online. There are various other policies in place that need to be addressed in order to ensure that people can continue to engage online anonymously: SIM card registration, which is creeping up as part of counter‑terrorism measures and state security bills across Europe and beyond; real name policies of different service providers; and the banning of tools such as VPNs. All of these things contribute; they are different layers of an individual's ability to engage anonymously online. So we are trying to move forward by considering all of those aspects, because they are all pieces of the puzzle that come together.

On encryption as well, because we haven't talked about this and it is really important for users to know how it links to anonymity: encryption only protects the content. It doesn't protect the metadata, which is really your identity, who you are and the identity of the people you are communicating with.

On the second level, more on the tech aspects, I think moving forward it is really important to understand the implications of what Governments are requesting. To use an analogy, breaking it down in the simplest terms because I am not a techie either, the way I see it is: the Government is asking me for a key to my door so they can come in whenever they want, for whatever reason they want. But there are two implications that emerge from that. One, the idea that they would be the only ones, that one Government alone would have that power, doesn't reflect reality. And the second is that they may also leave the door open, or leave it unlocked. What I am referring to here are the vulnerabilities created so that they can have access to your communications and data. What does that mean about who else could abuse and maliciously use those vulnerabilities?

The last point I wanted to make is about the discourse in which these discussions are taking place. We only hear the negative side about encryption and anonymity, but we forget all the good things about them. They allow a vibrant civil society, increasingly so in repressive countries, and they also allow the media to operate more freely and to protect their sources, ensuring that information that should be in the public interest reaches the public, for users and citizens to know about. And the reason one would choose to use encryption and, you know, remain anonymous online is not only about having something to hide. It is just good sense, best practice, to protect what is yours from others, and it reflects what we have in our everyday life. Why do we lock our doors when we leave in the morning? We have to break the idea that it is only criminals that use these tools. They allow the majority of the population to engage securely and freely online, just as the Internet was intended to be.

>> GABRIELLE GUILLEMIN: Thanks very much. So we now have just over half an hour for a discussion, and I'm going to open up the floor for questions that you can throw at our panelists. You have mics at the front. So if you want to ask a question, please come to the mic and state your name and organization.

Thank you.

Please. So we have one question here and then we will take one here at the table and one over there. And I will take those three to begin with. Thank you.

>> GUSTAVO PAIVA: Good morning. My name is Gustavo Paiva from the Federal University, representing a group of students interested in the Internet. Yesterday we had some discussions about anonymity, an initial discussion about cryptography and also the legal aspect of anonymity in Brazil. Since then some of us have talked, and we are considering creating a working group to better study the legal aspect of anonymity in Brazil. This is an open invitation: anyone who wants to study this, I will be around here after the session so we can make this group work. Thank you.

>> GABRIELLE GUILLEMIN: Thank you. So we have had a question here.

>> ERIC JARDINE: My name is Eric Jardine. Just before I lay out my comment, I want to note that I do use Tor for about 50% of my Internet traffic, because I like to add my signal to everyone else's to try to provide some cover for everyone. I would like to get the panel's perspective not on the use of Tor by users as they try to surf the Web and access content anonymously, but on the use of Tor hidden services by people who want to host services. When you look at where the traffic goes on those sites, you find that in terms of the volume of sites it is usually drugs or illegal markets, and in terms of traffic it tends to cluster around child abuse sites. So maybe there is room for gray here: maybe you can have Tor or something like it so individual users can circumvent censorship and all the rest, but sites whose servers you can't track down might be a bridge too far. I would just like to get the panel's opinion on that. Thank you.

>> GABRIELLE GUILLEMIN: Next question here and then I will get our panel to answer.

>> My name is Moran and I'm an Internet Society Ambassador and Google Policy Fellow at Strathmore University in Nairobi. For people in repressive regimes there are simple encryption tools available, but we find that most people are not very conversant with these techniques, and probably we need more advocacy on the use of tools like PGP and SMS encryption. Technical interventions like the ones Google is doing are very important precisely because the end user is often not very aware of how to use these encryption tools. My other point has partly been covered: I would like to highlight the attempt by the EFF, who have launched a certificate authority to encrypt the entire Web. This would be a game changer, and https will become the new normal if the EFF's attempt goes through. Thank you.

>> GABRIELLE GUILLEMIN: Thank you. So now I am going to turn to our panelists first. If they want to respond to some of the comments and the question I was asked about the use of services on Tor.

>> TED HARDIE: Thank you for that comment. The Let's Encrypt folks are good friends of mine, and they have actually also contributed their work to the IETF to create a standard called ACME, to share it even more broadly. I also agree with you that what they are trying to do is to make it simple enough to use encryption that it becomes the new normal. I think that's a very important technique both at the server side, which is what Let's Encrypt and ACME address, and at client systems. You mentioned the difficulty of using PGP and SMS encryption. I think that over the next five years or so what we are going to try and do, and we meaning the technical community and not just Google, is to make sure that systems like that are sufficiently common, and so close to the default, that they have to become easy to use. In essence, the default drives the simplicity.

To go to Eric's point, I also want to note that I really appreciate what you were saying about using your Tor traffic as part of the noise that hides the signal for those people who actually need the protection of Tor. I think that's another part of what we are looking at here. Every person who is using encryption to look up their bowling league, whether they know it or not, is protecting the person who is using encryption to look up resources at a battered women's shelter, because they are increasing the overall set of encrypted traffic which somebody interested in decrypting the traffic must look at. There are a bunch of techniques we are using, such as forward secrecy in TLS, that make it harder as well. But the societal norm that encryption is okay and used for normal things is what's going to actually drive change. And I believe that the more each of us can use and advocate encryption for everyday tasks, the better off we are going to be when it needs to be used for the lifesaving tasks that Human Rights defenders and journalists engage in.

>> CHRIS MARSDEN: Yeah. Of course, there is always this argument that there must never be a single place in the world where there can be a secret conversation, and that is a scary argument when you think about the implications of it. We've always accepted a degree of risk in exchange for our privacy. The rich have always been much keener on guarding their privacy than the poor; encryption democratizes privacy to a degree that we haven't seen previously. And it is a strawman argument that Bitcoin is a terrible thing because it enables people to buy and sell drugs. Good old‑fashioned policing could address those problems. If you use Tor, sometimes you are a person of interest to the security services.

>> PRANESH PRAKESH: Okay. I will just address the Tor hidden services question. One point is that slowly, general services are also getting on to Tor hidden services. So yes, the initial flood of things available as hidden services has generally been things that in some place or other would be illegitimate or illegal, sure. But general communication services that I use, many of the Jabber services that I connect to, are also being offered over Tor. That is one point.

Secondly, I disagree with Chris, because I do think there is a concern where, by default, at multiple layers, you are adding encryption and anonymity. Once that becomes the default, when it is no longer something that only a small set of people are doing, which we can live with, then that may cause trouble for regular law enforcement. Currently it doesn't, because currently there are enough digital breadcrumbs; Tor hidden services have been busted many times. So currently it isn't significant enough, in my opinion, to warrant any legal or Governmental action. But in the future it may be.

>> GABRIELLE GUILLEMIN: Now we have David.

>> DAVID KAYE: I will say something really quick in relation to your question, and it is not a technical point at all. It is the point that law applies in this space just as it applies in any other space. I think it is important, because we are sort of cheerleading for encryption and anonymity, to recognize there is a flipside: there are legitimate law enforcement and intelligence activities out there. On the law enforcement side you can identify some of them, and some of the hidden services are criminal services; I don't know to what extent. But the point is that where Government law enforcement agencies want to deal with those kinds of problems, the same rules related to necessity and proportionality, to legal process and so forth, need to be a part of that identification and enforcement process. It can't just be some kind of hidden approach to hidden services.

>> GABRIELLE GUILLEMIN: Thanks, David. So now I am going to take five questions from the floor and get back to the panel. So on this side we have the gentleman here. One, two, three, four. We'll take those four for now.

>> Excuse me. Hi. I am Frederic from the Research Center for Internet Innovation. I'd like to hear from you all regarding the actual efficiency of Governments in enforcing encryption laws, prohibitions or limitations, technically and legally, compared, for example, to how they enforce rules against file sharing. So basically, how efficiently can they actually enforce bans and provisions on encryption? How successful have they been so far? Thank you.

>> Hi. I am Hasan from Access. I am part of the digital security helpline at Access; we provide digital security support to help people encrypt themselves and their data. I'm from Tunisia, and I have a question for Mr. Ted Hardie from Google. Last month, in October, Google released a research report saying that 96% of TLS connections are downgraded. I want to ask Mr. Hardie: what is Google's responsibility in this downgrade, and what is the Tunisian Government's responsibility?

>> GABRIELLE GUILLEMIN: Thank you. There was one question over there, and actually I think we have a remote question as well. Can you please come forward?

>> Hello. I'm Kim from Local Web, which is Brazil's biggest hosting company, and a free software activist here in Brazil. I just want to state that the Brazilian Government, including our President, and Petrobras, which is our state oil company, have been spied on by the NSA, and that happened because they didn't use encryption. Of course, even if they had used it they could have been spied on, but it would have been a lot more difficult to get that data. And I don't see why citizens and companies shouldn't be allowed to use the same encryption that is available to the Government. So I see that encryption and anonymity can be put to bad uses, but I believe that is a price we should pay for something that is good. In the same way, freedom of speech can be used for saying stupid things, and a knife can be used both to kill someone and to share some food, but we are not going to forbid the use of knives or of Freedom of Speech. So I believe that Governments and law enforcement institutions must find their own ways to overcome those problems, not forbid people and companies from using these tools. That's my point.

>> COURTNEY RADSCH: Thank you. My name is Courtney Radsch. I agree with David. Everyone here seems very pro, and there are many reasons why journalists need encryption and anonymity, in terms of source protection. But there is the flipside: we see a lot of violence against women journalists, and more broadly there is a whole Working Group on violence against women online. How do you address those issues, or child pornography? How do you balance? I heard mention of law enforcement and the judiciary, but that can be very expensive, and we have heard a lot about the lack of capacity of law enforcement to understand the dimensions of gender violence online. Are there any other solutions?

>> GABRIELLE GUILLEMIN: Now the questions from Twitter. If you could come to the mic, if there is one.

>> REMOTE MODERATOR: Thank you for the floor. We have two questions here in the remote participation hub. The first one is from a participant from Benin who is now in France and works in system security. He says the issues of freedom and anonymity on the Internet have become very complicated with the evolution of the Internet, its protocols and technologies. It is true that encryption and anonymity are essential to the exercise of the right to privacy and free expression online. But today, to prevent abuse, what should be the limit of anonymity, so that people can still be identified in cases of abuse or violation online? And what information relating to privacy should be encrypted?

The second one is from Prashan from Delhi, India. He says: informed consent, rather than structural designs such as default encryption, should be behind state interventions and the actions of private players. Thank you.

>> GABRIELLE GUILLEMIN: Thank you. And actually I am just going to add one more question, because we have talked about all the things that can be done by making encryption the default, which is the technological side. But I would like to ask our panelists about the law. What kind of norms should we be pushing for? When we think about the Investigatory Powers Bill, what do we do? What is the norm that we are pushing for, and through what sort of actions do we convince Governments of the benefits of encryption? Back to the panel.

>> PRANESH PRAKESH: Can Governments successfully restrict encryption, and would those kinds of legal measures be effective? Well, the answer is: it depends. It is a very lawyerly answer, but it is the truth. Https? No, Governments can't do much about it; if they try, they break most of the Internet. And thank you very much to Google for really helping push that agenda. Thankfully those kinds of things can't be broken. But when you get down into it, Governments can say: break the VPNs, and they have often tried. A Government need not take action against everything that falls under a policy. It could take selective action, and that would be bad enough. That would break my usage of a VPN, which I use all the time, and that's not a good situation to be in. Don't pass laws that are in general unenforceable and which you will necessarily have to enforce selectively and arbitrarily. That's not good law making. Should I continue on the gender dimension?

>> GABRIELLE GUILLEMIN: Yes, if you can make some general comments in response.

>> PRANESH PRAKESH: Okay. On the gender dimension: if you are focused on accountability and legal action, then sure, some of these problems arise because of anonymity. But if instead the focus shifts slightly, to empowering women journalists, for instance, to ward off abusive messages and to prevent online harms, and many of these harms are online harms that don't follow through offline, at least in terms of physical actions, then anonymity might not actually be the problem, and encryption might not be the problem. Technology can help solve many of these things, and different social networks do provide tools embedded within them to take these kinds of actions. So that's something I would highlight.

And the question from India I didn't quite understand, to be frank, but I will note that I am not against encryption by default; that is not my position. I think it is necessary. We need pervasive encryption, because currently what we have is pervasive surveillance, and pervasive encryption is the only pragmatic way to ward that off. To elaborate on what I mean would take more time, and I would like to give others time to comment; if we have time at the end I will.

>> CHRIS MARSDEN: It is worth focusing on the question, and these are all parts of the same question, of where you assess the risk and benefit of using encryption and anonymity versus the danger of bad things, or very bad things, happening. So let me set your mind at rest when it comes to Governments: Government security agencies have smart guys, and there is no way you can fully protect yourself from them; they probably don't care about most of us, and bulk data retention is a little bit over the top. But when it comes to more commercial detection of what you are doing, then I think that's an area where encryption will be very useful. I also wanted to make another point, which is that when it comes to using strong encryption, particularly Tor, you need a fairly fast Internet connection. Do bear in mind that these tools are a little clunky to use for the most part, although obviously privacy by default via https is very helpful, but if you do want to use strong encryption you do need a fast connection.

One of the problems that happened in Iran at the time of the 2009 protests and election was that the Government simply slowed down the Internet for the Iranian population, so that these tools couldn't be used effectively. I am hoping that has improved now, and hopefully the other end of the panel has some thoughts on that.

>> DAVID KAYE: Two things. One, Courtney, in response to your question, and also taking off a little bit from Pranesh's point: I'm attracted to the idea of putting more tools in the hands of users and producers, those who are online, to protect themselves. But I don't think, either in principle or as a matter of politics, that that's going to be the right answer as a matter of legislation. Maybe not a question of it being the right answer, but it won't be a sufficient answer. So there clearly have to be rules. There are rules in the UK, for example, that are pretty clear, with guidance to prosecutors, so that there is both protection against harassment and also guidance to ensure that protection doesn't veer into the space of censorship or limiting Freedom of Expression. And I think we have acknowledged that on both sides: there need to be forms of protection against harassment, and that kind of protection should be rooted in Human Rights law, because the default is that expression is free and restrictions are subject to basic rules: that they are embedded in law, that they are necessary and proportionate to achieve legitimate objectives, and that the burden is on the Government to justify the restrictions. This isn't a very specific answer, but these aren't problems that are unsolvable. Harassment in physical space exists and there are tools for that, and we need to be defining them in virtual and online space as well.

>> ALEXANDRINE PIRLOT DE CORBION: David stole my line on the last point. There is an increasing temptation to think that there should be different rules online than there are offline. We have been in touch with various women's rights groups, and this is something they are trying to tackle, because as a movement they have benefitted so much from being able to engage anonymously on topics that they are working on, reproductive and sexual rights, that are not issues they can work on openly. The challenges they are facing as a movement, in terms of trolling online, abusive comments through private messages or more publicly, and the need to investigate those, are real. Recently someone from those groups pointed out that you have crime offline as well, and there was never a measure to ban gloves so that robbers wouldn't be able to hide their fingerprints. I kind of see the gloves as the encryption and anonymity element that we have online. So that was a response following on from David's point on the legal framework: there are principles in place that have to be respected, around proportionality, legitimacy and necessity, when restricting or limiting rights that are qualified.

>> TED HARDIE: So I wanted to take up the question the gentleman asked me in particular. I am looking at this report, because I actually hadn't seen it; it is part of the Google transparency report, which covers requests for user information from law enforcement or Government and talks about safer e‑mail and safer browsing. I looked at this and you are quite right: in Africa there are cases where inbound e‑mail encryption to Google drops to 0%, when the norm outbound from Google is 81% and inbound is 57%. That's clearly an outlier, because something is going on that's preventing it. We would have to talk a little offline about what we could do about that, but I am going to ask Barry Leiba, who is sitting at the end of the table, to raise his hand: as IETF Area Director for Applications and Real‑Time he has some ability to deal with the changes that need to happen in e‑mail. The issue is STARTTLS: it is the problem that you have when encryption isn't a default. You start in unencrypted mode and turn encryption on, and anybody in the path can stop you from reaching the encrypted state. In order to restore privacy we have to get encryption turned on by default, and if I had a drumbeat here, that would be the main cadence: try to make things the default.
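The STARTTLS downgrade described above can be sketched in a few lines. The helper below is hypothetical (not taken from any mail library); it shows how an attacker who strips the STARTTLS capability from the still-unencrypted EHLO reply silently forces a plaintext session, and how a require-TLS policy turns that same downgrade into a hard failure instead.

```python
def negotiate(server_capabilities, require_tls=False):
    """Decide which session mode a mail client ends up in.

    With opportunistic STARTTLS, the client only encrypts when the
    server advertises the capability, and that advertisement travels
    in the clear, so an active attacker can simply delete it.
    """
    if "STARTTLS" in server_capabilities:
        return "encrypted"
    if require_tls:
        raise ConnectionError("server did not offer STARTTLS; aborting")
    return "plaintext"  # the silent downgrade

honest_reply = ["PIPELINING", "STARTTLS", "8BITMIME"]
tampered_reply = [c for c in honest_reply if c != "STARTTLS"]  # MITM strips it

assert negotiate(honest_reply) == "encrypted"
assert negotiate(tampered_reply) == "plaintext"  # the 0% outlier case

# Encryption-by-default means treating a missing STARTTLS as an error:
try:
    negotiate(tampered_reply, require_tls=True)
except ConnectionError as exc:
    print("downgrade detected:", exc)
```

This is the whole argument for defaults in miniature: the opportunistic client cannot distinguish "server without TLS" from "attacker hiding TLS", so only a policy that refuses plaintext outright closes the gap.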

>> JUAN DIEGO CASTANEDA: I wanted to point out that we don't have laws against encryption, but we have this whole set of laws, data retention, registration of SIM cards, lots of things, that are not proportionate, that are not very well balanced. And as long as those exist, law enforcement and intelligence agencies don't gain any special advantage over criminals or people of interest, or they might gain some, but they put all citizens at risk. So we don't have laws against encryption, but we have a lot of things that may affect our right to be anonymous. The debate is all focused on laws on encryption and their technical aspects, but even when you don't have those, you have all kinds of issues, like the ones that were mentioned, from all the other laws that might affect your privacy. It is important to keep that in mind.

>> GABRIELLE GUILLEMIN: Well, we are almost at the end of our session, and I don't think we are going to have much time for further questions, so please keep them for our panelists after the close of the session. I will ask the panelists for a one‑minute comment each, with any final recommendations on what we can do when we all go back to our respective countries. I am going to start from this side.

>> TED HARDIE: My suggestion is for you, Chris, when you are dealing with some of the Governments who are making technical proposals which make no sense: back doors in encryption, the registrations they are asking for, data being kept in clear text for 90 days, and all of this. I invite you to take a YouTube video, one of those pieces of data served over https which would have to be handed over in clear text, and to print out the encoded content in 1s and 0s and hand it to them on a ream of paper. You will need 50 to 100 reams to print out a fairly short video. Then invite them to imagine what will happen if everyone in the UK meets the regulation in this fashion, because I would love to see them try to work out what song was listened to from that paper.

>> DAVID KAYE: So I actually just want to make a plea. One of the problems in figuring out what's happening around the world on encryption and anonymity is the lack of any kind of collection of what is happening around the world. So to the extent that you are active in your own country, and know what's happening, and have concerns, whether of a legislative nature or something else happening at the state level, maybe law enforcement or another level, I would encourage you to let me know. I do have a secure e‑mail, and we do have the ability to reach out directly to Governments to highlight where we think practice isn't consistent with Human Rights law obligations. There has been a lot of interesting discussion about what is happening in different places, and to the extent that you can share it, I really want to encourage people to do that. So no grand closing message, just a plea for submitting material and keeping us informed. In turn we can translate it, turn it around and share it with everyone.

>> ALEXANDRINE PIRLOT DE CORBION: I have two short conclusions about how we are going to try to proceed forward, and I would encourage others to do the same. A lot of the time I just feel that there is a complete disconnect between policy decision making and what the reality of it actually is. And I think Civil Society particularly plays an important role in relaying the human stories behind the policies that states are drafting without really realizing the implications they have. It is something we talked about briefly earlier, and David brought up the legitimate concerns of states. There is a duty for states to protect their citizens without forcing them to choose between their security and their privacy.

The second point is something I raised earlier as well, in terms of working on different fronts and not putting all our eggs in the same basket. There are various policies in this debate that interplay: encryption, data retention policies, and other limitations. We need to target all of them as we move forward.

>> PRANESH PRAKASH: Mine is going to be an appeal, an appeal to all the people here. Start caring about these issues. Use free software that implements open standards, and use decentralized services that support federation and end‑to‑end encryption. Start caring about these things and learn more about these technologies. There are lots of people here to help. Apart from this personal security hygiene and helping others, get together with those who care about these issues and get involved in the technical standards around them and in Governmental policy making, and write on blogs and social media. Spread awareness and get involved. It has nothing to do with Governments or corporations; be the change you want to see, essentially.

>> JUAN DIEGO CASTANEDA: We find that very same difficulty in convincing people to start using encryption software, encryption techniques, and those kinds of security tools. We started by trying to work with journalists, because we think some groups are special in the broader context of who needs to start using encryption, such as Human Rights defenders and journalists, and we try to encourage them. Whenever you can send a message to those kinds of people, it is very good for us when we come back to our countries: we can say that these organizations are saying you need to use encryption. That's one good point. And I want to thank everyone in the community who makes the kind of material we can show in our security workshops to point out the risks that people are facing. Me and My Shadow comes to mind; that's a very effective tool for convincing relevant people to start using these tools. Thanks for that work on tools that help people start using encryption.

>> CHRIS MARSDEN: A couple of things. I would love to take up Ted's suggestion. Much greater minds than mine contributed to my contribution this morning. I spoke to Jon Crowcroft, a professor at Cambridge, who is deeply involved in this and in trying to explain to Parliamentarians what is involved.

Two people you should read. First, I almost take this draft bill as a kind of final, post-mortem insult to Caspar Bowden, who was one of the heroes of this debate. He died in July of this year. He wrote about the compatibility of mass surveillance with the European Convention on Human Rights, and he had been advising the European Parliament on this for 15 years. The second is the co-author of my book on default encryption, Ian Brown, a professor at Oxford. If you read anything by Ian or Caspar, you will be much better educated than by what I have given today. It shows that people do care. Pranesh, you can be confident that we are preaching to the converted in this room.

>> GABRIELLE GUILLEMIN: Thank you very much. A final short plug. If you want to find out more about encryption and anonymity, you should all read David's report on the topic. We also have, from Privacy International and Article 19, examples of how encryption is regulated and used in a number of countries; you can check it out on Privacy International's website. It is called Securing Safe Spaces Online. And from Article 19, if you want to find out more about these policy issues, we have our policy on the right to online anonymity. Finally, I'd like you to join me in thanking and applauding our great speakers for coming. Thank you.


(Session concluded at 12:33 p.m. CET)