The following are the outputs of the real-time captioning taken during the Eleventh Annual Meeting of the Internet Governance Forum (IGF) in Jalisco, Mexico, from 5 to 9 December 2016. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record.
>> MODERATOR: Good afternoon, everybody. Welcome to the session, organized by UNESCO, to launch our new study, and we have a couple of copies of the summary here. So anybody who asks a question can get a copy. Otherwise, it's online. And you can also get other brochures which have all our other publications. This new publication is not yet in our brochure, because it came out early last week. We have the author here to present it to us.
I'll introduce the session by telling you briefly something that precipitated this study. Three or four years ago, at UNESCO, a debate developed amongst the member states after the Edward Snowden disclosures, and the debate was whether UNESCO should get involved in privacy issues, which of course deals with some of these issues of surveillance and has a tangential impact on encryption. The member states said, do a study, which is how problems get resolved in the UN system. We did a big consultative study which came out with this publication, for inclusive knowledge societies. This study was presented and refined and ultimately supported in the form of an outcome document at a multistakeholder conference called Connecting the Dots, held at UNESCO. That multistakeholder conference document, called Connecting the Dots, said interesting things. One is that, under options for UNESCO related to privacy, it says UNESCO should recognize, or could recognize, that anonymity and encryption can play a role as enablers of privacy protection and freedom of expression, and it says UNESCO should facilitate dialogue on these issues.
There is also a strong injunction in this that UNESCO should do research on all these kinds of topics. This particular document was presented at the member states conference, so it came from a multistakeholder event to the member states event, member states only. They adopted this document. This gave UNESCO a mandate to proceed on this basis and a mandate to say that states have agreed that encryption and anonymity can play an important role.
That is quite a key acknowledgment. What the member states also agreed was a position that UNESCO calls Internet universality, which is that if the Internet is going to play the role that we want it to play in democracy and sustainable development, it should be governed by four principles: rights, openness, accessibility and multistakeholder participation.
ROAM. It is easy to remember. In terms of framing all issues using this particular heuristic, if you are speaking about encryption, you need to look at it in terms of rights, look at it in terms of openness, look at its impact on accessibility, and you need to look at to what extent decisions about it are made on a multistakeholder basis.
This is the way in which we approached the subject, and with the support of funding from Germany, we decided to do a study on this. We made a public call for researchers to bid to do the study for us, and it was won by Professor Wolfgang Schulz from the institute in Hamburg together with a co-author, and the two of them did this interesting study. We will start this session with a summary of what is in the study. The study is online, and those who ask questions can get a hard copy summary. Afterwards we will go to Mr. Marc Rotenberg, and Amalia Toledo, who works for an NGO in Colombia. We come to Sebastian Bellagamba, Internet Society, and then to Janis Karklins, formerly the ADG in our sector and Vice President of the Human Rights Council. We will have the closing words from a new ADG, Mr. Frank La Rue, former Special Rapporteur. I left out Mr. Amos Toh. We will have you first, immediately after the presentation. Let's jump in. We hope we have time for engagement.
Wolfgang, over to you.
>> WOLFGANG SCHULZ: Thanks so much. Does it work? Yes, it does.
Thanks so much for the invitation to present that here. Thanks to UNESCO that made it possible and to the Foreign Office in Germany that supported this. These are some ideas we developed in the study. I will start ‑‑ is the presentation on? I will start with background information. The first point does not come as a surprise: this topic is still hot. It is not like the crypto wars we had in the '90s, but nevertheless, even if you go randomly to a session here, it is very likely that encryption somehow plays a role. Yesterday the colleagues from Stanford University organized a workshop on that, and there are many others that touch the issue.
We believe that the reason for that is not only that it affects people individually, which is of course important, in their modes of communication and their protection of privacy, but that it also has a structural effect in different ways. Encryption can create subnets in a way, and lead to fragmentation. Countries can insist on policies that are applied to operators or manufacturers of things in their given country; fragmentation again is an issue.
You see that it is not only the obvious thing; there are many other aspects to encryption and encryption policy, and that makes it important to talk about it.
What we could build on in our study is the report of the Special Rapporteur, Dave Kaye, on freedom of expression and opinion, and he already highlighted the importance of both anonymity and encryption to guarantee freedom of speech in practice.
When we talk about encryption, you see, and I mentioned that already, that encryption can guarantee or at least support a lot of properties of information, but it is not identical to them. When we talk about authenticity or anonymity or things like that, encryption plays a role, but of course we know it's not identical. So you have to be very careful what you are talking about, in a debate, when you use encryption as a concept.
What we can see is that it's a complex and dynamic governance ecosystem in which encryption governance and encryption policies are designed. It is of course not only the users and states that matter; the intermediaries play a fundamental role. And we talk about encryption on very different levels: on the level of what the user has in his or her hand, on content delivery networks and other things on the way in between, and on data storage and things like that.
It's important, again, in the debate to point to where encryption is used.
And it becomes more and more apparent, and we highlight that in our study and in the recommendations as well, that this intermediary level plays a major role. A policy that just says the user should make use of encryption does not really have the full picture, to say the least. It is very much about encryption on the way in between, and the role of the different stakeholders in between.
We see and highlight in our study that there are some groups that rely very much on encryption, minorities of course, who in many countries face severe consequences when they engage in public or private communication. Journalists are of course a group that very much needs the possibility to encrypt communication, again in different ways. When they do their research, it's important, but also when they share their opinions and information and want to do that without interference, then encrypting communication with the user can be of special importance. Again, it's not just one use case we are looking at; there are many, many different use cases. And when I talk about journalists, there is one question in the media sphere that is important in many respects: bloggers play a very important role in many countries in facilitating public communication, and not only traditional journalists but also bloggers depend on mechanisms to encrypt.
The core of our brief study is on the human rights aspects of encryption. I just want to present some thoughts we laid down in the study, where we want to unfold aspects of the debate that is already there on human rights and encryption. There are two aspects mainly: to look at the aims of protection when we talk about human rights, what is covered by the protection of the human rights, and of course the justifications for limitations. We very often find very simple statements that encryption should be or already is a human right, but we all know that there are limitations, that there are legitimate limitations, so it's very important to talk about the justification for limitations when we want to have the full picture.
I will just highlight some ideas we have unfolded in our study, some elements that are not so obvious, we believe at least. The first is that we believe the self‑protection of speakers is within the scope of the freedom of speech. It is not the speech as such, but measures to protect the speakers also fall within the scope.
I think that's more or less obvious. Nevertheless, it is not common sense in some legal orders I know, where this is not protected by the constitution, or at least it's unclear. There has been no precedent, and very, very few authors elaborate on that. We do that in the study and come to the conclusion that self‑protection is part of the aims of the guarantee under article 19.
Another issue, which is very broad and not only something to be discussed when we talk about encryption, is intermediaries. On the one hand, the use of intermediaries as instruments of communication is within the scope of article 19 as well. Even more controversial is that the intermediaries' functions are protected in our view as well, so the intermediaries can claim human rights when it comes to privacy in communication, at least in specific functions. We do not say that they are protected in all the functions they perform. But when they act as what we call privacy intermediaries, protecting the privacy and therefore the uninhibited communication, as we call it, of their users, then it is covered as well.
One element that is already important, and regarded as part of the test you have to go through when you assess whether human rights are infringed, is that you consider effects as well. In many jurisdictions, you find this as a kind of additional element: you say an individual is somehow affected, but there is also a structural effect, a chilling effect on communication. Again it has to do with what we call uninhibited communication, which we see as one of the core elements that freedom of speech is protecting.
Other things we want to suggest as points of discussion are procedural aspects. What we can see when it comes to encryption policy is that you find, at least in some countries, informal arrangements between the Government and industry about possible back doors or whatever, and these informal arrangements, and even acts of enforced self‑regulation or things like that, are not bad as such. A lot of studies on self‑regulation and co‑regulation are quite confident that it is a powerful and human rights respecting concept, or at least can be designed in such a way, but there are some problems associated with it. One problem is that it can blur responsibility, so that you do not know whether there is in fact an act of the state, carried out indirectly by the industry, that affects human rights.
What we see is that we have to frame human rights, more than we did before, in a way that includes these procedural guarantees: the state has to make clear that it made a decision, and is not allowed to blur that by using informal arrangements with other actors. That is one key element that we believe will matter in future, and it is not only true for encryption; it is also true for other human rights aspects as well. That is why we put it forward.
The second aspect, as I mentioned before, is the justification of limits and the proportionality of limitations to freedom of speech, and to privacy as well, but we are focused on the freedom of speech issues. We have collected some arguments that we believe are important and have to be considered when we think about the justification of possible limits to encryption that are set by states. One is that we believe the structural effects that these kinds of limitations have must be part of the proportionality test.
So when we have limits to encryption that have these kinds of structural effects on, for example, a specific provider that many people trust with their communication, then this has to be taken into account. The second thing is that we believe the effects on vulnerable communities, actors like journalists, but that is also true for many other groups in specific countries, have to be taken into account when balancing these things. And we believe that the level of certainty as regards the risks that are put forward to justify the limitations has to be taken into account as well.
We believe that it is hard to justify limitations to encryption, at least specific limitations, when there is a mere theoretical risk and not a concrete risk in place. We find in some legal orders very subtle differentiations between types of risks, and we believe that when we talk about limits to freedom of speech through limits on using encryption, these risks have to be taken into account. Under these assumptions, for example, it is very, very hard to justify making the use of encryption a criminal offense in itself, which is the case in some states, when no concrete risk is demonstrated by the Government or the state.
I come to the last point, the recommendations. We have a lot of recommendations in our study, grouped by stakeholders. I will just mention a few here. We believe that governments, given the importance of the issue, should include human rights aspects in their encryption policies and provide transparency, especially when it comes to informal arrangements with industry actors. In this respect, we believe that it is not enough for an encryption policy to just look at the end user and say they have to be enabled, and that media literacy and Internet literacy programs have to make sure encryption is also covered. That of course is important; we won't deny it. But nevertheless we believe it is not enough.
We have to talk about encryption on the other levels I mentioned at the beginning, if we want to leverage the outcome as regards privacy protection and freedom of speech protection. Again, we focus on the especially vulnerable groups. We believe that there is not enough information at hand as regards the needs of some groups, minorities and vulnerable users, to make informed decisions, or at least evidence‑based decisions. We need more information about that. I think that is something important that was neglected before.
As I said at the beginning, it is important, and not always the case, to see encryption policy as part of a broader concept, with all the intersections with other policy fields as well. So my last recommendation is: please read the study. Thanks so much.
>> MODERATOR: Thank you so much, Wolfgang. Let's jump in, because of time. We will go to Mr. Amos Toh, from the office of the UN Special Rapporteur. (background noise).
>> AMOS TOH: Thank you so much for putting together this informative panel, and congratulations to Mr. Wolfgang Schulz on an informative and thoughtful report.
A disclaimer: I'm speaking more in my capacity as a fellow with the UC Irvine School of Law. A lot of what I say will be informed by my capacity as an advisor to David Kaye, but may or may not reflect his positions or those of the mandate.
I think, at the risk of problematizing this discussion, I want to start off with two broad questions. To protect the safety of journalists online and therefore offline, can we simply focus on the specific profession of journalism, and single journalists out as special, unique beneficiaries of encryption and digital safety, or must we look at the broader community of people that enable access to information?
The second question I would like to throw into the discussion is: in the context of digital security and safety, can we look at encryption in isolation, or must we look at digital security holistically, at encryption as part of a suite of tools, both online and offline, that are vital to safety and security?
On the first question, I think in David's September 2015 report on whistle‑blowers and sources, when we talk about the safety of journalists, increasingly and especially in the digital age, we are talking more about the function of collecting and disseminating information, particularly that which is in the public interest. If we tie protections to function rather than to a specific group of journalists or the profession of journalism, this encompasses bloggers, organizations, human rights researchers, freedom writers and so on. This broader definition has salience in international human rights law, particularly under article 19, where it says freedom of expression is protected through any media: so journalism and the transmission of information through any media, including digital and contemporary forms of media.
I also wonder, and I throw out this challenge: must we think about issues of digital safety even more broadly than journalism in this broader context? When we talk about protecting encryption for journalism, are we talking about protecting encryption for all people? If we single out journalists or any other vulnerable group as unique or special beneficiaries of encryption and other digital technologies, is there a risk that we could be incentivizing restrictions on these measures, because we are attracting undue scrutiny by saying that journalists are given special protection because of their use of encryption?
I think that is an open and debatable question that I would love to hear some responses on. That is really the first part: who are we protecting. Then the second is on encryption. I think the UNESCO report makes this very important link between encryption and anonymity, and that is a really important link to emphasize. David's June 2015 report was not just about encryption but about encryption and anonymity. Encryption doesn't necessarily guarantee anonymity, as most of us in this room know. It doesn't protect the metadata, the titles of E‑mails.
If journalists or any other person engaged in information gathering don't use encryption in combination with anonymizing tools, their sources may be discoverable. David identifies in his report not just restrictions on encryption that are concerning, not just back door access, not just criminalization of the use of encryption, but also SIM card registration requirements, restrictions on the use of VPNs and even on the use of pseudonyms. These are all concerning restrictions that we must start addressing and bringing meaningful human rights discourse and scrutiny to, in order to advance the conversation on digital security.
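The point that encrypting a message body does not hide its metadata can be sketched in a few lines. This is only an illustration: the addresses and subject are made up, and the `encrypt_body` function is a stand-in for real encryption such as OpenPGP, not an actual cipher.

```python
# Sketch: even with an encrypted body, email metadata stays readable.
from email.message import EmailMessage

def encrypt_body(plaintext: str) -> str:
    # Placeholder for real encryption (e.g. OpenPGP): the body
    # becomes opaque ciphertext, but only the body.
    return "-----ENCRYPTED-----"

msg = EmailMessage()
msg["From"] = "source@example.org"        # sender visible to any observer
msg["To"] = "journalist@example.org"      # recipient visible too
msg["Subject"] = "Leaked documents"       # subject line is NOT encrypted
msg.set_content(encrypt_body("the confidential payload"))

raw = msg.as_string()                     # what travels over the wire
assert "Leaked documents" in raw          # metadata leaks
assert "confidential payload" not in raw  # content is protected
```

Anyone who can observe the message in transit learns who talked to whom, when, and about what subject, which is exactly why anonymizing tools are discussed alongside encryption.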
The final thought I'll leave with everybody is: must we also convey the need for a sensible and practical balance between online and offline security, and not forget that offline security measures are important alongside the emphasis on cryptography and other encryption tools? I read in a report somewhere that journalists who have all these highly advanced technologies reach the border of a particular country and encounter resistance in the form of: if you don't give us your passwords, we will break your fingers.
What kinds of offline measures, more rudimentary measures like bringing basic devices that don't contain vital information across the border and using traditional means of security, are there, and how do those complement encryption and other digital security tools? That might also be an important feature of this discussion. Thank you.
>> GUY BERGER: Thank you, Amos, that was good food for thought. You have set the agenda for questions. Let's go on, we will jump ‑‑ Marc is still connecting, so we will come back to him. Let's go to Amalia Toledo.
>> AMALIA TOLEDO: Thank you. I'm here to share with you an experience from the south, specifically from Colombia: the experience we as a small NGO have had with journalists in Colombia, and an analysis that we did. We found that there is little knowledge or awareness of digital safety and security for journalists in Colombia.
So their communications, whether by phone, chat, E‑mail, any kind of communication, were very insecure. We decided to work on the development of an app to contribute to the protection of the privacy and security of their communications, and this app is called Hansel. We have faced setbacks and challenges with the development of this app, being a small NGO with developers who are activists but who still need to make a living; they have to dedicate their time to paid work and not necessarily to developing this app, although they are very interested in working on it.
We have identified, through the whole process in which we work with journalists on digital security, that implementing digital safety protocols with media workers or outlets is very difficult, mainly because digital security requires an extra effort. Either they don't have the time to put that effort into their digital security, or they are still dealing with many other priorities, like their own lives, for instance, or the sustainability of the media outlet, when we are talking about outlets based not in the capital city but in the regions, or where they still need to improve their working conditions. They have other priorities, and the safety of their communications is not one of them. But to spread the use of digital safety, we are going to have a better opportunity if media outlets engage in this. It is hard to work with journalists who have a very busy agenda, who have to run all the time to get the news and to sell one article to get some money to live, to make the month.
It is hard to work with them because it is difficult for them to implement digital safety measures on their own. So we have a better opportunity if media outlets take the responsibility of developing digital security protocols, implementing them, and helping journalists to take care of their communications. It shouldn't be the sole responsibility of journalists; media outlets have to make more efforts on this. And they are not doing it, at least in Colombia, as I say, for many reasons: when you have the mass media, because they don't care, or when you have a small media outlet, because they don't have the capacity. They are still dealing with many other things. But they have to take responsibility for this and help their employees, their journalists, improve. Even then this is not going to solve everything, because there are a lot of independent journalists out there, and there is still a need to work with them on raising awareness and getting them to implement their own threat modeling, their own security measures and so on.
But for that, both journalists and the media should also consider the digital safety of their sources and of the audiences they are addressing with their media. For instance, we have been talking with media outlets in Colombia about implementing https on their platforms, and it goes in one ear and out the other; no one is listening to us on this, because they are not aware and they don't care about the audiences. They only care about their own survival, and not about the audiences that also need to be secure, as my colleague here mentioned before.
We have had some success with journalists when we work with them, when we say: maybe you are already used to living with risk, because your profession is difficult in Colombia. They are still killing journalists in Colombia. Maybe you have already accepted that. But your sources have not. You are forcing them to take the same risks that you yourselves agreed to take. Those are the only areas where we have had some success, but still, the digital safety process is very difficult for them. I'm not talking about encryption only because, as you say, I don't believe this is about encryption only. This is about a more holistic approach for me. And I forgot what I was saying. (chuckles).
But that's okay.
>> GUY BERGER: Thank you. We started with the state, and we have been to the individual, to a vulnerable community, and to an individual institution. I'm sure our next speaker will focus on the intermediaries, but I yield my chair to him so he can use the laptop.
>> MARC ROTENBERG: Hello, everyone. I'm Marc Rotenberg, director and founder of the Electronic Privacy Information Center. We began in 1994 after the launch of the first petition on the Internet, and the basis of that petition was the freedom to use encryption.
We opposed the Clipper chip proposal from the National Security Agency, which would have established key escrow encryption for all Internet and telephone communications, and that's in fact how EPIC got started. There is a lot of history to tell, and I know I have only a few minutes, so I won't take all the session's time. I'm going to post the slides on the EPIC Web site, if you are interested. I've recently written a book called Privacy in the Modern Age: The Search for Solutions, which talks about some of the history. It is a nice Christmas gift, by the way. You can find it online at epic.org/bookstore. Did I mention the URL? It's epic.org/bookstore.
Okay. If I can get the slides, please, we will jump right into the presentation. Thank you. A brief history of the crypto wars and lessons learned. In the early 1990s, the Government was worried about hyper privacy, the inability to get access to evidence in criminal investigations. It had two roles, and they were conflicting. One role was to ensure the security of information systems. The other was to enable access in the context of intelligence gathering and law enforcement. They described these as equities, and they put out two proposals in the United States, one technical, one legal. Key escrow encryption would enable a third party who retained a key to gain access to private communications. The legal approach, which we came to call CALEA, was a mandate: if you are offering a telephone service, you have to ensure that you can provide to your law enforcement agency the ability to access the plain text of the communication.
Those were the two key proposals in the 1990s in the United States. The NGOs were a bit confused in the U.S. The ACLU focused on law, not on technology. With apologies to my EFF friends who are doing great work nowadays, back in those days they got confused, got money from the telephone companies, signed on to CALEA, and we were all very upset.
What happened in the technical world was most significant. It was the leading experts in cryptography who said this is just bad. Weak encryption is bad. It doesn't matter your political party, doesn't matter your position in society. If we don't build strong security into our communication networks, everyone will be vulnerable. We wrote a letter to the President, in 1994, and we urged him to oppose efforts to regulate encryption.
Soon that letter to the President became a petition with 50,000 names on it. That was a lot of people for the Internet in the 1990s. That was like half the Internet. We were telling the President of the U.S. that this stuff really mattered.
We had books and Freedom of Information Act lawsuits, and Phil Zimmermann, who wrote PGP, was under indictment from the U.S. Department of State for exporting arms. He is not an arms exporter. We had some failures. CALEA became law; telephone companies in the U.S. are required to ensure that their networks are wiretap friendly. But the Clipper chip proposal was withdrawn.
The OECD adopted very good cryptography guidelines which we often point to as soft law. NGOs began to monitor national policies and there was open Government litigation. Here are lessons we learned 20 years ago about this issue. Then I'll fast forward with more recent lessons.
In this debate, tech experts are crucial. You need the people who understand the technology to be able to explain, in general terms, in nonpolitical terms, respecting everyone's concerns, that bad encryption makes us all vulnerable. We had this debate with the FBI director recently; I debated him, in fact. He said: we have 500 cell phones that we can't crack. That is a bad thing. What do you say to that? I said, Mr. Director, there are also three million cell phones that are stolen every year in the United States. Because those cell phones have strong encryption on them, criminals and other bad people can't commit more crime.
We are sorry about the 500 phones that you can't open, but it's better for you in the world of law enforcement that those three million stolen cell phones can't be easily opened, either.
Public debate is important. International human rights norms matter. This is why we thank UNESCO for their very important work. You need to do the reporting, you need to do the litigation. Outcomes matter. I said a word about the problem of money. I'll leave it there.
What has happened since? Snowden. Snowden told us a lot. He told us about mass global surveillance; he told us the agencies engaged in surveillance, rather than fixing vulnerabilities, were exploiting them. This is a dangerous thing for a Government agency to do, particularly with regard to its own companies. Right? Instead of going to Microsoft and saying, by the way, you should patch your OS because it's easy to get into the kernel, the NSA said: we have an easy way to get into the Microsoft OS kernel. Guess what: hundreds of millions of people's communications, including those of U.S. businesses and others, were vulnerable as a consequence. Data of entire nations was gathered, and systems of accountability broke down. We had some reform; there is good news, and we thank Mr. Snowden for that. Since you brought me from the museum exhibit of 1990s encryption to talk to you here in the 21st century about the current debate, let me tell you what I've learned over the last 20 years that may help you in your work and advocacy.
This is no longer a debate just about privacy versus security. That was understood for a long time: people need privacy, the Government needs access, and we need to figure out how to strike that balance. We realize now that in an Internet of Things, with connected vehicles, home thermostats and door locks all on the Internet, there is no justification for weak encryption. When we filed our brief in the Apple versus FBI case, we said: if you look at what is on an iPhone, it is not just your private messages. It is your passwords to your bank accounts and your remote servers, and the ability to control physical devices like your car or your door locks. You need strong encryption because the security of the user, as well as privacy, is implicated.
We have also learned over 20 years that databases are more vulnerable than we understood them to be back then. We thought that with good security and good encryption, databases could be maintained and protected. I don't know if you have been following the news about the elections in the United States recently. We have had some problems with our E‑mail. We have had problems with our online voting. We are having some problems with our technology, let me tell you. But that is also a reflection of the fact that these problems turn out to be much harder than we thought they were 20 years ago.
This is an even stronger argument now, even to governments. Governments know this: even as they say to you, we are concerned about what bad people might do with strong encryption, they know that they are vulnerable to hacking and to attack as well. Last two slides.
I believe that the case for strong crypto has grown over time. The more that we have learned, the more experience that we have gathered, the clearer it is that to regulate, to control, to restrict, is going to be counterproductive even with the short-term gains. Even if the FBI gets to open up those 500 iPhones that they have, those 3 million stolen iPhones also become vulnerable.
At EPIC we launched this page 20 years ago, an online guide to practical privacy tools. We have updated it. We welcome you to visit us online, download some of the software, see what is useful. We can't guarantee that anything necessarily works. But at least if we all give it a try, maybe we will have a little better privacy and a little better security. Thank you.
>> GUY BERGER: We heard an appeal that everybody should get into the dark Web. Interestingly, there are no panels or sessions using this fear-mongering term of the dark Web, which shows the maturity of the IGF. I thought he was going to speak about intermediaries, but we will come to them. I'm sure he has something to say about them.
We have Mr. Sebastian Bellagamba, Internet Society.
>> SEBASTIAN BELLAGAMBA: Thank you very much. Good afternoon to everyone. The Internet Society's mission is to promote the open use and evolution of the Internet for the benefit of everyone. There are two important parts there. First of all, for the benefit of everyone: we promote the Internet not for the technology itself but because of the benefits it brings to people. The second is the openness objective. We believe in an open Internet.
We see the Internet as one of the most important enablers of the exercise of human rights. In that sense, we believe that encryption and other trust-enabling technologies are critical to this. We would not be able to exercise our human rights without these technologies in place; they support our freedom of expression, our commerce, everything that we do online is supported by encryption.
As such, we believe that encryption should be the norm for all online traffic and data that we exchange. All users should be able to communicate confidentially and anonymously online, everyone, including journalists. Individuals have, in our perspective, the right to use encryption and other tools to protect their data. And the use of encryption, in particular end-to-end encryption, should not be limited by governments but promoted by governments.
Governments, in that sense, should never outlaw encryption technologies or mandate the use of encryption back doors.
On that point, we believe, as was pointed out before in other interventions, that even though some governments might not intend to censor using these kinds of technologies, back doors to technologies, they may end up doing so. When we say encryption, we believe in strong encryption. For us, strong encryption means unbreakable encryption.
Any weakness in encryption will be exploited. It is not that it might or could be; it will be. There is no other possibility in our view. It is going to be exploited either by hackers, by criminals or by governments. We don't want any of those scenarios. In your very good example about the cell phones, which I will borrow because I like it: if law enforcement agencies can enter a device, criminals can too.
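The point that a back door cannot distinguish who uses it can be sketched in a few lines of Python. This is a toy XOR cipher, not real cryptography, and the key-escrow scenario and names are purely illustrative:

```python
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # Toy XOR stream cipher: the same operation encrypts and decrypts.
    return bytes(b ^ k for b, k in zip(data, key))

# A device key, plus an escrow ("back door") copy held by a third party.
device_key = secrets.token_bytes(32)
escrow_copy = device_key  # the back door is simply a second copy of the key

message = b"confidential source material"
ciphertext = xor_cipher(device_key, message)

# The escrow copy decrypts regardless of who holds it: the mathematics
# cannot tell a lawful agency from a criminal who stole the copy.
recovered = xor_cipher(escrow_copy, ciphertext)
print(recovered)  # b'confidential source material'
```

Whoever obtains `escrow_copy`, whether by court order or by theft, gets exactly the same access; nothing in the system can condition decryption on the holder's intent.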
We don't want that. If, instead of unbreakable encryption, we use any kind of weaker encryption, the privacy and anonymity of all users is put at great risk. Our point, and this is my strongest point, is that there is a false trade-off between security and privacy, in our perspective. Some people try to convince us that there is a trade-off between security and privacy, that in order to be more secure, we have to let our privacy go. We don't think that way.
Actually, I have an offline example for that, which I always use. Thirty years ago, we did not have that many cameras on the street. We are watched everywhere today. And we were more secure 30 years ago than today.
The lack of privacy because of the cameras does not prevent insecurity in the streets; apparently the insecurity has some other cause. It is not that they have to watch us in order to keep us secure.
The only trade-off, in our perspective, if we do not use encryption and other trust-enabling technologies, is between more security and less security.
There is no other trade-off; anything else puts us at risk. Thank you very much.
>> GUY BERGER: Thank you very much. You mention unbreakable encryption. I think everybody will know that in the Apple versus FBI case, the unbreakable system was eventually broken. That does raise the question: if nothing is completely unbreakable, is there still a case for a degree of proportionate limitation by governments, under certain conditions of transparency and so on?
Anyway, we will have that debate about whether it's either/or: either you have the most encryption possible, or any weakening of that means you've got nothing.
Let's turn to Janis Karklins, Vice President of the UN Human Rights Council in Geneva.
>> JANIS KARKLINS: Thank you very much, and thanks to UNESCO for inviting me to what I have found a very fascinating conversation.
Let me make a disclaimer. I'm not speaking on behalf of the Human Rights Council, but as a Vice President of the Human Rights Council, that is, as a member of the Council, and I hold this post until the end of this year, when I will be replaced by the Ambassador of Georgia. I would like to make a few points, but also react to what has been said during the discussion.
Let me start by warning that it is not very wise to put an equals sign between governments and criminals. Governments do their job with a certain purpose, which is very different from the purpose of criminals. Of course there are always tensions. These tensions have always existed between the competing interests of different Government agencies, in charge of the issues that have been entrusted to them. And also, we see over a period of time that the sophistication of activities and risks has grown enormously. If you remember the early '90s, a virus was when you were typing your text on a computer and suddenly the letters started to drop down and form a pile at the bottom of the screen. That was considered a serious virus, but actually it was a joke in comparison with what we are facing now.
We of course do not know where it is going. But we know that this needs to be addressed in a way which is balanced, and governments certainly are part of that conversation, some more involved than others. Now I'm coming to the Human Rights Council. Like UNESCO, the Human Rights Council is an intergovernmental body. The difference is that UNESCO is in charge of a very specific task, the promotion of freedom of expression through media development, which also includes the protection of journalists, while the Human Rights Council is the part of the UN machinery which addresses all aspects of human rights and their promotion and protection. There are different mechanisms that we are using, one being expert advice, which comes from the special procedures: special rapporteurs and independent experts who inform the intergovernmental debate and form opinions within the Human Rights Council. I would like also to note that it is not realistic to expect immediate or fast developments when it comes to the promotion of human rights around the world.
It is medium-term impact that we need to talk about when it comes to Human Rights Council decisions and their interpretation and implementation on the ground.
But nevertheless, it is a very important element, because more than 190 governments are participating in that conversation and form their policies as a result.
Another element is that the Human Rights Council very rarely speaks about the implementation of specific policies, but rather stays at the level of principles. These are of course very valuable principles, like the principle that the same rights should apply online and offline, but when it comes to implementation, of course, it varies from country to country.
But now, turning to the topic of our conversation here, I would like to make two points. One point is that we really need to think about the terminology and definition of privacy, which is certainly changing as we venture into this digital age. Historically, we can even say regrettably, we have traded privacy, willingly or unwillingly, for information. Privacy today is treated as a currency for whatever we can find and read online. Again, without value judgment, this is a fact. The underlying economic model of the Internet is such that we are paying with our personal data.
Second, I would like to say that encryption plays a certain important role, specifically for the protection of journalists, but it is not a silver bullet. It does not ensure a hundred percent security of journalists, because online anonymity is not absolute and never will be. Even with encryption of metadata, using different methodologies and analytics and by gathering a sufficiently big data set, one can still figure out the links between people or Internet users, and work out who is the source and who is providing information.
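The metadata point can be illustrated with a short sketch. The call records and names below are entirely hypothetical; the point is that counting patterns in who contacted whom, with no message content at all, can already single out a likely source:

```python
from collections import Counter

# Hypothetical call-record metadata: (caller, callee, time). No content.
records = [
    ("journalist", "editor", "09:00"),
    ("journalist", "unknown_number", "23:10"),
    ("journalist", "unknown_number", "23:40"),
    ("journalist", "unknown_number", "23:55"),
    ("journalist", "pizzeria", "19:30"),
]

# Count repeated late-night contacts per counterpart; string comparison
# works here because the times are zero-padded HH:MM.
late_night = Counter(callee for _, callee, t in records if t >= "22:00")
likely_source, contact_count = late_night.most_common(1)[0]
print(likely_source, contact_count)  # unknown_number 3
```

Real traffic analysis works on vastly larger data sets, but the principle is the same: the pattern, not the payload, reveals the relationship.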
This needs to be understood and always kept in mind. Nevertheless, I would like to encourage UNESCO to continue this reflection and, most importantly, to share the outcome of the conversation at UNESCO with the Human Rights Council, because it is imperative that we share our knowledge and understanding of these issues widely in the UN system. Thank you.
>> GUY BERGER: Thank you very much. Let me ask Frank LaRue, the Assistant Director-General for Communication and Information, my boss (chuckles), to make some remarks, and then we will throw it open to comments and questions and get responses from the panelists.
>> FRANK LaRUE: Thank you very much, Guy.
I'm going to be very brief, because the discussion has already been very rich and there is a lot of material. But before that, I want to make two comments.
I fully agree, especially with Amos and Amalia Toledo, who began by mentioning that what we are talking about is not encryption per se or anonymity per se. We are talking about questions of privacy and security of communications in general. This is very important. Encryption and anonymity are both very important, and this latest study we have, I think, highlights that and has good recommendations, as has everything we have heard today, and there has been a long struggle on this.
But to put it into perspective, what we are really talking about is the creativity of the human mind: the ability to develop one's own thinking and one's own expression through different means, whether words, art or any other form, but to develop freely. This means the possibility of rethinking, debating and gathering information. Yes, it includes journalists, who have a right to maintain the privacy of their sources and the right to keep their investigative journalism confidential until they decide to publish it. But it also means that people in opposition to dictatorial regimes, or doing human rights work, can keep the testimonies of victims private and safe so as to avoid repression against them, and that people establishing opposition policies against authoritarian regimes can develop their communication anonymously.
I think that this idea of privacy is essential for communication. This is why we always link them, and it has taken different forms. Of course, online it is more serious, but we have always recognized privacy as the space where human beings can develop themselves.
So my feeling is that, yes, there are several tensions; some states may raise the challenge of national security, which is always there, vis-a-vis the privacy of communications. But in one of my own reports I said that the security of a nation lies in establishing safety for individuals and protecting them. It is also to protect institutions and the institutional order of the state. And it is also to protect the Democratic system and the political system. We cannot use illegal or unlawful procedures, or weaken fundamental human rights, in the name of national security, because then we are weakening our own political system, and the strongest societies vis-a-vis any threat are the most Democratic societies, the ones defended by their own public.
My feeling is that in weakening privacy, in weakening this element of communication, we are also generating a weakness in our society. I think the misuse of technologies, the penetration of communications, the surveillance and monitoring of communications, eventually backfires against states. It made us reflect, and now we are demanding certain policies, but it shows us that technology is moving so fast that by the time congresses regulate and make new policies, technology has advanced ten years ahead. And it is becoming so easy to do that now, all of a sudden, it is not selective monitoring of one or two messages; it can be mass monitoring of all messages. Why? Simply because it can be done. If it can be done, why not do it? That is the logic of those designing these systems.
Yes, I believe that the use of encryption is legitimate, but here, and with this I finish, there is a challenge. Encryption is not a magical solution or a magic wand, because encryption can also create a myth: it can make us feel safe in moments when in reality we are not.
Either because those that designed the system, if it is a privately created and sold encryption system, built in back door procedures that will eventually be given to governments, and we may or may not know about them; or because, even for systems that were not done that way, we do not know how far technological advances have gone towards making them vulnerable. Again, it may be wise to use encryption, because ultimately the more encryption there is going around, the more difficult it will be to monitor those communications, but it would be a mistake to believe that encryption gives us a foolproof, totally closed form of communication that we can fully trust.
I think that we have to design, as Amalia Toledo was saying, a global perspective on safety that allows for different things, from digital safety and cybersafety to personal mechanisms of safety, or even deciding what should be put online or not.
>> GUY BERGER: Thank you so much, Frank. Let's get some comments from people. One, two, who else indicates? Two at this stage. Please, can you identify yourselves.
>> I'm part of the Internet Architecture Board. I want to take up the comment that the Vice President of the Human Rights Council made, which is a very important one: that of course governments and actors from illegal contexts are not the same. We certainly understand that, and it is an important point.
However, it is a very difficult thing for us to design tools that behave differently in the hands of those two different bodies. The intent of the actor does not affect the weakness of the system. If we craft a system that includes a weakness in order to enable an actor with good intent, the reality is that that weakness is there for any actor. The IETF looked at this during the years of the Clipper chip and published a document, RFC 2804, which describes its response, a purely technical one. We do not know how to design a telecommunication system which is weak only with respect to one actor. And as a result, we will not.
During the recent Snowden revelations, much the same set of considerations came up about what response the technical community should have. Again, the response was to look at the effect on the network itself, and to say that mass surveillance had created such a loss of trust in the network that the only way to restore that trust was to enable encrypted, confidential communications, for both metadata and payload, to the greatest extent possible, so that confidential communications which did come from minority populations, from journalists or from others were not singled out by the mere presence of their encryption. I certainly take the point that the threat model of a journalist is different from the threat model of somebody engaged in banking or sending notes to their kids. But if the baseline includes methods like this, we believe the network as a whole is stronger, both for the common use case and for the use cases of those who are potentially directly under surveillance or otherwise under attack.
We certainly do not wish to have an antagonistic relationship with governments, who have their jobs to do. But the simple reality is that the best way we can build this Internet is to build it as strongly as possible with regard to both encryption and confidentiality. Thank you.
>> My name is Asid, and I represent a Pakistani not-for-profit organization. Most of what I was about to say has been said by the gentleman here. But I want to take up the comment that was made about governments not being comparable to criminals: when governments have a dark history of human rights violations and have done more damage than all the criminal elements of a country combined, that becomes slightly difficult.
That is one. As for journalist safety, we have been working on that for more than five years now in Pakistan, and what we have seen is that governments have found a way around encryption. The new cybercrimes law in Pakistan has an element whereby anybody who is under surveillance or under any investigation, or even under any suspicion of activity, journalist or not, has to give up their passwords, encryption keys and devices without any warrant from the court; the police or investigating agencies only have to notify the court within 24 hours. All service providers in Pakistan have to keep a record of the traffic data, both voice communication and data communication, of everybody, for one year. That in itself is a huge challenge for protecting journalists and the integrity and identity of journalists' sources, because everything sits with the service providers and any Government can get it from the service provider. How does the panel see that? Thank you.
>> GUY BERGER: We have one remote participant. Let's take that question; then we have somebody here, and then you, and we will get responses. Remote participant, can somebody read it to us?
>> Yes, hi. We have a question from Cecil from the Civil Society council at the OECD, addressed to Marc Rotenberg and also to Frank LaRue. The question is: I am thinking of smartphone applications that facilitate encrypted communications, like Telegram or Signal. These are useful software, but they run on top of platforms like Android or iOS, which provide very weak protection against real-time intrusions while being used. The same thing happens with other operating systems like Windows or OS X. Do you think that fostering, or even enforcing, encryption-by-default policies in the software industry could be a feasible approach to solving this problem?
>> GUY BERGER: Thank you. Where is the mic, please? We have a speaker here.
>> Thanks. Brian Holland; we operate dot CA as the top-level registry in Canada. If this were an easy problem, we would have solved it. The richness of the conversation around this table has been remarkable. I find in my day-to-day role that there is a dissonance to what we do, because on the one hand, I operate large databases. We heard how easy they are to compromise. That is the thing that worries us on a nightly basis: ensuring our own security.
On the other hand, the other thing that keeps me up at night is threats to the network. On the many security-related issues and a stronger Internet, I'm a fellow traveler with my colleague at the IAB; at the same time, it makes it harder and harder for us, the front-line operators of pieces of the network, to continue to ensure a safe, stable and reliable network.
I find there is that dissonance in my own role, and we hear it here today around the table: how do we create that safe, stable, secure Internet that also enables us to protect individuals and protect the network itself in our day-to-day operations?
>> Thomas Richmond, German Ambassador for cyber foreign policy.
I wanted to reply to your question: why do we single out journalists, and why not talk about everyone's security and right to encryption?
Why did we fund this study? First of all, because it relates to the mandate of UNESCO, so that was reason enough for us to do it. The second reason is, of course, that the Human Rights Council and other organizations that look at the protection of human rights have a long tradition of focusing particular attention on journalists; the protection of the rights of journalists has been a political issue all along. So, in that sense, it seemed natural for us now also to look at the protection of the human rights of journalists and their working conditions on the Internet, because the situation has changed.
You are also aware that Germany and Brazil are running a General Assembly resolution on privacy in the digital age. In this year's resolution, we had tried to introduce some language on encryption and other technical means to secure and bolster privacy. It didn't get through, and it was not due to bad drafting; we will probably try again. But this is a sensitive issue for sure, and not everybody at the UN and elsewhere is totally happy with anyone who raises it.
>> GUY BERGER: Thank you so much. I should pass around the summary to those who made comments and questions.
But let's go in the order of the speakers. It may be fragmented, but please give a response to one or two of the points that were made.
>> Thanks. First, to the question about journalists as a group and the functional approach: I found that an extremely interesting aspect, and it is not only an aspect we discuss in this encryption policy context but in other contexts as well.
I believe that this kind of functional approach is appropriate, but on the other hand it sometimes has a problem of clarity, because when it comes to court decisions, you need very good criteria to outline and frame this function. So it's not an easy one. Nevertheless, I think we definitely have to think about structural functions for public communication, and one of them is what we call journalism, but that does not mean it is a specific trade or something like that. It is very much focused on the context.
To the remote participant's question about encryption by default: I think that is an intersection of two different fields. One is what we are discussing right now, and I would be very much in favor of that, because it has to be solved at this level. There are studies about the use of encryption and what really hinders people from using it: it is so easy, but nevertheless very few people do it, and even among companies it is increasing but is not the norm right now. So that is one thing. But on the other hand, it is about autonomy, about making people see what they do, and about the risk analysis of the individual. That is very important; Frank pointed to that. And this false sense of security can be a problem. It is a well-known thing in privacy theory, the privacy paradox: when you believe that everything is okay and your privacy is protected, you act in a way that is not okay in the end. Technology has, to some extent, to make people aware of what the risks are. That is not an easy trade-off, I see that. But it is a trade-off we have to consider.
>> Maybe I should clarify the question I posed about encryption and its relationship to the safety of journalists, and encryption protecting the safety of everybody. I think the interests and safety of journalists are critical and should definitely be talked about and elaborated upon. But I also think that when we talk about encryption and digital security, we need to emphasize that when we seek to protect encryption and cryptography, we are talking about everybody: from the journalist who is doing investigative reporting and needs to protect sources, all the way to somebody who is inputting personal information into a refrigerator and whose personal information might be stolen from the Internet of Things.
That is what I meant to emphasize: this conversation is not just about certain groups. It is about them, and it goes beyond them at the same time. Thank you.
>> AMALIA TOLEDO: A short point: we have to keep raising awareness. We have to seek commitment from the different sectors, from Government, from intermediaries and the private sector, but, as I said before, from media outlets and media workers as well. Just to make a better world, or, to put it differently, a safer Internet.
>> SEBASTIAN BELLAGAMBA: Thank you. I think the Internet was not designed in principle with security in mind. The years of experience of running the Internet have changed minds in a certain way. As my colleague pointed out, the revelations of pervasive surveillance of communications on the Internet have changed the game. We have to do everything to protect the integrity of the network and the security and privacy of the people using it. I agree that sometimes technology can produce some false security. But I still think we are better off with the kind of security that we can provide than without it.
>> I'll jump in and make a few quick points. First of all, thanks to UNESCO again, in support of the report; it's a wonderful thing that you joined privacy protection with the interests of journalism and freedom of expression. Too often in this debate, people try to balance privacy against free expression. I think that is a misunderstanding. This is a good area where strong tools of privacy and confidentiality enable better reporting and better public accountability. That is my first point.
My second point: we respect the concerns of Government. One of the lessons I've learned over the years is to take seriously that there are bad people out there who certainly intend harm. But I do think that Government, in seeking these enormous powers for weakened security, has a corresponding transparency obligation to make clear the need for the authority as well as the risks. When we have participated in these debates, in the early days we did a series of Freedom of Information Act requests to the local FBI offices. We asked a straightforward question: has encryption been an obstacle to your criminal investigations? The answers we got back at the time were, no, actually. That may have changed.
We did help with an amendment to the U.S. wiretap reports, so we now know on an annual basis in how many cases encryption is an obstacle. So we have some data; we can make the assessments. Again, I'm not ignoring the concerns of Government, but I think Government needs to be accountable. Finally, if I may offer a brief contradiction about the privacy paradox, this is something I feel quite strongly about: we need to change our understanding of this phrase. It is not about people saying that they care about privacy and then engaging in behavior that in fact reflects a lack of understanding of the privacy risk.
I think that is unfair to people. I don't think most people have the ability to make the judgment that this is not the most current version of the OS, therefore I can't use this app, and if I do, the text messaging will be insecure. We would never think that way with consumer products. We would say that if someone offers an E-mail service, it should be secure. It should work. It should be encrypted end-to-end. If it's not encrypted end-to-end, it is not an E-mail service. It's something else.
We need to put less responsibility on the user and more expectation on the service provider: if you are going to offer a communication service, it needs to be encrypted; if you are going to store personal data, you need security measures, and the data should be encrypted. We don't have the ability to measure the thickness of the walls of the bank vault. If we give our money to the bank, we have to trust that they will be able to protect it. We need to think that way about our data as we turn it over to these Internet companies.
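The expectation described here, that the provider relays and stores only what it cannot read, can be sketched as follows. The XOR cipher is a toy stand-in for real cryptography, and `provider_relay` is a hypothetical function; the point is only that a provider which never holds the key handles nothing but opaque bytes:

```python
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # Toy XOR stream cipher standing in for a real end-to-end scheme.
    return bytes(b ^ k for b, k in zip(data, key))

# The key is shared only by the two endpoints; the provider never sees it.
endpoint_key = secrets.token_bytes(64)

def provider_relay(blob: bytes) -> bytes:
    # The service stores and forwards opaque bytes it cannot decrypt.
    return blob

sent = xor_cipher(endpoint_key, b"meet at the usual place")
received = provider_relay(sent)
plaintext = xor_cipher(endpoint_key, received)
print(plaintext)  # b'meet at the usual place'
```

On this model, a breach of the provider, or a demand served on it, yields only ciphertext, which is the bank-vault property the speaker is asking for.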
>> If I may: I completely agree with that. When we talk about the privacy paradox, we sometimes mean completely different things. I was only referring to the aspect that risk awareness somehow matters in these things; that is not to say that we do not have to think about encryption at the level of service providers. On the contrary, that is one of the points we make in our study.
But nevertheless, in the whole debate, I think it is important that people act with the comprehension that there are specific risks, and it was only that kind of paradox I was referring to. As regards results, there is perhaps no contradiction at all with your framing of what the privacy paradox is.
>> Thank you. I fully appreciate that the same technology will be used by governments and criminals; no question about that. But governments perform functions that are entrusted to them by the population, and some of those functions may be basically contradictory. And some governments perform their functions better than others. On one side, Government is in charge of protecting and promoting human rights, including freedom of expression, including privacy; on the other side, governments are in charge of providing security for the population. Most probably, people do not want to be killed at a concert by a random individual, and normally when these things happen, the first question the population usually asks is: where was the Government, why didn't the Government do what it is supposed to do? From that perspective, and given that governments are composed of individuals, and individuals, as we know, sometimes act in irrational ways, when you burn your fingers on one side, you immediately shift from one ditch to the other ditch. In that respect, there is a search for balance, to keep all actors on the road, in order to promote human rights and privacy and to protect the population from harm that can potentially be done.
That is one element.
The second element: unfortunately, this is the classical situation where we spend 99 percent of our time talking about the 1 percent of bad things, instead of talking 99 percent of the time about the 99 percent of good things and devoting 1 percent of our time to the 1 percent of bad things. This conversation (overlapping speakers) is necessary. And forums like this are very useful for you to understand Government perspectives better, and for governments to understand your perspective. Those in government are also human, and can go from one ditch to the other in no time. Thank you.
>> Frank, we have one minute left.
>> Very briefly, just to remind all of us of a couple of human rights concepts. Number one: privacy and freedom of expression are two very different human rights. They are not in contradiction; they are complementary and have to be understood together. They are very different, but they need each other. There will not be full freedom of expression, or other freedoms, without privacy. And privacy exists for the purpose of many rights, including freedom of expression.
Secondly, as human rights, they are there for everyone, for the whole population, not only for certain sectors. But we must clearly recognize that certain sectors of the population run a bigger risk because of the social role they play. Journalists are one of them; human rights defenders are another. I think this is very important for those who are critical or in opposition, in the case of authoritarian regimes. Oftentimes you have people in a profession with several characteristics, like cartoonists, who are journalists themselves but who use art, the expression of art, in their cartoons to send messages; they oftentimes suffer the consequences.
In general, we can see that this is a right that everyone should enjoy, like all human rights. But we must also look at the most vulnerable sectors and at the possibility of establishing particular protections for them.
Finally, on the question of governments: I think that governments vary. Some governments are legitimate, others not so legitimate. Some of them defend their population; others repress their population and dissident opinions. But the main point is that what we are struggling for in freedom of expression is a democratic system in which everyone can express themselves freely. In this sense, I find that even democratic governments sometimes fall into the temptation of trying to curtail the most critical opinions; this is true, as we have learned from many incidents in history, and it is important to prevent it through mechanisms of protection and safety. That is why we are saying that the safety of freedom of expression, and the protection we should use in all our communications, whether traditional analog communications or digital communications, should be a global perspective, because it is true that oftentimes we may be hacked by delinquents, or penetrated by states acting illegally, or by other actors. The idea of using safety mechanisms, of having a broad concept of safety, is important for the very preservation of communication.
>> I want to thank my colleague who set up the study, and our moderator and our speakers. If you didn't get a chance to ask your question, you can still claim a copy of the UNESCO publication. We don't want to take them back to Paris.
(end of session at 16:35)