IGF 2017 - Day 2 - Room XXV - WS61 Between a Rock and a Hard Place? Identifying Encryption Policies that are Human Rights Respecting

 

The following are the outputs of the real-time captioning taken during the Twelfth Annual Meeting of the Internet Governance Forum (IGF) in Geneva, Switzerland, from 17 to 21 December 2017. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 

***

 

>> MODERATOR: Hello?  Can everyone hear me?  Sorry.  I hope everyone can stay for this session, too.  We'll start in two, three minutes.  Hi, everyone.  So we're going to start this session on identifying human rights respecting encryption policy, between a rock and a hard place.  Just to introduce myself, my name is Sheetal Kumar, and I am the program lead.  I'm being ‑‑ oh.  Oh, my goodness.  That had an effect.  Okay.  Well, that was the intention.  Thank you very much for coming to this session.  I know it's the end of the day, and everyone must be exhausted.  And it's been a busy day for everyone, I'm sure.  As I was saying, my name is Sheetal Kumar, and I am a program lead at Global Partners Digital, which is an organization based in London but working globally to build a digital environment built on democratic rights and values.  I'm also joined by my colleague, Richard, who's on the left there.  And he'll be helping facilitate this session as well. 

And I'm joined by the Internet Society and by Paradigm Initiative.  Before we start and go straight into the agenda, I just wanted to give a bit of context as to where this session has come from.  And it's actually a follow‑up.  Last year at the Global IGF in Mexico, I chaired a debate on cybersecurity, "Who has got our back?"  And it focused on encryption.  Why?  Because earlier that year, as I'm sure everyone here knows, a case had brought the technical issue of encryption into the world headlines, when the FBI requested that Apple build some kind of exceptional access into a locked iPhone as part of a criminal investigation.  And so this idea of building some kind of vulnerability or back door into a device resurfaced.  There had been a similar standoff in the 1990s.  It wasn't new.  But there were particulars to that case that made it rather interesting and difficult. 

And so on that panel, we debated that particular case.  And what we heard from across the spectrum, from cryptographers to policymakers, was, no, not consensus, but recognition that backdoors were just not a very good idea.  They're not good for cybersecurity or human rights.

They're just not a very good response to these issues.  And that's not to say that the call for backdoors has completely gone away, but something that did come out of that session last year, from the participants as well, was that we need human rights‑respecting policy responses to these issues.  Over and over again: we need human rights‑respecting responses to these challenges.  So the challenges that encryption poses, but also the challenges that the technical responses sometimes proposed by law enforcement agencies, like backdoors, can pose to human rights.  Hence the title of the session, "between a rock and a hard place?"

So encryption‑related policies should be human rights respecting.  What does that actually mean?  What I mean when I say what does it actually mean, I mean if you were to write it down, what would it look like? 

So this is something that many of us have been grappling with, and there are a number of civil society initiatives and others that have sought to bring clarity to these questions.  And that's exactly what we're going to do in this session: ask what human rights‑respecting encryption policy actually looks like.  We're going to do that through a format which is interactive.  So if you're tired, then beware, you're going to have to get up and walk around a little bit.  Or at least turn around and talk to the people next to you.  Warning. 

Two parts.  First we have our two panelists here, who will help us frame the discussion.  Two panelists couldn't make it, unfortunately.  They were both women, by the way.  I took that into consideration.  So this is not a (?) Because I'm moderating it.  Unfortunately one of them couldn't make it and the other one had to leave early.  That means we have more time for the breakouts.  Hopefully that will work out.  And during that breakout, just to give an overview of what that will look like: we have, as part of our advocacy on this issue, developed a set of key messages we believe should underpin encryption‑related policy, and those build on international human rights standards.  And we'll use those as a basis for our discussion.  We have real‑life excerpts of laws on encryption which we will be marking against those criteria in breakout groups.  And then we will come back to share our views on how those real‑life excerpts actually relate to the criteria that are set out.  I have materials for that.  So don't worry, we'll get to that. 

As I said, we'll first start with the framing of the issue.  I'm going to ask the panelists to introduce themselves.  And then you'll each have four minutes to discuss the issue of human rights and encryption.  And I know you will say a few words about why encryption is so important, but also what the dilemmas are in this space and where this debate is going now, more than a year after the FBI case. 

>> PANELIST: Thank you.  Thanks.  And good afternoon.  I've never been in the last session before the cocktail, so I hope you'll be up to the challenge.  Yes, so I work at the Internet Society.  The Internet Society has advocated for 25 years, actually, for an open and secure Internet.  When it comes to a secure Internet, obviously encryption has a lot to do with it.  And on the topic of encryption, we actually have a very simple position.  We do believe that strong encryption should be the norm for all Internet traffic and data. 

And why is that?  That's because such encryption is really the bedrock of trusted exchanges online on a daily basis in many domains, from daily e‑commerce transactions and financial transactions to actually also protecting some very fundamental rights, such as the ability to communicate online in a confidential way. 

Now, of course, every technology has two edges.  And the same technology that can be used by millions and millions of law‑abiding citizens can also be used to conceal criminal activity.  And that double edge is, of course, what gets law enforcement and intelligence agencies nervous, and that's what creates some of the tension that Sheetal talked about. 

So in that context ‑‑ if we consider those two sides ‑‑ the encryption policies that you're going to discuss in the breakout have a difficult task: to reconcile and address two important and sometimes competing objectives.  On the one hand, securing infrastructure, communications and data.

And on the other hand, enabling law enforcement and intelligence agencies to access communications that are targeted and suspected of illegal activity. 

Now, the encryption policy debate has evolved over the past four years, as mentioned before, from a place where the dominant discourse of governments was related to backdoors, a sort of special access for governments to encryption.  The debate has shifted a bit more to asking companies to set limited or reasonable levels of encryption, and also to growing asks from law enforcement to be able to circumvent encryption when they get court orders, what is often referred to as lawful hacking. 

And in a way, that sort of shifting policy debate reflects a change in the encryption market.  And you know some encryption services are actually more law enforcement friendly than others.  For example, in the last four years, we've seen some companies actually getting rid of their ability to decrypt customer content.  That is the case on many devices that are encrypted by default, for example, if you have an Apple smartphone, an iPhone.  And also for millions of users using end‑to‑end encryption for messages and content.  So that's a change in context for law enforcement.  But, of course, not all companies have these levels of encryption. 

And I was reading that, for example, in China, they have encryption in transit, but they don't have encryption on the server side, which actually allows a level of government access and control.  So, again, when you get into those breakouts, I think that's one of the biggest dilemmas facing both law enforcement and companies: how much encryption is enough, and who gets to decide?  And, you know, I think you also have to think about: can encryption policies achieve adequate levels of proportionality between security and human rights?  Can things like the lawful hacking I just mentioned be a solution that leaves the majority of users protected while targeting those who are suspected criminals?  I think these are all questions that we need to think about in the current context, and they are very difficult.  It's good to have the expertise of the people in the room to (?). 

>> SHEETAL KUMAR: Thank you very much for outlining what the issues are, the lines of the debate, so to speak, but also how complex the challenges are now that things are moving in some spaces.  We won't necessarily hear the call for backdoors go away, but we will see more calls for encryption workarounds.  And the question of how much transparency and regulation needs to follow as a result is a hot question, I know.  And you also brought up some other questions around how much encryption is enough.  Should there be limitations?

Who gets to decide?  And I think the really key question, which we'll definitely address in the breakout, is: can encryption policies respond in an adequate manner to the necessary and proportionate principles?  And I'm sure there will be some debate around that when we look into it.  So I'm going to ask you to make your intervention and to say a few words about how the issue plays out in your context, in Nigeria perhaps, and more broadly, if you like.  Thanks, Gbenga. 

>> GBENGA SESAN: Thank you.  Gbenga Sesan.  And just to clarify from the onset that there are two sides to this reaction from the government in terms of encryption and policies.  There is the genuine interest in addressing criminal issues.  And I think (?) And there is the opportunistic approach, where a government wants to do something, a government wants to (?) Do a crackdown.  And then saying that, you know, terrorists use encryption is like an excuse, because the real intention is to ultimately crack down on the opposition or (?) And I think we have to make that distinction, because if we don't, then we miss the entire debate.  Then we believe that everyone, you know, actually has intentions to fight crime.  But that's not true.  There are scenarios where ‑‑ I'll give you examples.  Scenarios where bills have been drafted, laws have been passed.  And a year and a half after a law has been passed, there's not one case of any, you know, criminal activity that's been taken to court, but there are journalists or bloggers or media people who have been taken to court or arrested or even jailed because of an interesting section of that bill. 

Where we have the difficult decision around encryption and human rights is the fact that whatever the policies are, there must be a commitment from ‑‑ I want to say government, and I refer to all, you know, arms of government, including security.  Because one of the reasons why, of course, they react, surprisingly, when you say that, you know, there's actually an argument to be made around encryption and human rights, is the fact that this is sort of a disruption to (?).  That's why they are concerned.  But I think there must be a mutual respect for process.  And by process ‑‑ I'm not referring to process as one side.  I'm speaking of process, agreed.  One of the things that we found extremely useful is that over the last few months, we've had conversations between unlikely partners.  And I say unlikely partners because ‑‑ I'll give you an example ‑‑ between Civil Society and national security and things like that.  Typically you would have these two groups saying, okay, you know what?  That is the enemy.  But you have conversations because people see things based on, you know, where they walk.  It's like the story of, you know, the visually impaired man who was touching the elephant.  Touch the hair and said oh, my God.  (?) As far as law enforcement is concerned, everyone is a criminal, literally.  So if there is encryption involved, the question is ‑‑ and I've had this question asked in some of our work.  We've had cases where we've gone to court and the typical response is: what do you have to hide?  If you have nothing to hide, why get involved in encryption anyway?  And you really want to get offended when you hear that.  But when you put yourself in their shoes and realize that, to them, everyone is a criminal, right?  Like everyone is a potential criminal.  So then you can understand where they're coming from. 

But when we have more conversations, even robust debate, arguments and scenarios, then you can see where that comes from.  And that's why I say that the first thing is we must have a commitment to respect process.  There's a reason why ‑‑ and this is good, by the way ‑‑ businesses now, in many cases, subject government to their own processes.  If you make a request for a takedown: we have an internal process; you have to first of all get a court warrant.  Because many years ago ‑‑ and I hope it doesn't happen in any country anymore ‑‑ you would have government officials calling a service provider and saying, listen, there's a threat here.  Shut it down.  Without a letter or document that says we have a court warrant or we have this process.  I think that one is important. 

The other is, you know, an interesting trend that we're seeing now, which is a good thing: to say that human rights is good for business.  He referred to that when he talked about companies who are getting rid of certain (?) Because they are subject to abuse in terms of basic human rights.  And so the fact that we have a lot more recognition of the fact that human rights (?) Business, in the sense that if I use your service and my data is protected, if I use your service and I know that it is not likely that you hand over my data to, you know, anyone who is about to abuse it in terms of my rights, then I'm likely to respect your services more.  And eventually, of course, that's probably good for your business. 

But apart from this in terms of businesses is the fact that citizen awareness is important.  Many times ‑‑ so we do this training for various institutions, media, for Civil Society, even for policymakers.  And you would be surprised when we have conversations around privacy.  A very simple example: a few years ago, the government of Nigeria, through the Central Bank, decided that even though people had had accounts for a very long time, you now had to enroll in a bank verification number, which was designed to, in a way, check corruption.  (?) Defend corruption.  But no one ‑‑ a few people, very few people ‑‑ asked questions about why should I give information to get access to banking services I already have access to, in a context where there's a question of privacy laws.  And I think that, you know, apart from the fact that human rights is good for business, we also need to be a lot more aware.  Any policy that we're designing around encryption should also involve opportunities (?) As possible.  We used to think of encryption ‑‑ and I'm sure you can test this, right, in your various communities: mention encryption with various people and the first thing that comes to mind is it's for geeks and things like that.  But right now everyone takes that for granted.  End‑to‑end encryption.  (?) Platforms that have end‑to‑end encryption.  And we need to demystify encryption, maybe by not using the word encryption, but seriously making sure that it is the default, practically the default.  And like I said earlier, there's no way that this policy‑making should be one‑sided. 

One of the reasons why we have, you know, Civil Society ‑‑ one of the reasons why we have to ‑‑ we've got to challenge many laws, is because many times these bills are drafted with an agenda.  And any agenda that is not subjected to a process that is ‑‑ I don't want to use the word Court order.  I think (?) You know, inclusive.  I don't want to say (?) You know.  Because then someone can bring a different view.  Otherwise you are going to have people, you know, all going in the same direction.  And at the end of the day, the policies that come out are going to be suspecting everyone, bills that do not respect everyone (?) Otherwise you're trying to hide something through encryption.  So the actions of government need to change.  The actions of business, which is a good thing, are changing.  But also the actions of citizens, in terms of the tools that we use.  Maybe what we need is, you know, what someone referred to as a quiet application, where people's demand, at a basic minimum, for a service, for a platform, is one of encryption.  And if you don't bring encryption to the table, then you're literally not considered (?) The size of your marketing. 

>> SHEETAL KUMAR: Thank you very much, Gbenga.  What I really liked about what you said was that you set out some of the challenges but also some solutions, which is also great, because we hear a lot about problems but not always solutions.  And some of the issues that you identified were that there are both legitimate and illegitimate reasons that are used for cracking down on or weakening encryption, and that we need to be sensitive to that in our advocacy.  One of the solutions is that we need greater clarity of process around how these issues are tackled ‑‑ due process, of course, as well.  And there are actually many good cases to be made for strong encryption that we can use.  We're actually, I suppose, quite fortunate in this space that it's one of the issues where Civil Society groups, in general, and industry can see eye to eye on the benefits of strong encryption.  Of course, that doesn't mean that there aren't challenges, but the legal and technical responses which aren't human rights respecting need to be addressed.  So, for example, we mentioned backdoors, and maybe there are some issues around the other responses, too. 

And then you mentioned lastly how there should be a shift around what is considered the norm.  And a lot of our advocacy work in this space is in trying to shift the debate, especially in high‑level policy spaces, from one which paints strong encryption as primarily an enabler of crime to one which recognizes its integral value as an enabler of human rights and of trust and security in the network.  And so where, basically, it would just be inconceivable for people to demand anything but strong encryption.  So hopefully we're moving in that direction.  And what's really important is legal and policy responses that ensure that reality.  And we'll come to what those might or might not look like in the breakout session. 

We technically have just half an hour left.  But we started a bit late.  And I'm aware that in the following session, we're getting into the detail of policy.  So there might be some questions from the floor for our panelists.  So I'd like to take just under five minutes for any questions, should you have them.  If you could introduce yourselves.  And if you have a question for one of the panelists in particular, if you could direct it at them, that would be great.  So I'm opening it up now to the floor for any questions.  Yes. 

>> AUDIENCE: Hello.  My name is Daniel.  I'm from IGF Brazil.  And what I really think is that we should be talking more about the future.  (?) Microsoft has a quantum simulator.  Google has a quantum computer.  Encryption will change drastically.  And the governments ‑‑ get over it.  What we should be discussing is how to not let them do this kind of thing.  In Brazil, we had WhatsApp blocked two times because of criminal investigations.  And bills will be passed.  What should we really do for the future? 

>> SHEETAL KUMAR: Thank you for that question.  Which I'll definitely put to the panelists in a second.  But I wanted to make sure ‑‑ we've got a couple questions, at least.  Are there any other questions?  At this point?  No?  Okay.  So there's just the one pretty challenging question, I think, about the future. 

[ Laughter ]

Oh, excellent.  Excellent.  There's also a question from a remote person. 

>> AUDIENCE: Yeah, from a remote participant: a New York district attorney is calling for weaker encryption from companies, and he's saying it's not a backdoor.  How do we explain that it is a backdoor? 

>> SHEETAL KUMAR: Okay.  Thank you very much for that question.  So we'll take both of those questions together, I think.

The first one on the future of encryption: the challenges that certain developments in computing might bring to this debate.  And then secondly, more back to the present day: what are some of the arguments that can be put to those who argue for regulating encryption, making clear that those requests are a form of asking for weakened encryption?  Would you like to take that one? 

>> RICHARD WINGFIELD: Yeah, on the future, that gets back to the discussion we had before about encryption by default.  I think a lot of effort in the Technical Community is being put into making sure that you don't have to think about enabling encryption, right?  And I think that's something that we see with, again, WhatsApp and messages and encryption by default.  And I think that's very important as well, because I think in the future you'll see, again, more and more encryption by default across services, from a technical perspective and not a legal one. 

And I think that's important because I had a discussion with someone the other day who said that the challenge, when you make encryption a choice for users, is that there are countries where people will be bullied into turning off encryption or not using encryption, or where encryption is criminalized.  So I think (?) The more it's a default thing in the background that people don't have to think about, the better. 

On the second question, well, you know, there have been reports from several eminent technical experts in the past that have really, I think, confirmed the fact that you cannot have, again, good encryption with a sort of backdoor access for the good guys but not the bad guys.  When you talk about weaker encryption versus (?) At the end of the day, either criminals or foreign countries will use it in one way or the other.  So I think that's really a fine line. 

And, of course, I raised the question in my intervention: is there a level of sufficient encryption that sort of, again, enables government to have some way to circumvent it while leaving a general level of protection for people?  But I think that's, again, a really tricky question, because in a way, from a technical perspective, you either turn on encryption or you don't.  I think that's one of the challenges that seeps over into policies. 

>> SHEETAL KUMAR: Thank you very much.  I know, Gbenga (?)

>> GBENGA SESAN: Yes.  When it comes to the weakened encryption conversation, there are two things when we talk about encryption.  There's (?) And there's opportunity.  So if you weaken encryption, does that remove the intent?  It doesn't.  If you weaken encryption, it actually increases the opportunity.  So what you do with weaker encryption is you do nothing to the intent you think you're trying to correct, but you increase the opportunity.  It's like, you know, locking all the doors in the house but leaving just one open.  If you live in a city where every weekend you read reports about robberies, you probably don't want to do that.  Or would you want to fly a plane with a weaker engine?  I'm not sure you'd want to do that. 

And in terms of the future, one of the things that I'm sure, you know, every engineer who worked on (?) And everything around the Internet ‑‑ one of the things that I'm sure they also talked about is: if we had an opportunity to start all over again, everything would be designed with encryption.  So going forward, the Internet of Things and everything that has not yet been named will now give us an opportunity to get this right.  And I think that is in all of our interests (?). 

>> SHEETAL KUMAR: Thank you very much, Gbenga.  So I feel like a few points are coming through very strongly around what human rights respecting encryption policy might look like.  And one of those is that strong encryption by default should be the norm.  And where there is any reference to encryption in policy and law, it should protect strong encryption.  There are nuances, obviously, to this, and there are ways that policies and laws are written which may complicate this.  There are different types of laws and policies, from cyber crime acts to strategies, all sorts of different types of instruments that deal with this issue.  And what we want to do in the next session ‑‑ and I'm going to invite my colleague, Richard, to help facilitate it ‑‑ is to move this perhaps somewhat abstract conversation to the real world, to actual laws and policies, and consider whether those might be human rights respecting or not. 

So what I'm actually going to ask you to do ‑‑ and it's a pretty radical thing to do at the IGF ‑‑ is to stand up.  Please stand up.  And then Richard, would you like to come over and help us to get into this next session?  So ‑‑ yeah. 

Okay.  So if you ‑‑ if you find the people next to you and group together, I know that the chairs are mobile.  So you can move them around.  We have ‑‑ if we have seven groups, that would work really well.  So perhaps just naturally find the people around you to group together with.  And I would ask Nikolai to join one of the groups, too, in case you're interested.  And then we will distribute a table and a set of excerpts of policy and law, and we would ask you to consider which of the excerpts relate to the criteria which are set out in the table, have a discussion about that, be able to justify it when we come back in ten minutes.  And so it should become a lot clearer when you see the table and the excerpts, so we'll go right ahead. 

All right.  So we are going to get started.  I know that there was some disagreement, and there was also some confusion about how to use the table, but through discussion we should hopefully come to some interesting conclusions.  Can I ‑‑ first of all, ask for volunteers, then we can go around the tables.  Can I ask for a volunteer to say whether they found any policies which were clearly human rights respecting on this table?  Does anyone want to volunteer?  Were there ‑‑ yes. 

>> AUDIENCE: So we weren't completely sure, I guess, but we found that policy number 8 was quite respecting. 

>> SHEETAL KUMAR: Yeah.  Did anyone disagree with that?  No.  Okay.  Great.  Fantastic.  Was there anywhere where it was clearly, clearly not human rights respecting?  Number 3.  Not sure.  So number 3, which is one relating to decryption.  Why were you not sure about that one?  Okay.  Did anyone else ‑‑ did anyone else have any opinions on number 3? 

>> AUDIENCE: Oh, sorry. 

>> SHEETAL KUMAR: Another one.  Yes.  Over there. 

>> AUDIENCE: Yeah, we definitely found number 3 not human rights respecting.  Actually, almost all of them except 8 and 9 were in that category for us. 

>> SHEETAL KUMAR: Okay.  And why would you say that number 3 is not human rights respecting?  There's language there regarding safeguards, for example. 

>> AUDIENCE: The language, the safeguards, are nowhere near strong enough.  "The official reasonably believes."  (?) We found the safeguards to be nowhere near sufficient in relation to being human rights respecting. 

>> SHEETAL KUMAR: Okay, thank you.  I know we have a lot of experts in the room who work on this issue a lot.  I was wondering if you have any opinions as to whether decryption orders can be human rights respecting, or whether the language is strong enough.  Yes. 

>> AUDIENCE: Sorry.  I'm not generally in favor of decryption law in general, but I would say number 6 comes fairly close, in terms of how it is a technical assistance requirement that only requires decryption where the provider itself has provided the encryption, not in situations where the user has added an extra level of encryption, and there's a level of judicial review that's included.  So this is not my favorite thing, but it at least comes a little closer than any other examples that we've seen. 

>> SHEETAL KUMAR: Thank you very much.  There's some criteria there in what you mentioned about what strong‑enough safeguards might look like when it comes to decryption orders.  Did you have a comment there on that table? 

>> AUDIENCE: On number 3, (?) The order.  You can order somebody to decrypt information, but not somebody who's accused of something, and it does not distinguish between the two. 

>> SHEETAL KUMAR: Okay.  So there's an issue with the lack of clarity in the text there in the language there.  Thank you. 

>> AUDIENCE: Thanks.  Maybe I'll add to that (?) But what I do note about 3 is that it's quite specific on the search warrant (?) It meets the criteria. 

>> SHEETAL KUMAR: Okay, great.  We had to de‑identify these, of course, because of the space that we're in.  But like I said, these are taken from real laws and policies.  Obviously there's a disagreement here about whether adequate safeguards exist in some of these decryption orders, for example.  Were you convinced by what that table said about 3? 

>> AUDIENCE: No. 

>> SHEETAL KUMAR: Okay.  So stronger safeguards needed.  What about number 6?  That's pretty strong, from what we've heard. 

>> AUDIENCE: It may have some prejudice against the interior. 

>> SHEETAL KUMAR: Okay.  That is interesting.  Okay.  So we have a couple minutes left.  Were there any others where people felt they wanted to share their views on how the language could be said to be either human rights respecting or not or wanted any clarification on any of those?  No?  Yes.  That side. 

>> AUDIENCE: I'm just wondering, because I'm reading the laws here and some of the discussion we had before this, it feels like there's a lot of self‑incrimination involved there, because if you're in court, they can't exactly ask you whether you confess to the crime.  (?) And in that case, isn't decrypting the data itself self‑incrimination in some cases?  (?) But I think it involves cases where you're ordered to decrypt something to show a message you have.  And the next thing you know, the prosecutor says something else? 

>> SHEETAL KUMAR: Yes.  I think the question of self‑incrimination is one of the more interesting aspects of the debate.  It certainly relates to decryption very clearly.  And I understand the laws vary by jurisdiction when it comes to self‑incrimination, whether you have a right not to self‑incriminate.  But Richard, did you have any comment on that particularly? 

>> RICHARD WINGFIELD: Yes, you're right.  There is a general sort of right to due process, and the right not to self‑incriminate varies a lot from jurisdiction to jurisdiction.  So maybe one question, on the excerpt which a number of people said they felt wasn't human rights respecting: I wondered why, as a law that requires witnesses to assist courts in providing evidence or documents upon a court request, you think this would not be a human rights respecting piece of legislation?  (?) The fact that it could be private.  Okay.  That's good to know. 

>> SHEETAL KUMAR: Okay.  Thank you very much.  I'm afraid we're going to have to wrap up here.  But I hope you found that instructive and thought provoking.  There are a few things that I feel did come out of the discussion here, and certainly have come out of a lot of discussions on this issue in the past few years.  One is that, from a human rights perspective, strong encryption by default should be the norm, because of its important role as an enabler of human rights and of trust and security in the network.  Any restrictions should be very limited and targeted, and the law should be clear, which often isn't the case.  There should be due process and very strong safeguards against abuse.  And we heard what those might look like. 

And then we lastly touched upon where the debate is moving.  The use of encryption workarounds is part of that, of course.  There are various different types of encryption workarounds.  We've talked a bit about decryption, which is one of a number of encryption workarounds: when law enforcement, for example, can't access information which is encrypted, they will find other ways, and one of those is obviously decryption.  There's not necessarily agreement on what that might look like for it to be human rights respecting, perhaps, but some idea of what that would look like: very strong safeguards, for example.  But also, what I think is a really interesting aspect of this debate and where it's moving is lawful hacking: the exploitation of vulnerabilities that do exist in software, because humans make it ‑‑ using those that are already there, as opposed to actually introducing backdoors ‑‑ which is happening, and whether or not that might be a solution going forward.  And if it is, which is a very contentious topic, what that would look like and how much transparency is needed.  And I know that's going to be the topic of a debate on Thursday, I think it is.  So hopefully I'll see some of you there. 

Lastly, before you go: we have, as part of our work on this issue, developed some tools to support human rights advocacy on this issue.  I have a few copies of our guide to encryption policy for human rights defenders, which explains the link to ‑‑ I'll find those in a second ‑‑ the link to human rights.  First it explains what encryption is.  It explains its relationship to human rights, the spaces at the national, regional and global levels where encryption policy is being made, and some key messages that advocates can use in their advocacy.  So if you're interested, please come and talk to me or Richard.  And it's also online.  So you can find the guide to encryption policy for human rights defenders by Googling that ‑‑ by using a search engine of your choice.  Sorry.  Whoops.  By using a search engine of your choice, or going to the GPD website.  So thank you very much for staying, and for contributing to this discussion.  I hope you found it useful.  And I'll see many of you around soon. 

[ Applause ]

(The session ended at 18:20.)