IGF 2016 - Day 4 - Room 2 - WS196 - On cybersecurity, who has got our back?: A debate

 

The following are the outputs of the real-time captioning taken during the Eleventh Annual Meeting of the Internet Governance Forum (IGF) in Jalisco, Mexico, from 5 to 9 December 2016. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 

***

>> SHEETAL KUMAR:  Hello.  Okay.  Great.  Thank you so much everyone for coming.  I know it's Friday.  It's the last day of the IGF, so you never know if people are going to be able to make it, so very grateful to you all for making the time to come to this session. 

I will be chairing the debate today, and my name is Sheetal Kumar.  Before we get started, I'll let you know what the format of the debate will be.  I will introduce the speakers, we have four speakers, with brief bios, before I make introductory remarks, and then each speaker has two minutes to make an opening statement.  Then I will pose one or two questions to the fellow speakers, who will have a chance to respond, and they'll also have a chance to respond to each other's points.  And then we'll open up the debate to the floor, and I actively encourage you to participate.

You can ask a question to one of the speakers or to all of them.  And then each speaker will make a one‑minute closing statement before I make some very brief closing remarks.

So, we'll get started right away.  Our first speaker is Brian Bergstein, from 2008 to 2010 the Associated Press Technology Editor before joining the MIT Technology Review.  He has authored a number of articles on the crypto debate, including coverage of the FBI versus Apple case earlier this year.  Our next speaker started with GEO in 2004, and his career track led to the Open Society Foundations, where he managed the media program for four years.  He supported nationwide efforts on professionalism, ethics, the open Internet, and advocacy for access.  He's been a founding member of the Pakistan Coalition on Media Safety, the Pakistan Coalition for Ethical Journalism, and the Alliance for Access.  He has represented the Pakistani media development sector at the national and international level.

We're also very lucky to be joined by Tatiana, whose work covers international standards to fight cybercrime, comparative analysis of cybercrime regulation, self‑ and co‑regulation, public/private partnerships to address cybersecurity issues, and the multi‑stakeholder approach to fighting cybercrime.

And I believe we're also joined by Dominique Lazanski; although, I can't see her right here.  Can you give us a wave?  I think she was running from another session.  She will be joining us soon.  She works on Internet policy and Internet governance for the GSM Association and is walking toward us now, which is great.  Thanks, Dominique, for joining us.

She began her career with positions at Yahoo! and at Apple, where she helped launch the first iTunes store in the U.S.  She also worked on a conference in London, where she secured business participation.  Since joining the GSM Association in 2013, she's led the members' ITU and Internet Governance Task Force, so I'm sure you'll all agree that we have a really great panel of speakers here to share their expertise with you and their perspectives, and differing perspectives, on the issues, and I'm looking forward to them being able to do that over the course of the next hour.

We often hear we live in an age of data, our speech acts, our shopping, our work, and so many aspects of our lives intersect with and depend on digital communications platforms, which generate huge amounts of data on a daily basis, every click, every tap, every swipe.

And yet, we do that on what we often hear is a network that wasn't designed for security.  The security of that data, and of our trust in using the platforms we use, relies, in part, on encryption, which protects, among other things, the confidentiality, authenticity, and availability of information.

And if we also understand that those same properties are central to cybersecurity, or the preservation of information and the network's underlying infrastructure, then encryption is central not only to our lives, but to use of the Internet itself.  So why, considering all of the benefits of strong encryption, would we ever debate or consider restrictions on encryption, or weakening encryption standards, at all?

Restrictions can include government licensing requirements, export controls, encryption key disclosure requirements, and policies designed to increase government access, while other measures can instead stimulate the adoption of strong encryption standards.

As use of the Internet has become more widespread, private‑sector companies that are the gateway for our use of the Internet via electronic devices have moved to increase the use of strong encryption.  In particular, since the Snowden revelations in 2013, a number of the big device manufacturers have moved to put in place full disk encryption and end‑to‑end encryption by default, meaning they don't have access to the data on the devices they develop or to communications provided through their services.

What happens when law enforcement has the legal authority to access data but not the technical ability to do so, to access a device or decrypt encrypted data?  This issue reignited debate following the media coverage of a confrontation this year between the FBI and Apple, when the FBI sought access to an Apple smartphone which Apple claimed it couldn't itself access, and which it refused to help unlock.

This debate is in no way limited to the U.S.  It has led to calls from law enforcement agencies for new measures that would allow exceptional lawful access to data, and a patchwork of legislation across the world:  South Africa, Finland, and Belgium are among the countries where key disclosure laws exist, and countries from the UK to Russia to the UAE to France have introduced legally‑mandated data retention laws, which can compel ISPs to decrypt data as well.

But what many human rights defenders point out is that these measures undermine the rights to privacy and to freedom of expression by exposing personal data to unwanted access and increasing the vulnerability of data to malicious actors.

And what of cybersecurity, the resilience and stability of the Internet itself?  One of the arguments Apple made was that breaking into the device would introduce a vulnerability to the network that could not be contained, and that a Pandora's box of technical insecurities awaited.  Is that true?  Is there really no middle way?  What are the legal and technical solutions to ensuring that the Internet is strong, secure, and resilient, in a way that is human rights‑respecting and allows government agencies to do their job in accordance with the rule of law?

As more and more people and devices connect to the Internet, I believe this is a policy debate we can't afford to get wrong.  So, on that note, I'd like to invite our first speaker, Dominique, to make a two‑minute opening statement.  Dominique has to leave half‑way through, so thank you so much for joining us.

>> DOMINIQUE LAZANSKI:  Thank you, and sorry for being late.  And, I apologize in advance for leaving, I think it's the last day, so I think we're all running around doing reports and various other things.

And I want to thank you, actually, Sheetal, for inviting me.  I'm just going to offer a couple of thoughts, in light of what the opening statement was, and I'd prefer to, quite frankly, hear from you and have a bigger discussion.

So, in thinking about cybersecurity, and in particular, in both the business environment and the mobile environment, there are a number of things that we think about internally.  And again, I also have some pamphlets from ICC BASIS, which is the business group or the organization that comes here and represents us here, and I'm going to leave them with you, as well.

A couple of different things.  I mean, I think getting around the legal requirements is something we want to touch on.  I'm going to look at just legal requirements and standards in my intervention, briefly.

First of all, in terms of legal requirements, I think a lot of us here who work in the industry have to address the fact that there are a number of growing tensions that continue to arise.  The rise of encryption certainly challenges us as operators, as well as the business community, to figure out how we are going to comply with different legal requirements as well as manage our networks, particularly in a changing environment.

And in Europe, where I'm from and where I live, the NIS Directive, the network and information security directive, was a discussion that, you know, took place over the last couple of years, and it was quite interesting to have a long discussion about how we, as industry, handle risk reporting across the different ecosystems on the network, through the different network operators, through the content providers.  And how we deal with that, in terms of not only just across Europe, but also complying with legal requirements in other countries, in‑country, as well.

So, we've been thinking quite a lot about that.  And, as we see in a lot of developing countries, many times the legal requirements around cybersecurity are quite strict and quite strong, for good reason, whether it's national security or whether it's, you know, a very nascent and growing market environment.

It's something that we think about a lot, and it's something I'm bringing up and highlighting here because I think it's a good discussion point to have going forward.

But the other thing I, kind of, want to just touch on is the Internet of Things and the growing security issues that we see, from a mobile perspective, obviously, but also, in general, from business.

Standards are at a very early stage for the Internet of Things, as are devices, connectivity, authentication, all of the different aspects and different layers of IoT that will enable IoT to happen, from low‑powered networks to various other areas.

And so, when we think about security and cybersecurity, I think the main focus in the coming years, in the coming months even, will be privacy around that issue, will be connectivity, will be security, will be data, data collection, big data.  Many of these topics were discussed throughout the week, and I think this will become even more evident as the standards develop and as the different objects and networks mature.

I would just say, perhaps from an IoT perspective, we at the GSMA have developed a very flexible, very agile security framework for IoT.  We're in our second iteration of that.  The first one was launched last year in March, and the focus of the IoT security framework that we have is that it really looks at, you know, network security but endpoint security as well, ensuring that consumers, as well as individuals, as well as operators who deploy these particular devices, have all the tools that they need to do basic things, like change passwords, and ensuring that they have confidence and trust in those devices.

So, going forward, we look at flexibility and interoperability as key, and that goes hand in hand with developing standards that are flexible and interoperable.  But the connection is there between privacy, the collection of data, and how all of these things are going to develop.  We actually still don't know that at this point.

So, I'm going to leave it there.  I hope that was somewhat thought‑provoking, but I would like to sort of open it up for more questions.

>> SHEETAL KUMAR:  Thank you very much, Dominique.  Your points definitely point to the complexity of the issue and the need to engage all stakeholders, but also to some of the active or proactive measures that industry is taking to try and address this issue.

I'd like to invite Brian next to make his opening statement.  Thank you.

>> BRIAN BERGSTEIN:  Thank you.  It's good to be here.  It's an honor to be here.  There is so much to take on.  I know you asked me to only stick to a few minutes here, and I want to make sure we have time for a robust discussion, so I'll just try to hit the top level of a few things.

One of the questions that you asked us to debate in this session is whether tech companies should share our, you know, consumer data with governments.  And I think that, you know, the answer surely is, it depends, in general.  In principle, the answer is, yes.  Of course, not in all circumstances, not in all countries, not without due process, not through bulk surveillance.

There are a few things I would like to get everybody thinking about, because even if you agree with the premise that, sure, there are instances in which it's appropriate for governments to have access to consumer data, I think it's useful to examine our assumptions about why that is so.

So, the first thing, I think, is that this idea does not have to at all be in conflict with our right to use encrypted communications.

For one thing, plenty of our data right now remains unencrypted, so it might be used by tech companies in the course of their business.  And with due process and proper procedure, I think there are times when it's appropriate for tech companies to share what they know when law enforcement, for example, has a legal right to that information.

It's been understandably tempting since Snowden to argue that tech companies should distance themselves from government, whether we're talking about national security services or local police.

And after all, it's natural for us to want more privacy and more protections against misuse of our personal data.  The only thing I wanted to get across is that it is possible for this distance between tech companies and governments to grow too large.  I'm not saying the gap is too large now, I'm not saying I buy into the going dark argument, but I'm saying that there is a big caveat here.

It was interesting to read the recent UNESCO report on human rights and encryption.  It said encryption, generally speaking, is necessary but not sufficient to protect people and sensitive information in a networked world.

So, I thought about that.  Okay.  Well, if it's not enough, if it's necessary but not sufficient, what else could we possibly have to give us these protections that we need, the protections that the technology alone cannot provide?  And the answer is, our laws and our institutions.

So, there are both ethical and practical reasons for nurturing these institutions and the rule of law.  I know I only have a couple of minutes, you know, but on an ethical basis, I think democracy and civic participation, you know, depend on a world in which there is some transparency.

If we imagine a world with no limits on privacy at all, I don't think we could hold elected officials accountable, let alone criminals.  And privacy, like freedom of expression, is not an absolute right.  There are limits to it.

But, there was something that I heard this week in another workshop that really stuck with me, which is the practical side of this.  If governments have no efficient, and thus auditable, process, a process that can be monitored, for obtaining data about people, and again, obtaining the data only under due process when warranted, not in bulk surveillance, if they don't have a process that is actually respected and nurtured, they will often hack their way into systems that hold that data.

And we know they will do that.  We've seen it done.  They will install malware that facilitates surveillance that they cannot get through other means, and that has far worse consequences for human rights, and transparency is diminished in that scenario.

I don't want to go over my time, and I want to leave lots of room for this to continue, but I think it's important that we keep asking the question, not just whether something is good for privacy, but how that measures against our other values.  Is it good for democracy?  Is it good for civic engagement?  Is it good for human rights?  I think we should strive to nurture and reform the processes through which law enforcement can, at times, obtain consumer data, and that's not in conflict with the right to encryption.  So, I hope technology will work to strengthen the institutions and not to weaken them.

>> SHEETAL KUMAR:  Thank you very much for bringing those questions to the debate, and I'm sure we'll talk about how policy measures can transfer from one regime to another, particularly as we come across rather repressive political systems where one set of policies might not work for democracy, or in the interest of democracy, as it might in another political system.

So, we'll come back to those questions.  At this point, I'd like to invite Tatiana to make her two‑minute statement.

>> TATIANA TROPINA:  Thank you very much.  I will be speaking from my experience as a lawyer.  Sheetal, you referred to human rights defenders, you said many human rights defenders make this statement, and I'm sorry to say, but I think that human rights defenders are wrong, just plainly wrong.

To put this debate into an all or nothing context doesn't do good for anyone, because this very question, which is posed for this debate, doesn't provide a distinction between different types of encryption, different types of data, and it completely blurs the borders between the legitimate demands of law enforcement agencies and the dangers of a Pandora's box with regard to encryption.

I think that, first of all, we have to think about safeguards around getting access to data.  If you look from a historical perspective, interception of communications, which is also data, or interception of letters, has existed as long as criminals have used phones or correspondence to communicate with each other.

Under existing telecommunications laws in many countries, providers have to capture communications for criminal investigations, and they have to provide them to law enforcement agencies in readable format.

What is specific about this is that, in most countries, law enforcement needs a court order for this.  They need a court warrant, and it should be given against a specific person for a specific crime, and I think this is a completely legitimate demand if we're talking about that data.

The problem, as in the FBI/Apple case, is that the disclosure of encryption endangers everyone, when you cannot disclose the information, when you cannot provide the key for this encryption, without endangering all of your customers.  But I do believe that, in the case of government access with a court order, it's a completely legitimate demand for a crime investigation.

Then, the statement doesn't distinguish between different types of data:  metadata, like traffic and subscriber data, geolocation.  In most jurisdictions, these have different safeguards, and then again for different purposes, for crime investigation, for intelligence.

So, I do believe that companies shall not allow access to encryption techniques.  They shall not allow access to master keys and shall not provide backdoors to encryption.  Here we have a full stop.  When it's about crime investigations, when it's about legitimate demands of the law enforcement agencies, which are submitted with due process, with safeguards, with court orders, of course governments, law enforcement, should be allowed to access data for legitimate purposes.

But only under strict safeguards and only for individual crime investigations, and that shouldn't be done upon bulk data.  I don't think we'll get rid of it, I mean, this is just reality, but I think it should be limited in its applicability.

>> SHEETAL KUMAR:  Thank you for exposing the, perhaps, unhelpful black and white nature of the debate that can sometimes be used to frame this question.  And on that point, I notice there are already a couple of hands up for questions, which is really great, but before we go on to the floor debate, I'm just going to ask our final speaker to make a two‑minute statement.

>> PRANESH PRAKASH:  I'll just move here.  Can you hear me?  Okay.  So, I think the question ‑‑ thank you so much.  The question for me is whether or not there should be legal means to get around encryption, or whether or not governments or law enforcement agencies should be given authority to access encrypted data.

Let me start by saying, there is absolutely no other way of saying it, that a weaker encryption system is definitely no excuse for anything.  It does not solve any problems, it does not provide any solutions to anything.  A weaker encryption system can be exploited, whether against the government or against other groups in the world, that's one.

Also, you know, I've been hearing this debate here at the IGF, whether or not there should be this idea that a backdoor should be built into an encryption system for governments and law enforcement agencies to use.  And I feel this debate truly lacks a global perspective, because when we say government as a stakeholder, we need to understand that not all the governments around the world are the same.

There are governments who are responsible for, and actively involved in, bombing their own citizens.  There are governments in the world who are actively involved in threatening journalists, killing them, threatening human rights defenders, killing them, and so on.

So, we need to understand that such a power, given to law enforcement agencies or the government, can definitely be used against those very people, and we have seen this, you know, we have seen this happening in Pakistan for a very long time.  So, I think this is one point that we should really consider, you know, when we are talking about it.

And, I think, for me, there is another point, which is that this is not really a debate between security and privacy anymore.  It's not about privacy.  So, if you want access to me, it's not just access to me or my private messages, it's my bank account number, the 10 applications that I'm using, all the data that I have on them.  So, that is security for me.  That is security for citizens.  So essentially, what we are talking about is security versus security and not security versus privacy.  So, that is, I think, another element we need to consider in our debate.

And frankly, I agree with what was said about encryption and, you know, the UNESCO report, to some extent.  It's necessary, but it's definitely not sufficient, not the only thing, and we also need to, sort of, seek security through other means.

So, let me just give you an example from Pakistan.  This new law, which is apparently the cybercrime legislation in Pakistan, allows and gives access to the government to seize anybody's data, anybody's devices, on a mere suspicion, and hold it for 24 hours.  And the person whose data is being held is required to give up the encryption keys, the passwords, and everything.

And this process does not require, you know, a court warrant.  So, we need to understand that while we are having this debate about encrypted data and whether or not law enforcement agencies should be given access, there are countries, and there are laws, which have already found a way around it, and they are using it against journalists, human rights defenders, and civil society in general.  That's, perhaps, I think, another point we should consider in our debate.

Lastly, I also need to, sort of, have a look at the other side of the argument.  Here is a hypothetical situation.  What if it means stopping 10 attacks on civilians and saving hundreds of civilians, right?  How does that argument stand against the argument for encryption, the argument I'm making here?  I think it's a question open for the debate, and we can talk about it.  Thank you very much.

>> SHEETAL KUMAR:  Thank you very much, and also for bringing in the perspective of the situation in Pakistan, which shows just how complicated it can be when democratic countries legalize certain measures that can, quite evidently, result in the violation of rights if they don't follow rule of law safeguards, including, for example, ensuring that there is a court warrant before compelling a user to allow access to their electronic devices.  That's clearly a worrying trend.

At this point, I just wanted to ask, would any of the other speakers like to respond to any of the others' points?  If so, please do.  Otherwise, since there are already questions on the floor, we can take those questions.

Okay.  Great.  So we'll take the question from the floor there.  The lady, please.

>> AUDIENCE MEMBER:  Thank you.  I'm Salanieta from Fiji, speaking in my personal capacity.

In a past life, I used to be a group regulatory counsel for a telco, so I know what it's like to receive warrants or to release information to law enforcement officers ‑‑ I mean, to give information to law enforcement officers on a subject on receipt of warrants, that's one.

I've also been a criminal law attorney; although, I don't practice that quite as much nowadays.  But, having said that, I hear what Tatiana is saying in relation to, absolutely, you know, there should be no release of data unless there are, you know, proper warrants.  And of course, the evidentiary rules and procedures differ all across jurisdictions.

I really like what Pranesh said.  If you look at the bottom line, you have completely totalitarian states, which are completely not democratic, with 100 percent surveillance, which deliberately throttle Internet access, like in Iran, and even in Ethiopia recently.

You know, they literally force the telcos to throttle access, and then on the other hand you have democratic states, partially democratic states, which have more efficient check and balance mechanisms; although, there is no perfect state.

Having said that, I would like to ask, who has our back in terms of the security debate?  Just hearing the three panelists, the three debaters, debate the issue, I would say we really need to consider what was said on day zero and what the technical values workshop, sort of, suggested.  We need to look beyond design and architecture, look beyond code and encryption, and look into philosophy in terms of, how do we infuse ethics, ethics across all borders, that can be universally applied, regardless of the type of government, type of judicial system, diversity of evidentiary systems.  Thank you.

>> SHEETAL KUMAR:  Thank you very much for that comment, and I will ask the speakers to respond, should they wish.  And, I just wanted to make a brief remark there, just to offer some more food for thought on that comment.

We often hear that there have to be legal and technical solutions to these issues, and what's interesting, is that it's really not either or.  So, pushing for more privacy by design, privacy in the base layers of the infrastructure of the Internet, is certainly something that can be done, but it also requires an enabling policy environment.  And some would argue, that it doesn't help, necessarily, if trends or policies are moving in a direction which would not enable those types of innovations to be made, so they certainly interlink and reinforce each other.  So, that's just one brief comment I wanted to make.  And, if there are any other questions, we can take those now before responses from the speakers.

Can I ask the gentleman there?  Yes.  You were second.  Thank you.

>> AUDIENCE MEMBER:  My name is Raj, from India, representing mobile operators.  The question I have is:  are governments, especially democratic governments, constrained by the same set of guidelines in their activities vis‑a‑vis cybersecurity outside of their borders as they are within their borders?

>> SHEETAL KUMAR:  Thank you for that question.  We'll take one more question before the speakers respond.  Yes, please.

>> AUDIENCE MEMBER:  I'm Michael Nelson.  I handle global public policy for CloudFlare, a web security firm.  We protect about 4 million websites from distributed denial of service attacks.  And in the last year, we've actually doubled the number of websites around the world that use HTTPS and stronger forms of encryption, so we have a real stake in building out a more secure web.

I have a question, which is actually the hardest question I've dealt with in cybersecurity policy in the 28 years I've been working in this area.

I was in the Clinton White House, and I had the onerous task of defending the encryption proposal.  At the time, it wasn't the debate between strong encryption and weak encryption, it was a debate between weak encryption and no encryption.  We learned a lot of things.  The most important thing we learned is that, if you're going to build a system for law enforcement access, you have to make really tough choices about which countries are going to be inside your circle of trust.

And so, the question I have for both sides of the debate is, if you are going to allow governments access to data, encrypted or not, under some regimes, do you have a way of telling countries, no, you aren't a nice country, you can't be part of our club of countries that have access to the data of potential criminals?  That is, actually, the hardest question we dealt with at the White House, and I believe that's actually why it never got international support.

>> SHEETAL KUMAR:  Thank you very much for both of those questions, which touch upon the difficult issues we come across constantly in Internet policy:  jurisdiction, differing laws, and differing actors intersecting on these issues.

I would like to ask Tatiana to respond, if possible.

>> TATIANA TROPINA:  Thank you very much.  I would like to respond to the last question, about the countries that do and do not have access to data.  I've actually been working in this field for quite a number of years.  And yes, there is a nice club of countries who have access to data, who share data between each other.  In Europe, the Convention on Mutual Assistance in Criminal Matters of 2000 allows direct access to data.  Sometimes it doesn't work, but sometimes it does as well.

The U.S. is a very specific case, because the U.S., for example, does not intercept communications under the mutual legal assistance regime, and there are some problems with handing data over to foreign law enforcement agencies.  But I do believe, if you look at mutual legal assistance from a historical perspective, it has always been mutual.  It has been bilateral.  So, even where there are multilateral mechanisms, the countries that participate in them sometimes have different agreements between them, between two countries, for handing over data, for easier procedures or better safeguards.

And yes, there are clubs of countries, and they share data between them for the purpose of criminal investigations, sometimes using easier channels, and sometimes it is a bit harder, and some countries are excluded.

You know, once I was at a very interesting meeting, I think it was at the United Nations, on mutual legal assistance in cybercrime investigations.  The U.S. was blaming China for the lack of safeguards and lack of cooperation in the system.  And then the Chinese representative came and took the floor and said, okay, so now, how many requests did the U.S. send to us?  Let's say 100.  How many of them got a response within 24 hours?  80 requests.  How many of them didn't have any response?  1 or 2.  Now, vice versa.  How many requests did we send to the U.S.?  100.  How many of them got a response?  1.  I mean, seriously.

>> SHEETAL KUMAR:  What year was that?  How long ago was that?

>> TATIANA TROPINA:  Like four or five ‑‑ no.  Three or four.

>> SHEETAL KUMAR:  So, I think, that shows not all is fair and just in the world of cyberspace as anywhere.  Thank you very much for those comments, Tatiana.

I wanted to ask if any of the speakers have any further responses to those questions before we take another question from the floor.  Yes, Pranesh.

>> PRANESH PRAKASH:  Thank you.  I just want to point out here that when we categorize governments as democratic and nondemocratic, we run the risk of oversimplifying things.  There are many gray areas, and there are many governments where there is very strong civil‑military friction.

And while we have this impression that all the law enforcement is actually being done by the civil actors, in reality, on the ground, it's not.  So, I think it's counterproductive to say that, you know, a democratic government should have this right and so on.

The second point I want to make is that this discussion, while it's extremely important, is also how very oppressive governments, in countries like Turkey and Pakistan and so on, justify their crackdowns on the Internet in these countries.  When we really talk about government as a stakeholder, we should think of the global perspective as well.

>> SHEETAL KUMAR:  Thank you, Pranesh, for that, and for reminding us to keep thinking in terms of the global perspective, which is a point I'd like to revisit before the end of our session.

At this point, I think there is a question there at the back of the room.  Do you have a mic?

>> AUDIENCE MEMBER:  I do.  Andrew from Australia.  I have two points.  I'll start with a short one.  As a software developer, in that capacity, when we hear it's not helpful to make the debate black and white, what that often sounds like to us is:  please nerd harder, nerds.  Surely you are smart people, you can solve unsolvable problems.  Of course, this is a simplification.  Of course, there are areas where it's truly not black and white.  There are legal and philosophical issues.  But some technical issues you cannot confront by saying, please don't make it black and white.  Sometimes there are underlying technical issues that are black and white.  It's very annoying to software developers to be told to stop making it black and white.

Now, my more substantive comment.  I would like to respond to the point that it's not security versus privacy.  It is, indeed, security versus security.  Consider what happens if my personal devices are compromised and the government gets access to my devices.  Like many people, I have legitimate access to systems in my workplace and also, perhaps, as a consultant, to clients' systems.  Once my personal device security is compromised, I now have several problems.  My bank account is vulnerable, it's difficult for me to certify that false evidence has not been planted against me, and I have potentially compromised the trust of all of my clients and my employer.

So, you know, there is a serious security problem for anyone when their personal device is compromised by law enforcement.  It can, in fact, be completely life changing, because you now have an enormous task to re‑establish the security of your entire life.

>> SHEETAL KUMAR:  Thank you very much for those comments.  Dominique would like to make a comment in response.

>> DOMINIQUE LAZANSKI:  I absolutely, I absolutely agree with you.  I'm a policy person and I work very closely with engineers, quite a lot, and this is the issue, because I think one of the things that's just been highlighted is that every country has a, sort of, very complex set of, you know, whether it's norms or legal frameworks or requirements or whatever it is.  And so, our operators in each country have to deal with that as they operate their networks in every country.

However, that's, sort of, more the policy side of stuff.  But from the technical point of view, the technical stuff works.  Right.  It works.  It's very, you know, it's very specific, and yes, we would like to see nerds nerding harder.  That's awesome, but it doesn't matter.

It's the tension we always have.  It's sort of supra ‑‑ it's supra‑jurisdictional, right.  It's international, it's global, it's all of that.  I think one of the things this conversation is moving toward is, kind of, addressing both of those issues.  And, I think that's a really, really good point, but it's something that we see all the time, and it's something that I find challenging when I go and work in the standards areas with a policy hat on.  It's a very difficult thing.  So, I would ask you, and I would challenge you to say:  what would you need from us, from a policy, from a government, from a framework point of view, to make your life easier in some ways?

>> SHEETAL KUMAR:  Great.  Thank you very much, Dominique, for being able to be here for the time that you were.

And, I can see that there are a few more questions there and responses.  I just wanted to go to the lady in the back because she's been in the queue for a while.  Thank you.

>> AUDIENCE MEMBER:  Thank you, Sheetal, and thank you to the panelists.  It's been a very interesting discussion.  My name is Samara, a legal researcher from the Centre for Communication Governance.  We've been engaging a lot with encryption policy in India.  I agree with Tatiana that when the government needs access to information, going through a judicial court order for checks and balances is incredibly important.

To give you some brief context, we had a national encryption policy, and it provoked a huge public outcry.  It was withdrawn within 24 hours.  All that is fantastic, but now we hear the Indian government is actually in the process of procuring technology which allows it access to encrypted information, from an Israeli company, the same one, I think, that assisted the FBI in the Apple case.

So, my question really is, in this context, when the government is accessing such technology, and it's the executive operating it, what is the process that we, as academics or civil society organizations, should adopt to ensure that checks and balances are inserted?

>> SHEETAL KUMAR:  Thank you very much for that question.  Was it directed at any one of the speakers or just generally?

>> AUDIENCE MEMBER:  Just generally.

>> SHEETAL KUMAR:  Okay.  Thank you.  Were there any of the speakers who would like to take on that very tricky question?  Yes?  Tatiana?

>> TATIANA TROPINA:  I can speak only from an academic point of view.  We do work with the government, and we do help them in legal reforms to implement everything with the safeguards, so this is what we are doing.  Because, I mean, maybe it's just because I now live in a democratic country, or what is considered to be a democratic country, but in any legislative process that implies access to data, for example, reform of interception of communications or reform of data retention laws, there are consultations, and we are able to contribute, and sometimes we even have a whole project for drafting these pieces of legislation and helping to reach agreement on them.

And yes, we always, always insist on particular safeguards, say, on due process.  And for example, Germany now has a data retention law, but a few years ago, when there was a major debate, there was disagreement between two ministries, the Ministry of Justice and the Ministry of the Interior, which had different positions on the data retention debate.  And for example, our institute issued a report which said there is no proof that data retention actually helps to prevent crime, there is no proof.  And because of this report, the data retention law was actually delayed for another few years, and I believe that this report also contributed to the fact that the data retention period in Germany is very, very short.

From an academic perspective, I can definitely say we're working on this, we're cooperating, we're trying to meet in the middle, because we do understand that sometimes access to data is necessary for crime investigation.  And, I also understand, because I know how law enforcement is struggling, so it's not either/or.  It is yes, but with very strict safeguards.

>> SHEETAL KUMAR:  Thank you, Tatiana.  And I think we have heard quite a lot in this debate, from both sides, that the issue is not the more data the better, but better capacity to analyze existing data.  So, that doesn't necessarily solve any of the questions, but I just wanted to make that point before asking the gentleman there for his question.

>> AUDIENCE MEMBER:  Hello everyone, my name is Ricardo.  I am from Brazil.  I am a lawyer and researcher.  I would like to just make a comment.  In Brazil, we have had some hard court decisions about whether it's necessary for the police to have a warrant to access personal data on a cell phone.

In one of these cases, like the one you have mentioned, the U.S. Supreme Court decision, Riley versus California, it was held that it's unconstitutional for the police to access a person's data on a cell phone without a warrant.  That's my view.

>> SHEETAL KUMAR:  Thank you very much for that.  And, I think that goes to the heart of why this debate around the FBI and Apple garnered so much attention, because it really makes a difference when a country like the U.S. sets a standard or sets a direction on these issues.  It can have an impact on countries well beyond the U.S., as we've seen in Brazil.

Brian, would you like to make a comment?

>> BRIAN BERGSTEIN:  Yeah.  So, I'm well aware, being American, right, that my perspective is different from that of many of the other people who have spoken, but I do think that ‑‑ let me just say that I think this discussion, and others I've been in this week, are a reminder of just how much technologies amplify already existing situations in countries.

So, where the rule of law is weak, where there is no difference between civilian police and military, it's a very dangerous thing to give, quote unquote, law enforcement access under the laws of that country.

The other thing I want to say touches on the last comment and some of the others as well.  The case you cite, Riley versus California, in which the U.S. Supreme Court held that the police need a warrant to search your smartphone, is really a fascinating decision, in the sense that it puts the smartphone on the same legal basis as other personal effects.  So, just as the police have the right, with a warrant, to look at my car, look at my drawers, look at my personal effects, they do have a right to look at my smartphone.

So, if they happen to, you know, know the password or guess the password, you know, the evidence on there is open, legally.

The thing that I just want to make sure I can stress in the limited time we have here is that, as much as we keep saying the debate shouldn't be black and white, and that defining government as a single kind of actor is problematic here for all of these reasons, because governments differ around the world, I think it's also important to keep in mind that it's very problematic to rely on companies as the defenders of our civil liberties.  That should worry us, right.

Again, I happen to be speaking in a country where some of these civic institutions, such as the court system, happens to be strong, at least for now.

But the fact is, no matter where you are, companies are not set up to reliably be defenders of civil liberties.  That's just not what they're in business to do.  You know, their interests can change at any time.  Their obligations to you are limited by what is in the user licenses that no one bothers to read.  And if you want to argue that a company like Apple has our best interests at heart, what if the technology fails?

What if we're getting our civil liberties from Apple and not everyone can afford Apple technology?  Wealthy people have more civil liberties than other people.  This is why, I think, it does come back to institutions, and we need to nurture our civic institutions, reform them.  And in countries where this doesn't exist, the technology is really a secondary question.

There was one thing that Tim Cook said in the whole Apple/FBI standoff this winter that I thought was really problematic and really stayed with me.  He was asked by TIME Magazine:  okay, we understand the computer security benefits of having full disk encryption on an iPhone, but what if, you know, there is some data on the phone that could be useful in solving a crime and that was the only way to get that evidence, or what if, you know, there was the ticking time bomb scenario, and you had to get into that phone?

He said, well, you know, I think, though, that this should be between you and the police.  If the police have a problem with you ‑‑ these are his words, if they have a problem with you; he didn't even say if they could get a warrant ‑‑ they should be able to come to you and make you give up your passcode.  Or if you don't want to give up your passcode, there should be some sort of penalty for that.

And that really stuck with me, because here we have Tim Cook essentially arguing that we should roll back what in the U.S. is the Fifth Amendment, the right against self‑incrimination, a sort of powerful, fundamental human right, at least as most courts have held.  He was arguing, in this scenario, that our need for optimal computer security on the iPhone might make it such that we should consider rolling back the Fifth Amendment and requiring you to give up your passcode.

I know some countries do, in fact, have key disclosure laws, but this is fundamental.  It's a constitutional right against self‑incrimination, and courts have held that, because your passcode is something you know, you do not have to reveal your passcode.

Anyway, I think it's a reminder of how companies are not optimal defenders of our civil liberties, and we need to balance the technologies that these companies are selling us with other efforts to strengthen our institutions and defend our rights.  It's not enough to retreat only into technology.

>> SHEETAL KUMAR:  Thank you very much, Brian, for that response.  We'll take a couple more questions before we wrap up.  I just wanted to see if there were ‑‑ I'm sorry.  I can absolutely see your hands.  I just wanted to see if there were ‑‑ if there was anyone else who hadn't yet made an interjection.  Please do, thank you.

>> AUDIENCE MEMBER:  Thank you very much.  I wanted to go back and rewind to the question Dominique asked before she left; I think it's a key question.  To give an example, the Dutch made a policy that they're not going to use backdoors, and that makes it easier for companies like us who are designing systems without backdoors.

The other thing that would be really useful is if countries adopted a lot more transparency in what they do.  I refer to it as mutually assured disclosure.

You can collect a lot more data from me if you tell me why you're collecting it, how you're collecting it, how you're going to protect it, and how you're going to use it.  And that's the world we're going to go to.  There is going to be 50 or 100 times more data about me in 10 years, and I really want to know where that is.  And it's going to be incredibly useful for law enforcement to have access to the data with proper transparency and proper warrants.

So, I guess I would ask the group whether they've seen examples where transparency can build trust, and if they've seen examples, to quote our friend from Australia, where countries have come to understand that more security, more encryption, leads to more personal security and national security.  Because we have to tell that story.  We need some good stories.  I've heard a couple from Access Now; there is a poignant one about a woman in Mexico who was reporting on the drug gangs using her cell phone, until the corrupt police in that town, who were threatened by the drug lords, got ahold of her phone and cracked it.  She got killed.  She was bravely fighting the drug gangs, and she could only do it until her encryption was breached.

So, we have to tell stories like that.  And the whistleblower who uses encryption to expose corruption or a lack of security couldn't do it if there was surveillance on his line.  Let's tell about 10 of those stories and get the message out.  Thank you very much.

>> SHEETAL KUMAR:  Thank you very much, and I just wanted to pick up on that question that you had.  Are there any more good stories or good cases that we can point to, to help us through this very messy and tangled terrain?  But before that, if you would like to make your comment.  Thank you.

>> AUDIENCE MEMBER:  Thank you.  I would like to pick up on a point that Brian made, which, I think, is very critical.  It's a call for reflection.  So first, I want to put on my MAG hat and say that some of the things that are coming out of this workshop are really critical.  In terms of taking stock of the IGF, and also its future evolution, there are certain critical questions that the community needs to ask, and not limited to this room, but certainly, potentially, through intersessional work or whatever it could be.  I don't want to dictate anything; it's just, sort of, a suggestion to you.

Now, I'm going to take off that hat and speak in my personal capacity, if that's okay.  If we look at global trends, look at the recent U.S. elections ‑‑ Trump is now president.  The media was traditionally perceived to be the watchdog of society, but now you have individuals taking back the power and, sort of, utilizing social media to, sort of, push and lobby and what not.

So, in the context of cybersecurity, if you look at what's happening across the world, I won't speak to the northern hemisphere, I'll point to one country in the south, which is Australia.  One of the repercussions that we're seeing, because of fear and also increasing cybersecurity trends, is that Australia's Attorney General released a press statement in September of this year saying they want to pass a law; there is a proposal in their parliament to ban the de‑anonymization of anonymized data.  That has massive implications.  It's something we were seeing four years ago:  look, this is where we're headed.

And, really, what got me interested in this particular session is:  who has got our back?  Clearly, it's not the private sector.  Clearly, it's not mainstream media.  I would say, in a multi‑stakeholder community, it's all of us.  And I'd like to take us back to really considering what are, you know, some ethical standards that we, as a community, whether law enforcement or government or civilian, can, sort of, subscribe to, regardless of differently polarized political systems.  Thank you.

>> SHEETAL KUMAR:  Thank you very much.  Thank you.  We have four minutes left, so I would ask for one more question before we wrap up.  Please do.  Is there a mic roaming at this point?  Thank you.

>> AUDIENCE MEMBER:  I'm Stewart Brown from the British government.  You're asking for examples of what can work.  The UK has just passed its Investigatory Powers Bill.  I know that some people in this room might not agree with all of the content of that bill, but there is a question of process.  And what's really fundamental is trust, trust between governments and their citizens.  And that trust, in our experience, can only be brought about by an inclusive dialogue.  So whether that be about the powers that are given to the police forces and to the intelligence services, or whether that be about the approaches that are taken to cybersecurity, this is a discussion that can't be had by government alone anymore.  It has to be an inclusive discussion that involves the public.  The bill that has just passed through our parliament has actually undergone more public scrutiny than pretty much any other bill passed in UK history, having been subject to three independent reviews and an intensive public debate.

At the end of the day, not everyone is necessarily going to be happy with the outcome, but only by going through an open process are you even going to be able to start to build that trust.  Thank you.

>> SHEETAL KUMAR:  Thank you very much for that, and making the point about the necessity for more inclusive processes.  Because as we've heard, over and over again, we can't necessarily control what a certain country will do within a certain jurisdiction, but we can build norms around how those policies come about.

But we do have to wrap up, so thank you very much to everyone for all of your comments and questions.  This is, obviously, not a discussion that is going to go away.  And so, I'd like to thank you for being here and for taking part.

I'd just like to ask the three remaining speakers to make closing statements of 30 seconds.  Sorry, I've cut that in half.  But I'd also ask them to reflect on this very important question:  yes, we can't necessarily control what will happen within a certain jurisdiction, and many times the outcomes of these debates will, as Brian said, be amplified by existing contexts and situations.

But what, in the world that we're in, can we call for in terms of global norms?  What can we say, beyond anything else, are the things that we can all stand for, that will inform policy‑making going forward?  If I can ask Tatiana first.  Thank you.

>> TATIANA TROPINA:  Thank you very much.  30 seconds.  There is no right or wrong.  There is no black or white.  Any access to data should be restricted and done with the proper safeguards and with a court order.

>> SHEETAL KUMAR:  Thank you.  Brian?

>> BRIAN BERGSTEIN:  I just want to say how much I've enjoyed being here and learning from everybody who has spoken.  Now I'm down to 20 seconds.  I would say:  keep in mind that the Apple/FBI case was really misleading.  It was not actually a good example of what's at stake here, and one reason is that, even with a fully encrypted smartphone like the iPhone, there are still many ways in which you're exposed, even if you use that phone, right.  You are still subject to malware, you're still vulnerable on open WiFi networks.  And it's a reminder, among many other vulnerabilities, of how, even with the supposed protection that got law enforcement all freaked out in place, there are many ways in which data about us is in the clear.  Law enforcement is not going dark, and yet the fight for encryption on the iPhone should just be a very small part of what we need to do.  We need to strengthen our institutions and all the other things that are part of that necessary, but not sufficient, equation in the UNESCO report.

>> SHEETAL KUMAR:  Thank you, Brian.

>> PRANESH PRAKASH:  Thank you.  I completely agree with Tatiana.  The more transparent the mechanisms, the safer and the better it is.  To quickly answer the question, who has got our back:  unfortunately, it's the most damned stakeholder, the one we have to put our faith in, and that is the government, because it's their responsibility to give us protection.

But unless they're transparent, unless they're legally bound, sort of, in a way that they can actually offer us protection, it's not really going to help.

>> SHEETAL KUMAR:  Great.  Thank you very much for all of your contributions.  I just ‑‑ to very briefly close, I'm going to give myself ‑‑ well, no, 30 seconds.  What we've learned over the past hour and from this rich discussion, and certainly what I've learned over the past year or so working in this field, and over the past few years at these IGFs, is that the Internet policy field is really complicated, and there are no easy answers.

But with this particular debate, it's so important, I think, for policy to be based, and I think we've heard this elsewhere, on fact and not fear, and to be respectful of human rights.  As long as we keep that in mind, hopefully we can make progress.

And following the whole, you could say, potentially misleading Apple/FBI standoff, there were a lot of calls from academics and journalists for further public debate on this issue, and at least that has come out of it in the short term.  I hope, in whatever small way possible, this debate at the IGF has helped to contribute to that as well.

Just before we all go, I'm not sure if there are still some copies left, but there are some travel guides, or introductions to cybersecurity policy for human rights defenders, which Global Partners Digital has produced, authored by Carly, formerly Legal Director at Privacy International.  I encourage you to take a copy.  It's also online.  Thank you very much to everyone for coming, and especially to our speakers.  Thank you.

(Applause).

(session completed at 11:48 a.m. CST)