IGF 2021 – Day 2 – WS #278 Networked trust: encryption choices to a reliable Internet

The following are the outputs of the captioning taken during an IGF virtual intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> André Ramiro: Hi, everybody. I would like to welcome everyone who is attending this session, Networked Trust: Encryption Choices to a Reliable Internet.

    Welcome to those who are in person at Katowice and, of course, for those who are watching us remotely.

    To help me with this mission, we'll count on Nathalia Sautchuk. She will be our on‑site moderator, straight from Katowice.

    And then, as Rapporteur, we'll count on Luiza Brandão, who is the Executive Director of the Institute for Research on Internet and Society, also from Brazil.

    Before passing the word to Nathalia so she can introduce our speakers and open our first round of interventions, just a few words about our panel. It was a proposal put together by several entities. Among them, the organization I just mentioned, the Brazil chapter of the Internet Society, the Center for Democracy and Technology, the Software Freedom Law Center from India, and the (?) Foundation.

    Very briefly, too, we considered that encryption policies -- or, more broadly speaking, encryption choices -- can be seen as a trust model that has to be deployed collectively towards a collective sense of trust in the digital ecosystem.

    So we believe that it can be seen as a possible approach that might help stakeholders collaborate with one another when it comes to a sense of trust in cybersecurity, with our special focus, of course, on encryption. And we count, of course, on our speakers to develop those issues.

    Finally, I want to encourage the public to participate with us by putting your comments and questions on the session in the Zoom platform and your impressions in the YouTube transmission of our workshop.

    Okay?

    So we will be gathering keywords from the speaker notes as well as from the public questions to give form to a cloud of words and expressions that we will, of course, be sharing with everyone later.

    So that's it from me for now. I pass the word to Nathalia.

    Nathalia, please be welcomed again.

   >> NATHALIA SAUTCHUK: Now we will start presenting our participants and panelists, first, of course.

    Thank you all for being here and online, of course, as well.

    So we have Patrick Breyer, Member of the European Parliament. He'll be the first to answer our questions in the following round.

    Pablo Bello.

    Lidia Stepinska‑Ustasiak -- the name is a little difficult -- a member of the Internet Governance Forum program committee.

    Jeremy Malcolm, the Executive Director of the Prostasia Foundation.

    Vittorio Bertola.

    Shall we start our session here? Basically, we have two rounds. Our discussion will be based on these two rounds. Each of our amazing panelists will have four minutes to address this discussion -- the specific questions that we passed in advance so they could think a little bit about them and put forward their perspective. Their stakeholder group's perspective.

    This important issue is encryption.

    The first question I want to put here -- and I will call on Patrick to be the first one to address it -- is about a safe digital space.

    How should governments, Internet businesses, and other stakeholders protect citizens, especially including vulnerable citizens? We know there are a lot of questions in this regard. How do we protect them against online abuse?

    Please, Patrick, go ahead with your thoughts about this question.

   >> Patrick Breyer: Good afternoon. Thank you so much for having me. I'm a member of the European Parliament, since 2019, and a member of the Civil Liberties Committee.

    In Europe, the encryption wars are full on.

    The European Parliament is fighting for mandatory encryption in the proposed ePrivacy Regulation. We fight for this because we know today that everything that is not encrypted is not safe from intelligence services.

    Since you mentioned the exploitation of persons: we know from Edward Snowden that pictures of naked persons, or even of genitals, are shared just for fun. These intimate recordings and pictures belong in nobody's hands other than those concerned.

    The European Parliament is now fighting to outlaw member states interferences with the right to encryption in the new Digital Services Act. Part of the Parliament's position will be to call for a provision that will guarantee users and services the right to offer encryption.

    Also, on the positive side, we've seen a government coalition agreement in Germany last week -- or two weeks ago -- which defends the right to encryption. Germany is, of course, an important player when it comes to policy.

    On the other hand, both the European Commission and the member state governments are attacking encryption. They're calling on providers to find technical solutions to allow them to access communications content and stored data, which basically means, you know, rendering encryption ineffective.

    Most of all, the European Commission, beginning next year, wants to present legislation, which I call chat control. It's legislation that would mandate communication service providers to screen all private messages for possible child pornographic content, fully knowing that these filters are up to 86% inaccurate, according to the Swiss federal police. A vast majority of these flags are not criminally relevant.
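
    To make the scale of that error rate concrete, here is a small, purely illustrative calculation. The 86% figure is the one cited above; the annual report volume is an invented assumption.

```python
# Hypothetical base-rate arithmetic for machine-flagged messages.
# The 86% "not criminally relevant" rate is the figure cited in the session;
# the total number of reports below is an assumption for illustration only.
reports = 100_000            # assumed machine-generated flags per year
irrelevant_rate = 0.86       # share of flags that are not criminally relevant

irrelevant = int(reports * irrelevant_rate)
print(f"{irrelevant:,} of {reports:,} flags would expose innocent users' "
      f"private messages to human review")   # -> 86,000 of 100,000
```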

    Knowing that only by chance will ongoing sexual abuse ever be detected in this way -- this is mostly about the circulation of old material. Also knowing that, according to the European Court of Justice, analyzing everybody's communications -- so mass surveillance -- is only acceptable in emergency cases when national security is threatened.

    Despite all that, they want to go ahead with this. They even want the screening obligation to be applied to encrypted services. So in order for WhatsApp, et cetera, to be able to scan and screen private messages, they would at least have to build in a backdoor, known as client‑side scanning, which was debated at length when it came to Apple's spy plans. Therefore, you're probably all aware of the risks it would entail. Specifically, it would open up the window to use this function also for other purposes and even for interception purposes.

    That's why I hope there will be a lot more noise about this, as with the Five Eyes plans I've spoken about. The plans are going to be announced on the 8th of March of next year. Until then, we have a chance to defend our right to encryption in Europe.

   >> NATHALIA SAUTCHUK: Thank you, Patrick. I have a lot of things I could ask now. But let's go straight to Pablo.

    Pablo, please go straight to your thoughts. You're a stakeholder in this field, yeah.

   >> Pablo Bello: Thank you for the invitation to this place. I'm very happy to be with you, of course.

    Okay. As Patrick said, encryption is under threat. It's under threat in Europe. It's under threat in Brazil, in Latin America, in different parts of the world. So this discussion is absolutely relevant and central for the evolution of the Internet, from my point of view.

    The question is how to defend encryption successfully. I think, in some way, it's one of our main challenges. To go into this debate, I think it makes sense to go back to the basics. It's something that everybody here knows better than me, for sure. I think there's a very important point we need to stress louder and say even louder: that we, the Internet community, should reject the false trade-off between safety and privacy. I think this point is critical in terms of how we're able to build a stronger narrative to explain the risks that are behind the weakening of encryption.

    We do not believe there's a trade-off between privacy and security. If we lose privacy by weakening encryption in the name of security, we are losing both. We are losing safety, and we are losing privacy.

    Of course, there is no such thing as a backdoor just for the good guys. This is something that does not exist. If there is a backdoor, hostile governments and hackers will find ways to exploit it. So, as many people have said, weakening encryption will put people at risk.

    This is something people have said many times, but I really believe we need to continue saying it in a bolder way, because this false narrative that is used by many governments worldwide appeals to the very basic questions that society has. So it's important to take care of the problems that are underneath the going‑dark narrative.

    So this idea that encryption is needed and that there is no trade-off is important, but it's not enough, from my point of view, in order to win this battle.

    I really believe that it's important, also, to have better responses to the challenge of going dark. This discussion is there. Society has concerns. So one thing is to reject this false dilemma between encryption and security. But the other thing is to take steps in the right direction in order to mitigate the threats that we know exist on the Internet.

    Societies have bad people doing bad things. That's something that we know. Child exploitation is not acceptable. Terrorism is not acceptable. Organized crime is not acceptable.

    Cyberattacks with the ability to affect our institutions or put people's health at risk are not acceptable.

    So I think, at the same time that we are very clear that backdoors are not the answer -- that doing that puts lives in danger -- we, as an Internet community, should do even more in order to tackle the threats that we know are there.

    From a WhatsApp perspective, we know that integrity measures to identify abnormal behaviors, like (?), the use of cross‑platform signal analysis, the design of the product to offer strong privacy features and settings for users while limiting virality through other measures, and offering users a channel to report dangerous content are the basis for combatting misuse on our platform. We believe that these efforts are needed to successfully defend encryption, and we need to do more on that side.

    Finally, it's easy, but it's not brave, to just say we won't cooperate with law enforcement in democratic states. It's not brave to refuse to tackle issues we know exist. We need to make our best effort to keep the Internet safe, or at least as safe as possible, because our society depends on that.

    So, just to summarize: I strongly believe it's important to strengthen our narrative, to be very clear that there is no trade-off between privacy and safety, and, at the same time, to do more -- from the technical point of view, from product design, from our integrity efforts -- to combat the misuse of the Internet on any of our platforms.

    And this is something that is feasible. It's tough, but it's feasible. I think we're late in finding the right approach on that.

    Nathalia, sorry.

   >> NATHALIA SAUTCHUK: Thank you, Pablo. Very nice points.

Now we come back here on site. I invite Lidia Stepinska‑Ustasiak to share her thoughts on that.

   >> Lidia Stepinska‑Ustasiak: Thank you.

    The interest in how technologies impact societies and individuals, I observe, continues to evolve. There's still a knowledge gap between policymakers and the public around encryption. And, of course, legal and technical aspects are very important, and standardization in the area of cryptography is important but also very challenging; but I think what is more strategically important is that users should make choices about their safety. They should be informed about the encryption methodology enforced.

    The question which appears is: Can the EU really create such a balance between security and privacy?

    And, probably, the answer is: Legally, no.

    Even if legislation were to be passed, data access policies and capabilities differ among member states. So the problems with encryption, for example in criminal investigations, vary from one member state to another. So the discussion and further chapters are still ahead of us.

    Thank you.

   >> NATHALIA SAUTCHUK: Thank you, Lidia.

    Now I invite Jeremy to comment and put, also, some thoughts on these questions that we are discussing.

   >> NATHALIA SAUTCHUK: Jeremy, I'm not hearing you.

   >> JEREMY MALCOLM: Thank you. I would like to talk about child sexual abuse. It's an intolerable and heinous crime. However, surveillance and censorship are firefighting approaches that have not weakened this crime.

    Weakening encryption would create a false sense of accomplishment and would do nothing about those who use safer communication channels.

    If weakening encryption is not a way to ensure online safety for children, what is? The answer is simple, but politicians don't want to hear it. We need to stop fighting fires and get ahead of the problem of child sexual abuse by investing more in abuse prevention. This means treating child sexual abuse not primarily as a criminal justice problem but as a public health problem. When it's viewed as a criminal justice problem, we wait for a child to be harmed before we take action.

    When we view it as a public health problem, we take action to prevent children from being abused in the first place by reducing risk factors and increasing protective factors.

    Why don't politicians want to buy into this approach? Because it's not popular. That's the only reason. In fact, prevention work is highly stigmatized because people falsely believe it's a more lenient approach. Some of you may have heard about a prevention researcher, Dr. Allyn Walker, who lost their job last month because of public outrage over an interview they did with my organization about their work on child sexual abuse prevention, regarding people attracted to children.

    An op‑ed was written for "The Washington Post" last week. It said that, if we really want to protect children, we should also encourage those most at risk of harming a child to come forward. Assisting those who have an unwanted attraction to children helps kids. It helps everyone. But politicians would rather just double down on censorship, surveillance, and mass incarceration, which the public believes are the only solutions.

    That's false.

    And members of the Internet Governance Forum need to help get that message out there if we want to help promote more long‑term solutions to this problem.

    At a RightsCon workshop that my organization held this year, the following message was reached:

    We encourage policymakers to adopt a comprehensive approach to combatting CSA that is guided by public health principles and human rights standards.

The only question is how policymakers respond to that. Will they finally turn their attention to underfunded abuse prevention, or will they continue to waste resources on a battle over encrypted communications that they've already lost?

    Thank you.

   >> NATHALIA SAUTCHUK: Thank you, Jeremy, for your comment and thoughts.

    I would like to invite people who are participating on site and online to make comments in our chat. We will make a word cloud from the comments after a while.

    Now I invite Susan to talk about this.

    Susan, are you with us?

    I'm not hearing.

    Okay.

    Okay. Since I'm not hearing Susan, Vittorio is here.

    Go ahead with your comments.

   >> Vittorio Bertola: Thank you. The problem is that we do not have a shared understanding of what safety actually means. So, I mean, people complain that encryption undermines safety, or at least that's what some say. The problem is that they generally have a different definition of safety than the people who defend encryption do.

    In the end, encryption is just a tool.

    There are cases in which encryption clearly increases the safety of end users' data and communications, like when you encrypt your end‑to‑end communications. I mean, we're an open‑source email company, and we spend time with our ISP customers convincing them to turn on encryption. Even now, there are those that turn it on only as an option, and there's a large amount of unencrypted data.

    There are situations where it increases safety.

    We have to acknowledge that there are good and bad things regarding encrypted flows.

    There's also disagreement on who is responsible for safety. Part of the disagreement comes from the technology providers; especially the big global platforms generally think it's their duty to provide this kind of safety, and they think they can achieve it by encrypting all the communications. It's understandable.

    At the same time, many states -- not just the authoritarian ones, but democratic countries like the European ones -- think controlling safety and what happens on the Internet is the purview of the state and not of the Big Tech providers.

    It's their duty, not just their right but their duty, to prevent bad things from happening.

    So this is also not easily addressed. It may be better to address these kinds of conflicting views before addressing the details of when we do encryption and what safeguards, if any, there are for access to data in certain cases.

    Also, there's a business discussion. Encryption is now being used by the Big Tech platforms to transmit data so that not just the states but even the users have no access. If I buy a $10 IoT device and put it in my home, I have no control over which data it is collecting and transmitting about me.

    Maybe there are no easy answers to these questions. It's hard to say "let's encrypt everything" or "let's never encrypt anything." I think we have to find middle grounds and solutions, and those should not involve breaking encryption; backdoors and these kinds of special access are technical solutions that are, in my opinion, too dangerous. So we should not be aiming for that.

    But we do need, as a technical community, to have an answer for the people concerned about certain types of content that are highly dangerous, and not just child sexual abuse content.

    My conclusion would be that we need to discuss, and possibly get into details over, what we want to allow and what we don't want to allow, but try to find middle grounds between the two positions, because the debate, otherwise, has been going on for quite a long time without any real solution or any real advances.

   >> NATHALIA SAUTCHUK: Thank you.

    Now, Mallory, please go ahead with your thoughts.

   >> Mallory Knodel: I'm the chief technology officer for the Center for Democracy and Technology. I work as part of CDT. I'm at the Steering Committee level of the Global Encryption Coalition, which is mostly Civil Society organizations but also small businesses and tech organizations that are working to follow encryption policy around the globe and to push back against any backdooring of encryption.

    My comments are on your question about safety. So how does end‑to‑end encryption -- or encryption, more broadly -- contribute to safety for everyone? I think others have mentioned that, for everyone, it's important to think about the most vulnerable of us as well, which is typically how approaches to security, privacy, and safety online should work: from the bottom up.

    I also want to maybe focus my comments on what could be useful within the remit of the Internet Governance Forum, since that's why we're here today. So what can all stakeholders think about as they engage in these discussions going forward, and how has this conversation maybe changed over time? As many of us realize here -- and for those that are new to this conversation -- it has been going on for a very long time. There's always this debate about what can we do about lawful access and what do we do about the questions that Vittorio brought up.

    I will give you a preview of my inputs. We really need to establish security as infrastructure, or some very clear boundary beyond which we will not erode encryption further, in order to achieve the aims of this panel, which are both safety for everyone, including the most vulnerable, as well as trust in our online communications.

    Let's do a quick reframe. Backdoors are feature requests. They're an additional request, an idea that has been presented to services that offer end‑to‑end encryption, on top of their main aim, which is to make secure communications both authenticated and confidential. Those requests -- I've worked with ISOC and other organizations analyzing technical proposals for how to do that -- always introduce a vulnerability, which is not a safety measure. That is, in fact, creating a lack of safety.

    So avoiding, then, these vulnerabilities in end‑to‑end encryption and encrypted services is something that we have to keep in mind. This is where I think states often play it both ways. On the one hand, they really need strong encryption themselves, but the proposals that create backdoors erode it and increase the hackability of these services -- not just for law enforcement that have gained a warrant to look into communications, but broadly. Once you introduce a vulnerability in one way, it can potentially be exploited by others who have not gone through a court system to obtain such access.

    Another risk in terms of vulnerabilities is that they give users a false sense of security, which is very dangerous. For lay folks -- it was mentioned before -- what matters is the ability to have a clear threat model of the risks involved. If services have backdoor vulnerabilities introduced, then most of us -- most of our neighbors, family, friends -- are not going to be able to understand that threat: the fact that it isn't just lawful access but hackability that will put them at risk.

    This will also, at the same time, drive sufficiently motivated criminals and others away from services that have been weakened to other places where they can use encryption.

    As Vittorio pointed out, there's not just one tool. There are many. Encryption is a standardized protocol; anyone can implement it. Some implementations will remain strong. Others -- probably the biggest, most accessible applications out there -- will be requested to have these vulnerabilities introduced.

    I think, as well, I just want to say that in order to introduce these features, or weaken security as I've described, states -- and those states that are thinking about introducing these in legislative processes -- have not really grappled with the other human rights considerations: the human right to privacy, the human right to information, where weakening encryption can interfere with (?).

    In a confidential conversation, you can say what you want to whom you want.

    Then there are social, economic, and cultural rights. Those are not being, I think, sufficiently considered and balanced when proposing these changes through legislation. I think there should be a high bar to undermine these rights. Yet we are already being asked to develop these features without the parliamentary discussion making the case or trying to clear that high bar.

    Patrick Breyer mentioned these tools are flawed. The feature requests being asked for are not even promised to do what they say they will do. We really ought to be thinking about the other end of things. So, getting to my final point: how can we, instead, establish a baseline or a boundary beyond which we won't go, so that encryption -- end‑to‑end encryption -- always remains strong, and we can build up from there, rather than this highly subjective notion of what courts should be able to do and what they shouldn't? Just drawing a line and, upon that, building trust and doing some of the more socially focused work that others have pointed to.

    So thanks very much.

   >> NATHALIA SAUTCHUK: Thanks.

    Now we go to the last introduction in this first round. I would now like to invite Prasanth Sugathan to give your thoughts on this topic.

   >> Prasanth Sugathan: Hello, everyone. Good morning, good afternoon, depending on which part of the world you're joining from. Great to have you all here and to be part of the discussion on this very important topic.

    As far as the safety of the Internet goes: in India, currently, we have a committee appointed by the highest court of the country, the Supreme Court, looking at the issue of surveillance and the use of the spyware sold by the NSO Group. Journalists, human rights activists, politicians -- they've all been affected by this software, which has been found on many mobile devices.

    The very fact that bad actors -- unfortunately, sometimes these are (?) -- could get into these devices and into any possible communications we have underlines the importance of encryption.

    This discussion on encryption -- we have the Parliament talking about it. The focus seems to be on misplaced issues of national security, child sexual abuse material, et cetera.

    Encryption is said to be the problem, and the (?) is the solution. That is an issue. India, along with Japan and the Five Eyes alliance countries, is a signatory to a statement where they were really asking for backdoors into these encrypted platforms. That is definitely a problem: if we can't even get an informed debate from the policymakers on these issues, it's going to affect all of us.

    Those of us that are part of Civil Society, we're not doing a great job of informing people. We have come a long way from when encryption was used only by the military. Thanks to platforms like WhatsApp and Signal, it's very easy for anyone to use.

    Unfortunately, the focus is so much now on the technologies used by these platforms, often with the phrase "going dark."

    There are a lot of other uses of encryption. It is very important for business. It is very important for citizens to conduct their day‑to‑day lives.

    A parliamentary panel in India recommended banning VPN services, saying that such services often create issues -- that crimes were committed over the dark web without law enforcement agencies (?).

    In this pandemic phase, if we could get on with our daily lives, it was thanks to VPNs and such communication services. The very fact that policymakers would think of coming up with a solution (?) means we need to have people understand the importance.

    When we were having a panel discussion with parliamentarians, it was said that they're not interested in the whole issue unless the electorate, the people who elected them, are concerned about it.

    Unless they think it's an important issue that's going to affect their people, their electorate, this is where (?). Similar laws are debated in most parts of the world.

    We currently have a new set of rules for encrypted platforms like WhatsApp and Signal. The new rules mandate that if law enforcement asks these platforms to figure out who initiated a message, then they should be able to tell the law enforcement agency who originated that message.

    So when these platforms store metadata, that's going to affect the privacy and security of users. I think it's come to a stage where we need to say that the right to privacy also includes the right to encryption. I think that is something that all of us should demand from our governments.

    I will stop here.

    Thank you for having me.

   >> André Ramiro: Okay. Very exciting thoughts. I want us to save some time at the end of our session for the public to participate with us as well, so I will move straight to our second round of questions.

    I believe we have excellent hooks in terms of the technical requirements that have been mentioned in some remarks. So how should international standards address the different requirements and preferences of governments and citizens in different countries?

    I mean, we have some precedents for end‑to‑end encryption in the United States. We have client‑side scanning, which has been very much mentioned here already.

    On the other hand, we have, in Asia and in Brazil, provisions for tracing messages on end‑to‑end encrypted platforms, in the landscape of cybersecurity and, at the end of the day, encryption.

    Different standards worldwide. So how do we address those challenges in terms of, let's say, a type of governance, a global governance of encryption?

    For this second round of answers, I would suggest we begin with whoever spoke last, if there is no problem.

    So, if you're comfortable, over to you.

    I was suggesting that we begin with whoever spoke last. So, repeating the question: How should international standards address the preferences of citizens and countries worldwide?

   >> Prasanth Sugathan: Standards are a concern. Looking at the kind of laws that governments are talking about, we have similar situations in most countries. When it comes to standards, the problem is that technology cannot be a solution to everything. So that is where, yes, standards can help us with technology advancements, but then you have laws that undermine this technology, and that's part of it. There needs to be more discussion on the political side of things.

   >> André Ramiro: Okay. Mr. Prasanth, thank you.

    I pass the word to Mallory again.

   >> Mallory Knodel: I think, when we think about trust, trust is a concept between people, although we do try to approximate trust through technical standards. If we're doing that, we want to be really specific. When it comes to encryption, there are two main elements to that. The first is authentication: knowing who you're talking to or what server you're talking to. And then confidentiality: knowing you're having a private conversation, and nobody who isn't authorized to be in it is having it with you.
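
    As a minimal sketch of those two properties, consider authenticated encryption with associated data (AEAD). This is only an illustration using the Python cryptography library's AES‑GCM primitive, not the protocol of any particular messenger; real end‑to‑end systems layer key agreement and ratcheting on top of a symmetric core like this.

```python
# Confidentiality + authentication in one primitive (AES-GCM).
import os
from cryptography.exceptions import InvalidTag
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # shared secret between the parties
aead = AESGCM(key)

nonce = os.urandom(12)                      # must be unique per message
ciphertext = aead.encrypt(nonce, b"meet at noon", b"sender-id")

# Decryption succeeds only if ciphertext and associated data are untouched:
assert aead.decrypt(nonce, ciphertext, b"sender-id") == b"meet at noon"

# Any tampering breaks authentication and the message is rejected:
tampered = bytes([ciphertext[0] ^ 1]) + ciphertext[1:]
try:
    aead.decrypt(nonce, tampered, b"sender-id")
except InvalidTag:
    print("tampering detected; message rejected")
```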

    Services, from a technical perspective, are implementing these standards to fulfill user demand for privacy‑respecting, trusted, and confidential communications.

    We also see, at the same time, though, policies that are intentionally trying to erode it, like others have mentioned. They're intentionally trying to degrade the provision of those services.

    I will just borrow the language. It was the Five Eyes countries, and Japan and India, that wrote the statement that included words to the effect that companies should not deliberately design their systems to preclude any form of access to content. In other words, companies should not deliberately build the secure services that users want. I think it's really disturbing that that would be mentioned. It's already hard. Right? I think the major platforms offering end‑to‑end encryption, for example, are really trying. There is an effort in the Internet Engineering Task Force to standardize the Messaging Layer Security protocol that could be applied to text messaging and so on.

    It's been a difficult effort because the challenges are high. It's group chats. That's not between two people. It's between many people.

    So reducing spam, tackling disinformation, making sure objectionable content is not available -- these are things platforms are invested in doing. Instead, they're having to spend time fighting for the ability to just provide the service at all.

    I think user choice means that we need to think about ways to do content moderation in end‑to‑end systems that center user needs and consider user choice. CDT put out a paper earlier this year that tries to do that. It looks at the different proposals for how to do content moderation in end‑to‑end systems. It comes down in favor of a couple that may work.

    Pablo Bello mentioned metadata analysis. We feel that's promising, as long as platforms don't create more metadata than they already need to deliver a message from one place to another. There's already so much you can do by looking at coordinated behavior on platforms or other signals to make sure that spam and other things are taken care of.
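
    A hypothetical sketch of that kind of metadata‑only signal follows: flagging abnormal sending behavior without reading any message content. The event fields and thresholds are invented for illustration; real systems combine many more signals.

```python
# Illustrative metadata-only abuse heuristic: no message bodies involved.
from collections import Counter
from dataclasses import dataclass

@dataclass
class MessageEvent:          # metadata only; the content stays encrypted
    sender: str
    recipient: str
    is_forward: bool

def flag_bulk_senders(events: list[MessageEvent],
                      max_forwards: int = 100,
                      max_recipients: int = 50) -> set[str]:
    """Return senders whose forwarding volume or fan-out looks abnormal."""
    forwards = Counter(e.sender for e in events if e.is_forward)
    fanout: dict[str, set[str]] = {}
    for e in events:
        fanout.setdefault(e.sender, set()).add(e.recipient)
    heavy = {s for s, n in forwards.items() if n > max_forwards}
    wide = {s for s, recips in fanout.items() if len(recips) > max_recipients}
    return heavy | wide
```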

    Other things might be more around user reporting and how to do that on encrypted platforms in a way that enhances the user experience but does not violate the promise of confidentiality.

    And on the main question around globalization, or a globalized approach: I think it would be great if we had a proliferation of strong, end‑to‑end encrypted options. Right now, they're centered in the most popular services, which is great because it reaches a variety of people, but I think we do want choice. This goes back to my original point: if we could establish strong standards and strong implementations that center user choice and ensure the authentication and confidentiality that build user trust, and proliferate those, then that is essentially what we need, and that's what I would call security as infrastructure.

    Thanks.

    

   >> André Ramiro: Thanks again, Mallory.

    To Vittorio for more on this.

   >> Vittorio Bertola: It reflects a difference in national values and cultural approaches. So, again, it's not really just about encryption. For example, the maximum extent to which law enforcement should go, and the procedures to get there -- I mean, in contrast with the privacy of communications -- really vary from country to country, according to the political history of each country. Same with political control.

    There are countries like the U.S. that have a broad First Amendment that protects free speech. In Europe, others don't want to see certain types of content on the Internet because they think it endangers their democracies -- be it propaganda, child sexual abuse, et cetera. There's a long list of content considered illegal in those countries, and when it's made available, those countries get upset, and citizens in those countries get upset.

    They want to set rules themselves rather than accept the American approach, and I think this is also true for the trust issue. Building on what Mallory was saying, it's not like encryption automatically gives you more trust. It depends. As we were saying before, I'm a bit worried by the message that encrypted communications automatically give you more safety, privacy, security, and, in general, more (?). If I know no one is intercepting my communications, I can trust my communications more than before. At the same time, for an unsuspecting user -- if they're checking the (?) -- then I may find myself on a dangerous website. I've seen people, including my own family, really scared. All of a sudden, their phones started to tell them to download an app. And there were no safeguards because the filters had been bypassed.

    We should be very careful about the messages we send, even with end‑to‑end encryption. There are companies that use it heavily to promote their products. If I send my communications in a secure, private, encrypted way to a company that does user profiling, then I get less privacy than before.

    So this is also something that needs to be communicated to users.

    The fact that the app still has access to the data is important.

    Even if the data is accessed in Europe, in the end, the U.S. intelligence agencies have ways to get access to my information (?).

    I think we're simplifying the discussion too much. We should not start by discussing encryption. We should start with trust and sovereignty and these kinds of issues in general. From that, a policy on encryption will follow.

   >> Susan: Thank you. I had technical difficulties getting on originally.

    I want to talk about international digital safety. When we think about that, the question is framed by the automatic news headlines.

The real issue is that we have national security, business security, economic security, public safety, and privacy, and end‑to‑end encryption provides all of those pieces, as well as making it harder to do investigations.

    Putting them in opposition -- privacy versus security -- is a real mistake in thinking about the problem to begin with. It's one of the reasons, for example, we saw the change in both the U.S. and the EU back in 2000 on export controls on encryption. At least the U.S. saw it as advantageous for various reasons. They knew it would make their job harder, though not that much harder.

    Other nations were already encrypting, not just the more technologically advanced nations. What it did do was make law enforcement's job harder in the United States, and it's been harder throughout the world.

    When we think about the issues of international safety, as I suspect you all said when I wasn't here, every nation has its own definition. China has one. Russia has another. Iran has a third. The U.S. and France differ, and so on.

    About the only thing that nations agree on is that CSAM, child sexual abuse material is terrible. It shouldn't be encrypted.

    Even as we say that, some countries in the world are large purveyors of CSAM and allow that terrible crime to happen.

    So encryption sits on both sides of the issue, and we get to the point where each nation is really going to make its own decision. It makes things complicated for the technologists. It means we have multiple standards, but I don't see a way around it. You're not ever going to get to the point where the U.S. and Iran and France and Russia and China are going to come to agreement on end‑to‑end encryption.

    As for the issue of a collective sense of trust, goodness, it's not encryption that is the problem. It's our protocols. Why don't we have DNSSEC available everywhere? That's where the problems lie. Mallory is absolutely right that we need encryption -- ubiquitous, authenticated end‑to‑end encryption -- which is absolutely critical.
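
    As an illustrative aside, here is one way to check whether a resolver reports DNSSEC validation for a name, using the dnspython library. Note that the AD flag only reflects what the upstream recursive resolver claims, so the path to that resolver must itself be trusted.

```python
# Query a name and check the AD ("authenticated data") flag set by a
# DNSSEC-validating recursive resolver.
import dns.flags
import dns.resolver

resolver = dns.resolver.Resolver()
resolver.use_edns(0, dns.flags.DO, 1232)    # request DNSSEC records (DO bit)

answer = resolver.resolve("ietf.org", "A")
validated = bool(answer.response.flags & dns.flags.AD)
print("DNSSEC validated" if validated else "not validated",
      [r.address for r in answer])
```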

    I heard comments that we could always use the metadata. Yes, law enforcement can use it. In fact, the tech companies are also using the metadata. And what the tech companies do with the metadata is not just to deliver your packets more efficiently, to deliver what you want; it's also to figure out things about you and to probe into your privacy in rather extreme ways.

    So I don't think the answer is that the metadata is always there, so let's not worry about it. I think we need to start worrying about the metadata more. I will point to the Supreme Court decision in the United States that said that seven days of location data collection has such breadth and comprehensiveness that it needs a warrant.

    I will stop here.

   >> André Ramiro: Thank you, Susan. With this, I will pass the word again to Jeremy.

    Please, Jeremy. The floor is yours.

   >> Jeremy Malcolm: Thank you. On how global policy can be developed, I'm going to give one example of an attempt to develop a global private sector approach to detect child grooming in chat communications.

So, in November 2018, a meeting that I attended, called Preventing Child Online Grooming: Working Together for Maximum Impact, was organized by the UK Home Office and tech companies to launch the development of an AI tool that industry could share to detect text‑based child grooming.

    Now, scanning text conversations for sexual meanings is notoriously inaccurate and systematically biased.

    Allowing such scanning to take place violates the trust of the very groups we want to protect. Without trust in the security of systems, those conversations cannot happen.

    Some talk about their abuse online and reenact it with consensual partners.

    When survivors don't feel safe, they don't come forward at all.

    When that tool had been rolled out by platforms, the European Parliament determined it was illegal because it attempted to determine the meaning of private communications. Allowing the private communications of people who are under suspicion of no crime to be scanned by experimental, inaccurate AI robots and potentially flagged to police as evidence of child abuse violates the law.

    This attempt at a global private sector response to CSAM in private communications fell flat. This is just one example of how well‑meaning attempts to make conversations safer can actually violate human rights and reduce trust in the privacy of online communications.

    Anything that opens access to governments and maldoers alike is counterproductive and hurts people.

    Once again, returning to my point from my earlier intervention: the solution is not more censorship and surveillance, which undermine trust, but, rather, more stigma‑free support that can prevent problems from arising in the first place.

    When people have confidence, they're more likely to reach out for help when they need it, knowing that what they have to say will be kept in confidence. That's where I believe we should be focusing our attention, rather than trying to stuff the genie back into the bottle.

    Thank you.

   >> André Ramiro: Thank you. I will pass straight to Lidia. The floor is yours.

   >> Lidia Stepinska‑Ustasiak: Thank you. Encryption is perceived in two potentially conflicting ways. On the one hand, it's a tool for privacy and security, and, therefore, in this context, it is an essential component of Europe's open societies and digital markets. But the second way of thinking sees encryption as a tool for criminal activity and therefore an obstacle to law enforcement.

    For example, after the terror attacks in 2016, the EU responded a year later with a series of provisional but non‑legislative measures.

    Another context for this debate is the rapid rise of child sexual abuse online. It gained new importance during the pandemic, when children had to spend a lot of time unsupervised at home.

    In 2020, the European Commission launched two very important strategies in this context. One for combatting child sexual abuse specifically, and another for updating the security union strategy more broadly.

    With this strategy, the EU is trying to find a compromise related to security and safety, and it declares that it promotes an approach which both maintains the effectiveness of encryption in protecting privacy and the security of communications while providing an effective response to crime and terrorism.

    But when we talk about standards, the discussion on technical aspects definitely should be followed by joint efforts of governments and the private sector to build a common understanding of safety and trust, and it will also be important as part of capacity‑building strategies at a national or European level.

Thank you.

   >> André Ramiro: Thank you, once again, Lidia.

    And I pass it to Pablo, who represents WhatsApp, which is, of course, at the center of those international challenges.

   >> Pablo Bello: Thank you. I would follow on from the previous speaker. In order to advance these talks on encryption, there is a relevant challenge for us: how to include regular users in these conversations. I think this is a relevant challenge. We know, of course, that for human rights activists, for journalists, encryption is key. It's very, very important. But not just for them, not just for people who are obviously in danger. Of course, they need encryption, and they have every reason to protect their communications.

    But for everyone, encryption, of course, helps to keep personal information secure and protects financial assets and property data. In the pandemic, we saw a lot of cases of health services -- people relying on WhatsApp, for example, to get information and to have assistance from a medical doctor.

    For the economy, encryption right now is part of the infrastructure -- the critical infrastructure -- of how the economy and society work. I think it is very important to stress this and to bring regular users into these conversations.

    WhatsApp has more than 2 billion users worldwide. It's huge. It's, by far, the most relevant communication platform in the world. Nonetheless, I would bet that not all of our users are aware of the importance of encryption. For sure, only the smallest share of our users are aware of that.

    We're doing a lot of campaigns to explain the importance of encryption, and we will continue to do that. We've been a vocal defender of encryption in the media and forums.

    As you know, WhatsApp does not hesitate to defend encryption, even in the courts, as we're doing in India and in Brazil. We will continue fighting for encryption. The NSO case in the United States is part of the same approach. But I believe that this is not enough in order to change the narrative and to win, in the end, these battles.

    Encryption is key in almost every aspect of our lives. Our challenge here is still to translate better the technicalities of encryption for everyone with concrete examples related to the ordinary life of regular people.

    I think our collective sense of trust is not only built collectively but also protected collectively. Weakening encryption will undermine users' safety and will endanger the lives of activists, of course, but it will also put at risk the integrity of government services, financial services, communication, and access to health in regular life.

    In the end, I really believe that the strongest defense of encryption should not come from us, the Internet forums, the Internet community, but from everyone, everyone in the world: users of encrypted services recognizing its importance, citizens speaking out loudly to protect their very own rights.

    I think we need to ‑‑ the technical debate, the debate in the IGF is relevant for the resolution of this discussion, but we need to bring our users to be part of this discussion.

   >> André Ramiro: Thank you so much, Pablo. Very important remarks.

    I will pass the word, finally, to Mr. Patrick once again to give our closure to the second round of answers. Please, Mr. Patrick.

   >> Patrick Breyer: Thank you. It's not easy to add to what the previous speakers have already said. I think, when it comes to international standards, the basic standard should be fundamental rights, because these are rights that apply to everyone universally, and everybody has a right to security and privacy and safe communications. We've seen court judgments in Europe that outlawed mass surveillance or that outlawed data retention, meaning the indiscriminate collection of metadata on all citizens.

    If that were adopted more generally at an international level, this would be a very good basis for this debate on encryption as well.

    Since we're dealing in Europe mostly at this time with proposals to establish mass surveillance and undermine encryption by using the hook of child pornography, this is what I've been mostly looking into when it comes to trust. And I have found many children use private communications for sexting -- sending nude photos or recordings of each other -- and they have a very strong wish for their communications to be private. So there's an overwhelming majority in that their (?)

    I was told by a victim that they discussed with a journalist whether their case should be reported. Those conversations really need to be private and go undetected, as well as when they discuss these issues with their lawyers. What we really need to do in this debate is find the emotional cases and the witnesses who can explain how crucial private and safe communications are to their lives -- how the safety of children and adults depends on it, including democracy activists in many countries.

    So we need to be just as emotional as the purported children's rights organizations are, and they are very emotional. We just need to find those cases. We need to find those people. I think in the case of the Pegasus scandal, there have been so many examples of people whose privacy was invaded. That's really the way to go: find those people and let them give witness to why it is important.

    I fully agree with what was said: this framing of privacy in opposition to security is completely the wrong starting point. We really need to point out that those need to go hand in hand.

    Thank you.

   >> NATHALIA SAUTCHUK: Thank you. That finishes the panelist interventions.

    We have some interventions from the floor now. Please keep it to two minutes.

   >> I'll try to be fast. Alexander (?) of University (?). I'm old enough to remember the discussion on encryption in the United States in the previous century -- the export restrictions on PGP.

    I just want to remind people that such things already happened in the United States. We just have to remember the arguments there. Also, a lot of the arguments are about child abuse. I'm the father of two children who are just becoming teenagers. I am very concerned. But I'm also very concerned about our freedom and about abuse by law enforcement agencies.

    By the way, on this panel, there were no law enforcement agency representatives. We're talking about child abuse without any statistics or witnesses.

    I also remember that, since the beginning of this century, prosecutions for child abuse and child pornography happened in Russia, and a representative of the law enforcement agencies -- I think it was the (?) police -- said that everyone who installs Linux is a child abuser. The same things happen now on other levels: anyone who uses encryption is also said to be hiding their intention to abuse children. It's not true. But law enforcement agency representatives are (?). I don't remember one coming and saying: You're right, the database is wrong; I won't use it and Google to find criminals. No. That would not happen.

    In answer to the second question: it's not a policy thing. It's not a technology thing. The IETF is working on standards very well. But if you're raising these questions, you should bring a law enforcement representative -- and not just bring a law enforcement representative. Bring statistics. How much abuse has happened and not been discovered because of the inability to decrypt?

    I think that they are not on this panel because they have no such statistics. They're killing our presumption of innocence with their laziness to work.

    It's not a question about standards. It's a question about law enforcement agencies.

    Thank you very much.

   >> NATHALIA SAUTCHUK: Okay. Thank you for the intervention. We have very few minutes to make a little wrap‑up of these panels. We had some questions also from our online participants. For example, we had one here from Monica, who asked: Do states believe that "nobody but us" can work, or do they just pretend that it will work?

    So this is one of the questions. I will not be able to read all, address all, because we have only seven minutes, and I want to give the chance to our panelists to make a final statement, like one, two minutes, yes, to fit in our time here.

    First, Patrick, do you want to try to address very quickly, briefly, the things?

   >> Patrick Breyer: Sorry. Could you repeat the question?

   >> NATHALIA SAUTCHUK: There's this question from Monica about the government side of things.

    "Nobody but us" -- does it work, or do they just pretend that it works? Or just give some final remarks that you have, also.

   >> Patrick Breyer: Okay. I just see Monica's question.

    So I can't answer the question directly, but what I will say generally is that it is so valuable to have these panels, and they should have a much broader audience, because this is a debate that concerns and affects all of us. The implications are difficult to understand for many, who just see a solution at hand: Oh, yes, great. This will help us tackle terrorism, save children.

    They don't understand how effective it really is and what collateral damage it comes with. So we really need to push back stronger. I've been heartened by what happened after Apple proposed its plans, and by the reaction to that.

    Also, earlier this week, the EU Commissioner for Home Affairs was pushing for these chat control plans, and it was said that, at the moment, privacy is louder. So this is some encouragement to us that we can make a difference if we speak up and defend those rights.

    As we know by experience, once they are gone, they are gone for good. We won't get them back. So there's only one chance to defend these rights, and we should use it.

   >> NATHALIA SAUTCHUK: Thank you. This debate is so amazing. We all want to keep talking.

    Pablo, now, please, final remarks?

   >> Pablo Bello: I think the threats come from different places in different parts of the world. In Brazil, we know that misinformation and the coordinated attacks against democratic institutions are the main source of what puts encryption in jeopardy. It's different from place to place, and it's relevant to consider that in our conversation.

    Just to finish: I think, from my point of view, there's no trade‑off between security and privacy. We need to get both. Encryption is the tool for that. We need to confront the reasons behind the going‑dark narrative. We need to create better responses, because society, in the end, wants these responses. These responses are not backdoors, of course; we need to find others.

    My final point is that I really believe we need to expand this conversation to bring in regular users -- not just activists, not just those in danger, but regular people -- and expand the narrative to the economic reasons and the health‑related issues, to win, in the end, the common cause with society.

    Thank you so much.

   >> NATHALIA SAUTCHUK: Thank you. Very good remarks.

    Now, Lidia, please.

   >> Lidia Stepinska‑Ustasiak: Encryption in the context of law, in the societal context, and in the technical context as well -- everything was discussed perfectly.

    I would like to add just one thing, which is on the margin, but I think it is important. I believe that, as a society, we need good and responsible education. There is a role for journalists, who can educate individuals on the consequences of the development of technology -- encryption, safety, et cetera -- because we need such knowledge, as individuals, to ask relevant questions of governments, to expect reliable services from the private sector, and to benefit from digitization.

    Thank you.

   >> NATHALIA SAUTCHUK: Jeremy, your final remarks, just quick.

    You are muted.

   >> Jeremy Malcolm: There's no trade‑off between (?) and privacy? I would say there is a trade‑off. The only way we can get absolute security and have no privacy is in jail, and we don't want to live in a society like that.

    So there has to be some trade‑off, in a sense, between security and privacy if we want to live in a free society. If authorities know of particular users whom they have probable cause to target, then they can do that, but that does not translate into weakening encryption across the board, which reduces trust in communication systems and is illegal.

   >> NATHALIA SAUTCHUK: Nice.

    Susan, final remarks from your point of view, comments?

   >> Susan LANDAU: I'm going to reference a report I participated in a couple of years ago with the Carnegie Endowment, called "Moving the Encryption Policy Conversation Forward."

      (Coughing)

   >> Susan LANDAU: Sorry. I had a frog in my throat.

    We threw away the strawmen. One is that there's no way to ever allow access to communications.

      (Coughing)

   >> Susan LANDAU: Sorry. The other is that law enforcement cannot do their jobs if they cannot access communications. We came up with the principle that there needs to be utility in any solution, and that any solution cannot be repurposable for mass surveillance and so on.

    We also went down a decision tree, looking only at the United States, about which issues seemed to be most pressing. It was not foreign intelligence; it was law enforcement. It was not end‑to‑end encryption; it was devices. The solution was not escrowing keys; the solution might be accessing the data on the phone. It was not using updates to make the phone insecure, and so on.

    There are two things that are striking about this report. There are many things, actually. I would urge you to read it. The two things I want to draw to your attention, first, that we said, If you can't solve the unlocking‑the‑device problem, then you can't solve anything.

    You shouldn't pass laws or policy until you actually have a technical solution and can see if it works at scale.

    The other thing is, I urge you to look at the authors of this report, which include the now Director of National Intelligence and the now second in command at the Department of Justice among the people who participated. So I think it's a very powerful report, both for what it says and for who signed onto it.

    Thank you.

   >> NATHALIA SAUTCHUK: Thank you.

    As our colleagues see here, our time is up, but I imagine our colleagues will allow us just a few minutes to conclude here.

    Vittorio, your thoughts?

   >> Vittorio: I will be quick. We have come down to discussing child sexual abuse versus law enforcement. Yes, it is a problem. It is a serious problem, but it's just one small part of the problem, and we should really focus on all the aspects of the discussion, which are really about sovereignty and power and control, in a certain way.

    They're really about different user groups. As a hacker, an engineer, as I am, I have no problem securing (?) myself, and possibly I do need that and don't want anyone messing with my communications. But (?) a non‑technical user may really want someone to check their communications, with their consent.

    So, in the end, I think, with law enforcement, we should have a discussion and understand the long‑term strategic implications of the changes in power and control this would bring between the companies and the governments. That's really the long‑term aspect that worries me more.

   >> NATHALIA SAUTCHUK: Go ahead, Mallory.

   >> Mallory Knodel: Sure. Briefly, I will go back to what others have said and something I've myself been saying for quite a while.

    You know, governments have an opportunity. There are several UN high‑level discussions about cybersecurity. It would be really great, I think, if we could all work on establishing these sorts of boundaries and lift up a people‑centered approach to cybersecurity, which would require, I think, a commitment not to try to backdoor end‑to‑end encryption and to make sure it's strong and ubiquitous. Many things can be done on a platform like that.

    Like I said, there are many opportunities for states to commit to doing that. I would expect the democratic governments to be the first in line. Many say they appreciate strong encryption. They support it. They wouldn't want to break it, but then they often come up against these issues over and over again.

    So I'm looking forward to having this conversation in another 20 years, if we have to. I will be around. But it would be nice to also start moving it forward and getting more specific without this constant threat of the idea that the biggest, most ubiquitous user‑centric services will have to offer poor security. Let's not do that.

   >> NATHALIA SAUTCHUK: Thank you.

    We now have a last intervention, and after this, our Rapporteur will share some of our word cloud to see more or less what came up here during our discussion.

   >> Prasanth Sugathan: These rights are non‑negotiable. Any exceptions to them should be very narrow and come with proper safeguards. We cannot have a situation where the rights of citizens are affected. You can have surveillance, but there should not be a situation where (?).

    Those are my remarks.

      (Captioner will disconnect in 60 seconds)

   >> NATHALIA SAUTCHUK: Luiza, go ahead.

   >> LUIZA BRANDÃO: Thank you so much to all of our speakers. As Rapporteur, I collected the impressions from our speakers and also from some comments in the chat. And what we can see, of course, is that encryption is the main topic discussed, but one thing that is very interesting is that security and privacy are at the same level. You can see they are discussed equally, and that shows how much we must follow this path: they are not opposites, they are complementary. And we must evolve this discussion about communication -- it's about human rights, how to protect society against violence and censorship, and how to establish standards such as end‑to‑end encryption, which appeared a lot, to make our human rights enforced around the world.

    So thank you, very much, all of you who contributed to this tiny demonstration of the discussion.

    As Mallory pointed out, we will be around for the next years, and for as many years as needed, to build a trusted and reliable network for everyone.

    Thank you a lot.