2015 11 10 WS114 Implementing Core Principles in the Digital Age Workshop Room 10 FINISHED

The following are the outputs of the real-time captioning taken during the Tenth Annual Meeting of the Internet Governance Forum (IGF) in João Pessoa, Brazil, from 10 to 13 November 2015. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record.

 

***

 

>> MODERATOR:  Ladies and gentlemen, can I invite those who sit in the back to come maybe to the table?  Because it's a lot easier to discuss if people are close and not just the audience in the back.  There's enough space here.  Please fill it up.

 

We are facing new challenges in Brazil regarding the protection of civil rights and privacy.  Brazil and Germany have been working together for many years.  We have a common agenda.  We got an excellent outcome at the United Nations in the discussions about privacy, and CGI.br has one very clear set of principles, core principles, to shape and drive Internet Governance in Brazil.  All of our principles are structured by the human rights perspective.  One of the principles is human rights, privacy, and freedom of expression.  So we are absolutely committed to that goal.  And those principles are inspiring new legislation in our country, one of them being the Marco Civil framework.  It says very clearly that among the most structural and most important rights to be respected in the online environment is the right to privacy. 

Unfortunately, we don't have specific legislation regarding data protection and the protection of personal data.  We have had a lot of discussion in Congress, but the law has not been passed.  So I'm sure that this meeting can help a lot and is a great opportunity to share our experience, to get an updated picture of how this challenge has been evolving since our resolution was approved last year, and I hope that we can have a great discussion and a positive session this morning.

>> THOMAS FITSCHEN:  Thank you very much for these words of welcome.  My name is Thomas Fitschen.  I'm Germany's director for United Nations, Cyber Foreign Policy and Counterterrorism.  Please read these as separate issues to avoid confusion. 

Thank you very much for coming here.  The workshop today on implementing core principles in the digital age is a product, as my colleague already mentioned, of the very close cooperation between Brazil and Germany on the challenges for the protection of privacy in the digital age.  As you know, we were together, with a couple of other countries, Norway, Switzerland, Austria, the sponsors of two groundbreaking resolutions in the General Assembly and of the process in the Human Rights Council that led to the creation of the Special Rapporteur on privacy. 

Human rights concerns figure very high on the IGF agenda.  We witnessed that yesterday already and, of course, today in the other workshops.  That's why we didn't want to have just another workshop on privacy.  Bearing in mind the magnitude of stakeholders present during this week, we decided to open up the discussion a little bit, first of all so as not to constrain our presenters too much, and second, to give them room to address other things as well.  The idea is not to stay at the conceptual level.  We have discussed notions and concepts for a really long time.  We really want to break them down: take the big notions, privacy, public participation, democracy, and look at the practical side of them. 

How can we turn these big notions into action?  For that purpose we have invited five renowned experts who will be our presenters today.  The first will be Professor David Kaye.  I'm not reading the entire CV; you can read it on the net.  Of course, most of our presenters are very well known.  He is the UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression.  In that position, his first report was on encryption and anonymity and how they relate to the protection of rights on the Internet.

The second speaker, Niels ten Oever, is from Article 19.  In their work they are looking at how the creation of technical standards can be inspired by human rights.  Our third speaker is Nanjira Sambuli from Kenya.  She works on governance and technology, and from long experience as a blogger and Civil Society activist she can tell us a lot about how to put the principles into practice and how to really work with politicians, with Civil Society, and with others. 

And the fifth presenter will be Professor Joseph Cannataci.  We will start with the four that I have just mentioned.  I would like to invite all of them to make a brief statement.  This is an incredible group of knowledgeable people around this table.  We could probably discuss these issues with them for a whole day.  Unfortunately, we have only 90 minutes.  So I urge all discussants and everyone else who wants to join the discussion later to be extremely brief. 

To skip the basics, let us presume that IGF participants already know a thing or two about these issues.  Instead, please try to give us, let's say, your three to five (or maybe more) most important insights, the takeaways that we should all take home and reflect upon. 

If you exceed the time limit of five minutes that was given to you all too visibly, I will hate to, but I will have to, remind you that our time is indeed limited.  I brought my Swiss watch, because I was in Geneva before.  So expect the worst.  Having said this, I would like to give the floor to our first presenter, Professor Kaye, please.  The floor is yours.

>> DAVID KAYE:  Thanks.  Thanks to Brazil and Germany for hosting this panel, and especially for your real energy in Geneva and around the world on the issues of Internet freedom and Internet Governance.  There is a little bit of pressure in being the first person to present on principles, so I'll really try to keep it under five minutes.  Hopefully this will be a kind of dynamic and open and interactive roundtable. 

So I'll maybe list these rather than give a general set of principles.  The first thing I would say is that the idea of an open and secure and multi‑stakeholder Internet is clearly under pressure around the world today.  It's under pressure from governments.  It's under pressure from social situations, from societies.  It's under pressure from corporate actors, and it may even be under pressure in our universities and elsewhere.  This is a general problem with freedom of expression, where I think there are kind of worldwide emerging doubts about the value of freedom of expression. 

And so my first general principle, and one of the reasons why I was so delighted to see this panel talking about core principles, is that it's important to go back to those core principles.  Why, 10, 15, 20 years ago, when the Internet really emerged as a global force, did we welcome it so much?  We welcomed it in large part because it put the control of information into the hands of users.  And when we talk about users, we're talking about individuals.  We're talking about individual citizens around the world.  And that idea is something that is under threat, and it is something that we need to consistently, I think, defend. 

So in terms of very specific areas where we need to defend that, and to defend an open and secure Internet against government control, against control of access to information, against censorship, there are a few specific areas.  For those familiar with my report to the Human Rights Council in June, my points won't be of any surprise.  But I think the first is to focus on digital security. 

So that means encryption and anonymity, which in Brazil is disfavored constitutionally.  So I know that would be a very big process to change.  But I think that both encryption and anonymity need to be reinforced around the world.  Clearly, encryption and anonymity are under threat, as they are perceived to undermine law enforcement and legitimate intelligence activities.  But they're critical.  They're critical, I think, because of that initial first principle: they provide a zone of privacy for individuals to pursue information, to pursue their interests, to do searches in a private space, that otherwise they wouldn't have. 

So how do we do that?  How do we make the case for encryption and anonymity in an environment where really the dominant theme is counterterrorism and law enforcement? 

I don't have any magic answer to that problem.  I think the first step is for us, and I'm really glad to see at IGF there's so much discussion, there's a real place at the table for human rights principles now, to continually restate the importance of these tools. 

Now, I come at it not from the perspective that Professor Cannataci might come from, which is privacy specific.  I'm thinking of privacy in a kind of instrumental way.  It's creating zones of privacy in order to have freedom of expression, in order to maintain freedom of opinion.  So, again, I don't know how to do that.  I think that would be a very valuable conversation: how do we roll back the real intense opposition to encryption and anonymity worldwide?  How do we make the case that they are fundamental to circumventing all sorts of pressures, circumventing censorship, and allowing individuals access to information? 

I'll say one more thing on the general principle, and thank you for raising the human rights resolutions in the General Assembly: it's become trite now to talk about how rights offline apply online.  The General Assembly said that, the Human Rights Council said that, and I'm not sure they actually believe it.  And I think this is the key, because creating zones of privacy is something we all understand in a physical space.  We all understand how that's possible.  But in the old days we would read the newspaper.  Now the newspaper reads us.  That kind of situation is prevalent.  Whether we're talking about surveillance, or about taking away our rights or our ability to use encryption and anonymity tools in order to have that private space, I think we need to restate that those tools are really analogous to the kinds of tools that we have in the physical world. 

To the extent that we can restate that and maybe even think here about how we can operationalize that, I think that would be really quite important.  So thank you for the time.

>> MODERATOR:  Thanks very much for the opening statement and also for the brevity of it.  As someone who was involved in drafting the resolution, I would say that at least my government, and yours as well, does believe that this applies, because if it's a human activity, it's covered by human rights.  It's as simple as that.  So the old dichotomy of thinking between us and them, us as individuals and, well, them, whoever that is, needs to be perforated, and the two sides, the multiple sides, need to get into a better dialogue. 

Thank you very much for that.  The second speaker, as agreed, is Niels ten Oever.

>> NIELS TEN OEVER:  Thank you for creating the space.  I realize it will be very hard to come after Professor Kaye with these statements, but I hope I can get into a bit more of the nitty‑gritty of where we're at.  I think it's very important, even though we all know it, to really remind ourselves that technology is not value neutral.  Also, the implementation of technology is not value neutral.  And human rights are not only an issue for Civil Society.  They are an issue for all stakeholders on the Internet. 

So the Internet, as we see it, is a tool for freedom of association, but we can ask ourselves: is this by intention or by coincidence?  If we go back to early technical documents, Internet standards, also known as RFCs, we read in RFC 1958 that the Internet is a global network: the goal is connectivity, the tool is the Internet Protocol, and the intelligence is end to end rather than hidden in the network. 

But as the scale and industrialization have grown, other world views started to compete with these values.  So what is our task right now?  It is to make rights on different levels explicit, show what they are, and try to operationalize them as well, across the architecture, the market, the legal environment, and the users that make up the whole Internet ecosystem. 

Here we chose to focus on architecture.  An important part of the architecture is open Internet standards.  In Internet standards, or Internet protocols, there are already security considerations.  Luckily there are now also guidelines for privacy considerations.  But there is no such thing as human rights considerations yet.  So we started a research group in the Internet Research Task Force to try to define what these human rights considerations on a technical level could be.  For that, we've been analyzing protocols, interviewing engineers, and we have started to come up with definitions, as you see some on the screen, to make the connection between the legal and the technical, which is crucial if we want to come up with a rights‑enabling environment.

You are all very much invited to join the IRTF research group.  Luckily, we're also seeing right now that at ICANN it's very likely that there will be an explicit commitment to human rights in the bylaws.  If this happens, we'll see, after the transition, work on a human rights policy and implementation.  There are different frameworks being discussed, both in the cross‑community working group on accountability and in the cross‑community working party on ICANN's corporate and social responsibility to respect human rights.  Yes, that's a consensus title. 

We managed to produce a report and some advice, which I have here and on which I would really like to see your input, because even though we have been talking about the Internet and Internet Governance and human rights since WSIS and in the Human Rights Council, we need to understand how it works.  What about the service providers, the IXPs?  How do we define human rights in terms and conditions?  Can we come up with universal standards for that based on the ICCPR and the Covenant on Economic, Social and Cultural Rights?  That's the challenge that's ahead of us.  That's something I would like to work on with this community in the future.

>> MODERATOR:  Thank you very much.  Thanks for recalling the question: once code is being produced or standards are being produced, who decides what goes into them?  Is it really a decision?  Is it coincidence, as you said?  Is it just a predisposition?  Is it something we're not aware of?  Or don't we need, as was said, to make this human rights commitment very explicit?  Thank you very much. 

Now to our next presenter.  I'm sorry, I misrepresented your affiliation.  Maybe you want to correct that for us.

>> JOANA VARON:  Hello, everyone.  I'm Joana Varon from Coding Rights.  I'm going to address a little bit of the topics and the work that we are doing.  First, I'm going to go back to some principles and discuss the challenges of implementing those principles, taking into consideration the legal and institutional environment.  So in Brazil we have the Marco Civil, as I believe everybody knows.  We have the principle of net neutrality.

On the other hand, Facebook is coming here and we are discussing its Free Basics version of the Internet.  How does that affect the principle?  The regulation hasn't been issued yet.  So it's more likely that commercial power will just run over those principles, because the regulation is not there.  The political clash needed to say no to zero rating initiatives is significant.  Because in the short term you are telling people that they cannot access those things, but in the long term you are teaching people that those things are not the Internet. 

It's a difficult balance between the principles and their commercial application in net neutrality.  We also have principles of privacy in the constitution and in the Marco Civil, but we don't have the data protection bill yet.  Wearing another hat, with Consumers International, we did a study with the German government, the Chinese government, and the Brazilian government to discuss consumer protection.  The first step of the study was to map the legislation, the legal institutions, for the three countries.  If you take those three countries together, it is a huge consumer market; if some privacy standards, like the ones from Germany, were extended to that market, at least for consumer protection...  We cannot say too much about the state of the relationship with privacy in China.  It's complicated.  But it's already a market pressure for privacy. 

Of course, the main challenges pointed out in those studies were exactly conflicts of jurisdiction.  Adding to the first challenge that I mentioned, when trying to implement net neutrality, we have conflicts of jurisdiction on many other principles of Internet rights.

Brazil debated the nationalization of data centres, and now we see the decision in Europe to remove the Safe Harbor for data flows to the U.S.  This is an interesting case for implementation and for this conflict of jurisdiction to address. 

Those are legal and institutional challenges.  I have been in Internet Governance for a few years, and sometimes it's frustrating attending these processes, so we thought: let's go to the technical community and try to translate those principles into tech language.  The first interactions in the IETF were very interesting.  I'm not an engineer, and it's always challenging to go to a forum like that.  But there was a lot of openness from the engineers to address human rights.  Now we are in the process of translating human rights into technical language.  Just now at the IETF meeting in Japan we presented a documentary film trying to raise those issues.  I think there are a lot of openings there to discuss as well. 

Thank you. 

>> MODERATOR:  All right.  Thank you very much.  Thanks for reminding us that technology, as the previous speaker said, is not only not value neutral, but also not economically neutral.  And you used the very nice term, the market pressure for privacy: consumer rights as a possible inroad for human rights concerns, for anchoring human rights from the economic side. 

Thank you. 

So our next speaker is Nanjira Sambuli from Kenya, who has experience of practical work for human rights in the digital field.

>> NANJIRA SAMBULI:  I'll start with a bit of a quiz.  I'll say a quote and if anyone knows who said it, please shout it out.  "You have freedom of speech, but I cannot guarantee freedom after speech."  Anyone in the back?  Shout it out if you know who said it.  "You have freedom of speech, but cannot guarantee freedom after speech." 

Okay. 

>> AUDIENCE:  Does it count if we Google it? 

>> NANJIRA SAMBULI:  Now you can.  These were the words of the dictator Idi Amin in Uganda, and I brought it up because, as I was listening to everyone here, we have talked about the legal and the technical, but not about the political operating environments.  One thing that I've heard since I got here is to bring the African perspective to everything I do.  I'll be having lots of shots every time someone says African perspective.  The thing is, we signed on to pretty much every convention and resolution that is there to be signed.  My country, Kenya, is always number one on the signature:  Where do we sign?  Where do we sign?  After that, what does it actually mean?  For me, that quote goes back to the belief systems that conflict with the laws and resolutions that are on paper.  That's the operating environment we are in, where it's give with one hand and take with the other. 

And so you have these other conversations that need to be merged into a very conflicting operating environment.  While I'm actually supposed to be speaking about it from the private sector perspective, I think it's really interesting to see, at least from where I sit, working with a new generation of developers: how do we tell them all these things?  How do I condense this for a coder who just wants to build an app to help the context that they come from?  What does resolution X and agreement Y matter to them?  This is always very interesting.  How do we start bringing all this into something that makes sense? 

What I've observed so far is that these conversations have been very much in silos.  As for core principles, I think even before the digital age, we're still having trouble in the analog age, so to speak, in enforcing them.  Technology at best will amplify what has predated it.  Whatever the core principles, you end up with the classic thing: you want your freedom of speech, have it.  But what comes after is a whole other situation. 

So one thing I said when invited by the government of Germany to reflect on their role is that what would really help with country contexts like ours is more leadership in the enforcing of the principles.  We need a few more people who practice what they preach.  That would really help our work.  Because if I go to my government today and say, you know, we should not have, say, a law ratifying the African Union convention on cybersecurity, whose ideas around encryption and privacy are given with one hand and taken with the other, after that, consequences..... 

They will look at that and say: you want us to enforce this and actually have this as a principle, but look at what country X is doing.  They're supposed to be the beacon of implementing core digital principles, so why should we be the ones to do it?  It becomes an interesting loop, and it's a vicious cycle, trying to figure out who will be the first to implement these core digital principles, and not just in the signing of paper, but actually enforcing them.  I'm here so that we can have another narrative, other than: you can have freedom of speech, but I cannot guarantee what happens after. 

Thank you. 

>> MODERATOR:  Thank you very much for these very tough words.  It reminded us once again that, indeed, there is no difference between human rights online and offline.  If the human rights situation in real life is bad, it will be equally bad in cyberspace.  And you can't hope to solve human rights issues in cyberspace if the general political framework that you mentioned does not change.  So thanks very much for reminding us of that, too. 

We have one more speaker, and I have to apologize.  I missed her in the first round of introductions.  I'm a 19th century person; I'm dealing with paper here.  From Global Partners Digital, which works in London: Sheetal Kumar, also an expert from Civil Society in this field on how we can introduce human rights into political decision‑making.  The floor is yours. 

>> SHEETAL KUMAR:  Thank you.  I have to admit I'm not an expert.  I'm sorry.  But I am, in many ways, I think like a lot of Civil Society members, a concerned citizen.  It's from that perspective that I'd like to speak on this issue.  I'm basically going to be doing a bit of marketing for the work of Niels and Joana here particularly, because I share a lot of their sentiments.  One of the main points that I wanted to make here today is that no, technology is not neutral.  It concerns me whenever I hear those discussions where people say: this is a technological issue, don't you worry about it, Civil Society.  You don't have any idea; you're not a computer scientist.  This is not for you. 

I think it's just not true, is it?  We see it every day.  We see it all the time.  People's rights, people's abilities to live their lives are affected by technology in a real constitutive way, and that is becoming ever more the reality.  And I think one of the issues is that those of us who are not computer scientists kind of see ICTs and the Internet as some kind of magic.  But it's not magic.  It is subject to the laws of natural science and material science, of course, but it is also shaped and designed by people, who are shaped by so many factors: by norms, by the values of the society from which they come, and the politics they bring to the work that they do.  We all do that. 

But more so, technology is not just about how we get from A to B; it's not just a tool anymore.  It's constitutive of our relationships and of living full lives as consumers and everyday people.  Just to reiterate the point made by some other panelists: technologies are neither good nor bad, but they are never neutral.  They are constructed in different ways with different normative implications.  And it obviously affects people in many different ways, with us moving towards a world where we will see more smart technologies.  We'll see advances in Cloud computing, for example, things like personal digital assistants, which people already have, and machine‑to‑machine communications, where we'll have automated decisions happening and not necessarily through a human interface.  I think all of this is really changing the way that the law applies, and the way legal norms are relevant, to technological infrastructure.  I'm not a lawyer either, but that's one area which I find particularly fascinating, again, as a concerned citizen. 

And I just wanted to point out a couple of things from some work that I've come across in relation to ambient law.  This is a growing field, as far as I understand, in academia, and it's a really interesting field of work.  Again, I don't have the time or the expertise to expound on it too much.  But the arguments made by the proponents of ambient law include that we have to find ways to articulate legal democracy and our legal principles, like privacy, transparency, fairness, and nondiscrimination, into the technological architecture that the law aims to regulate. 

One of the interesting aspects of the work, and I don't want to go into too much detail, is that the law, it says, cannot be separated from its technological embodiment.  The way it is inscribed and by whom, the way it has an effect on society, the way it is practiced, is fundamentally an effect of its technological embodiment.  They show how that has evolved from the oral tradition to the written tradition, to the inscription of law through writing and the move from handwritten script to the printing press, and the effects that had on the practice of the law itself.  That's a lot of detail, but I think it's interesting, and I would personally encourage anyone interested to have a more in‑depth look at that work. 

As we move forward, there is a particular point that makes sense of my point.  It's one by Pierre Levy, a media scholar.  He said this quite a while ago, but I think it's even more relevant now:  "As long as engineers stick to the technicalities and sociologists afterwards add some social aspects, the issues that arise from closer human‑to‑machine interaction will not be solved."  And I think we need to be really wary of that as we move forward. 

By the way, this is not new.  There is a lot of academic work in the fields of engineering ethics and value‑sensitive design.  We heard already about some of that.  Critical technical practice, values in design practice, human‑computer interface studies: all of that gives us a good starting point from which to advance. 

And perhaps, in relation to how I see where we could really move forward practically: it is to be open to that idea, to be open to the relevance of all of these changes in technological infrastructure to everyday life, to everyday citizens, and to make sure that the fora where the policy and engineering decisions that have an effect on everyday life are made, which is pretty much all the time now, are inclusive and transparent.  It doesn't mean that everyone involved will necessarily have the same know‑how when it comes to the technical capacity to participate, but those processes need to be open, inclusive, and transparent.  Because if they're not, then we risk having situations where the consequences of certain design and protocol engineering are something we can only respond to when it's too late.  And that's certainly not what we want to see.  We need to be very wary when we hear that all too familiar line, I think: "This is a technical issue.  Don't worry about it.  It's not a public policy issue." 

I'm very happy to expound on some of the latter points, but I think that's all for now.

>> MODERATOR:  Thank you very much.  Concerning your remark about concerned citizens: some in this room may be experts, but I think we are all concerned citizens.  We are all in the same situation.  Thanks for recalling that, as was already said before, technology is not just a tool.  It's a feature of our society.  And the way we deal with technology, but also, and here I come in as a lawyer, the way we draft our laws, is of course influenced by our different cultural understandings.  Bringing all this into the development of technologies, and making sure all of this comes in very early while we develop the technologies and not later, is a very interesting reminder.  Thanks for that. 

So those were the first five presenters.  We are still waiting for Professor Cannataci, who comes around 12.  He'll be given the floor as soon as he's here.  In the meantime, I would like to open the floor.  We have a long list of other world experts, plus concerned citizens, whom I don't want to mention in advance.  Please, when I give you the floor, tell us briefly who you are and which organization you work for, and then we proceed. 

You're the first.

>> MARKUS KUMMER:  My name is Markus Kummer.  I'm here as a member of the Board of Directors of ICANN.  Let me start also speaking first as a concerned citizen and someone who has been hanging around this space in various functions for the past 12 years: working for the government of Switzerland, the IGF secretariat, the Internet Society, and on the board of ICANN. 

Professor Kaye's plea for encryption and anonymity struck a chord with me, and also reminded me of the second IGF in Rio de Janeiro in 2007.  At the end, a representative of Amnesty International got up and made a plea that some people need anonymity to make use of the tool of the Internet; without anonymity they would end up in prison.  That was very much, I think, for me a key experience in the very first years of the IGF.

I also listened to and agree with Civil Society: technology is not neutral.  However, I came to the conclusion that the core values of freedom of expression, human rights, are kind of hardwired into the technology by the founding fathers of the Internet.  It was in the spirit of California in the 60s and 70s, with a very strong belief in these values, that the technologists, without using the words human rights, built a technology that allowed the promotion of these values.  And I heard Bob Kahn, one of the co‑inventors of TCP/IP, considered one of the fathers of the Internet, say once at a meeting of the ITU, when there was talk about security: we did not build the Internet to be secure; we built it to be open.  And this open, interoperable nature is the key characteristic of the Internet.  This is what also allows the free flow of expression. 

I love quoting myself.  I once had the opportunity to say that Article 19 reads like the definition of the Internet, written well before the Internet was invented: "regardless of frontiers." 

Having said all that, I've already been too long.  This issue has now come up in the discussion of the transition.  The argument before was: when the ultimate responsibility was with the U.S. Government, the U.S. Government had the obligation to protect human rights.  If the U.S. Government goes away, ICANN as an organization ought to take on this responsibility. 

Now, the ICANN board, broadly speaking, agrees with that and is committed to human rights, but the devil is in the detail.  Your initial question is how to operationalize it.  These are questions that actually need to be assessed: how does it then impact the day‑to‑day operations of the organization?  But we are committed to the principle and we are committed to working towards this as we move ahead.  We are just not yet clear on how to operationalize it in the discussions that we'll have over the next few weeks and months. 

Thank you for your attention.

>> MODERATOR:  Thank you very much.  You're next.

>> MISHI CHOUDHARY:  My name is Mishi Choudhary.  I'm a lawyer and run a nonprofit in New Delhi.  I'm not audible?  I am audible now.  You don't need to know more about me.  It's okay. 

(Laughter)

So starting from David's point, I just want to take it a little further.  Two points I will make.  I will not be brief.  I think what he's saying, if I understand it correctly, and I stand by it, is that without the anonymity of reading, there is no democracy.  Of course I mean there aren't any free or fair elections, but I also mean, more deeply, that there is no such thing as free self‑governance.  If Frederick Douglass is right, if everybody's going to watch every move, every book you are going to read, there is no self‑governance, and there is no democracy, no such thing as free and fair at all. 

In that context, I just want to say one thing: the idea of social sharing in a context in which your service provider reads everything, and in which everybody watches everybody else, their current or future access, is inherently unethical.  And you cannot just dismiss it as mere business.  And that's a point we probably sometimes forget. 

The other thing which I want to say is that we talk a lot about privacy, and I think Joseph will probably break it down into those three things, which I like: secrecy, anonymity, and autonomy.  Anonymity is secrecy about who is sending and receiving the messages, where the content of the messages may not be secret at all.  And all this leads to something which we call autonomy.  That's what constitutes privacy.  We need to divide it up and then see what we do for each of these things. 

All this cybersecurity dialogue, the human rights issues, they're very, very important, but the fun part, which Markus was also pointing to, is the technology.  I represent all the leading free and open source projects.  One thing which we have learned in the past, one person can tell you in one line, Mr. Edward Snowden, who told us: "What you cannot see, you cannot trust."

And what we have learned is that if you cannot see the software, you cannot trust it.  And one word the Germans can understand, Volkswagen, tells you: if you cannot see the software, you cannot trust it.  If you're going to talk about principles, talk about free and open source software in everything.  Whether it's your root server or anything else, you need to see the source code, because otherwise you cannot trust it. 

And the technology is in the Internet because people are such that they self‑assign innovation.  Nowadays the problem is privacy.  We will be making a presentation about it.  There is an Indian engineer leading a worldwide project called the FreedomBox Project.  It's a router, a $35 router.  This is self‑assigned innovation.  He said: if there is a medical emergency, we always ask, "Is there a doctor in the room?"  And somebody who is a doctor stands up.  Today's emergency is surveillance; privacy is a crisis.  And we ask, is there an engineer in the room?  And he has hesitantly raised his hand.  And the community has joined him.  Technology is now going to innovate and hack to ensure that we have the privacy which we need. 

So whoever is building the Internet, whether it's coders or anyone else, they have already imbibed these principles.  It's for us to recognize them and move in this direction.  Thank you.

>> MODERATOR:  Thank you very much.  I think your reference to autonomy as the value behind privacy and all the others, freedom of expression and all the other freedoms, is a very important one. 

Now, I have two speakers.  You're first, and then down the table.

>> EILEEN DONAHOE:  Eileen Donahoe, Human Rights Watch.  I have basically one quick sentence for each of you.  For David, I want to go to your very practical political question, which is: how do we make the case publicly that privacy is worth protecting in the context of cybersecurity and counterterrorism?  And my answer would be: get different spokespeople from the communities of people who are responsible for counterterrorism, national security, and protection of critical infrastructure, and get them to articulate that privacy and confidentiality are related to the protection of critical infrastructure and essential for national security. 

And by the way, protection of human rights is essential for national security in the international peace‑keeping context.  Just flip it on them. 

To Niels, the first thing I would say is that your starting principle, the RFC 1958 articulation you cited, unfettered connection end to end, is becoming more and more important in the so‑called Internet world.  I would double down: not only implementation, but advocate for the principle itself.  And related is what you said about these vastly expanding roles in, in effect, governance that aren't completely covered by the existing human rights framework.  We need new articulation of the private sector's roles and responsibilities.  Is there a massive burden shift from governments to the private sector?

Joana, I think you raised probably the most complex question.  It's hard enough to articulate why to protect privacy in the context of counterterrorism; it is more difficult in this room, for people who actually believe in human rights.  What do we do when there are tensions between freedom of expression and privacy?  And you brought it up in the sense of consumer protection.  When policies that are genuinely intended to protect and enhance privacy and consumers have the effect of, perhaps, undermining the unfettered interoperable platform, what do human rights people say? 

Nanjira, for me you raised the point about role modeling and leadership.  Every country that claims to be a champion of human rights, or part of the Freedom Online Coalition, or whatever, had better live up to it.  If you don't, it's over, game over.  And you give cover to everybody else.  So that's really important.  Then Sheetal: rights‑respecting by design, both legally and technically.  I think that's the name of the game.

>> MODERATOR:  Thank you.  We'll give our presenters time at the end to respond to all these questions.  So maybe let's just go around the table.

>> EMILAR GANDHI:  Thanks.  Emilar Gandhi, APC.  Thanks very much to the panelists for ‑‑ sorry, a little bit closer ‑‑ for the thought‑provoking inputs.  And I want to pick up where David started, which is the online/offline analogy.  It has been a real victory for those that have tried to win that battle, a battle which was also fought in the IGF.  For those of us, people like Markus, who have been in the IGF process from the beginning, it took a long time to get the level of focus on human rights in the IGF that we have now.  So thanks to everyone who helped make that happen. 

But while I believe in that analogy, I think it can obscure the fact that the implementation or application of those core principles is different in the digital context from the analog context.  There are some similarities.  You always do need human rights law and human rights standards.  You need people to be aware of what their rights are.  Now, that applies in both contexts. 

And I also think you need belief that rights are possible, and I think that's what Nanjira has referred to.  I think it's very hard to create a culture where people demand rights if they've never experienced respect for rights.  

And I think what is quite scary is that that is a reality that many of us in Africa have experienced.  But how many Internet users are actually experiencing that without even necessarily knowing it? 

You also need the freedom to demand those rights, a political context that respects rights, and the other things: institutions, redress mechanisms.  So what are the differences in context?  I think you all touched on this in different ways.  Firstly, I think at the level of process there are real differences.  And I think the very big one is that the notion that the duty bearer is the state, that we as individual stakeholders will demand that the state protect our rights, has changed.  I think Eileen also affirmed that.

The relations are different.  They're kind of almost multilinked or triangular.  Of course, we do want states to hold businesses accountable, but in the Internet context, there's also a direct relationship between the user and the corporation.  Does that become a consumer rights relationship or a human rights relationship or a combination of both?  So that's very different. 

I think the other process application that's very different is the application of the principle of public participation in decision‑making, decision‑making that's going to affect your life.  That's also very different, because the decisions are made in so many different spaces in the context of the Internet.  It's not the traditional state parliament system, and citizens are often not able to engage.  So that's also become quite complex.  Who makes the decisions?  How do you exercise your right to be part of that decision‑making, especially if some of those decisions are very technical?  And they do happen at the IETF.  That's why it is so good that we have rights people there. 

Then I think the other real difference is that we also need tools.  And I think this comes to what David was saying.  We need tools that enable those rights.  The need for those tools is sometimes not intuitive; it's not obvious to people why we need them.  Anonymity, why is it important?  Encryption, why is it important as an enabler of rights? 

Then we need code.  We need code to protect rights.  So that's another added complexity.  So implementation requires code as well. 

Then I think it also requires something which I refer to, for want of a better word (I'm not a lawyer), as translation.  In the traditional rights context, there would be a general understanding that freedom of expression also requires a free and independent media, and that a free and independent media requires protection of journalistic sources and journalists. 

Now, this all changes in the context of the Internet, as David said right at the beginning, when it's the individual user that, in fact, has the control to create and access information.  How do we deal with that?  And I think there are some very simple things that actually a lot of people still don't grasp: that privacy equals data protection; it equals anonymity; it equals, or can be translated into, encryption.  Freedom of expression requires intermediaries not to be held liable for the content that they host.  This is the kind of translation that I think is not done in a systematic way.

Then just finally, to ask the panel, in terms of going forward: do you feel there is value in having a core set of principles that provide some guidelines both on process as well as on actual outcomes, something like the NETmundial principles, at the level of human rights standards?

>> MODERATOR:  Thank you very much.  This really sounded like a sixth presentation.  We have one more so far.

>> BERTRAND DE LA CHAPELLE:  My name is Bertrand de la Chapelle.  I'm the director of the Internet & Jurisdiction Project.  I wanted to hook my comment on the term implementing and getting into operationalizing.  Fundamentally, when talking about human rights, you need to put in place regimes and mechanisms, and it's usually done by treaties of various sorts, at least at the international level, and through national law.  But there is one element that I want to highlight: usually what guarantees the exercise and the implementation of those rights is procedures.  Procedures are not pleasant.  They are not something that people care very much about.  It's a little bit like the technical layer, you know.  Dealing with procedures is not fun.  It's not chic, but it's important. 

And in that regard, one of the things we are confronted with: if we want to have international regimes, we have to go through all the hoops of the intergovernmental procedures.  And I applaud both Germany and Brazil for having been able to carry those resolutions through the whole system.  Unfortunately, the next step is: how do we implement, go beyond the affirmation of those principles, and make sure the documents are not filed away as soon as they are signed, as was mentioned before? 

It means that we're confronted with a situation where we need to develop cooperation mechanisms between the different stakeholders that are in a certain way as transnational, distributed, and collaborative as the Internet itself.  The spirit that infused the communication system that was built, as Markus was saying earlier, was the spirit of openness.  We need to have the same kind of positive attitude in developing the second one, which is not the situation we are in today.  The situation we are in today is that we are developing national laws and international cooperation mostly with the angle of security, and not the protection of rights and the putting in place of the procedures to protect those rights.  It is a challenge: there are security challenges, and there is the conflict of jurisdiction that was mentioned by Joana, because the laws are very different in different countries.  But there is a need to develop the procedures to make sure that the different laws that exist can somehow be applied and coexist in shared digital spaces, but with due process mechanisms for any kind of interaction and any cooperation. 

I want to finish with one thing.  There is an extreme irony for whoever is from a large company in the Internet space today.  Because on the one hand, they are being, legitimately or not, it doesn't matter, accused of assuming control of the setting of the parameters of human rights and of being the ones who make decisions on what can be published or not published.  Some governments are very keen on saying this is not acceptable, this is the role of the judiciary; at the same time they are being asked by the judiciary to make those kinds of decisions, as we saw.  Likewise, from the Civil Society side, there is this argument that there shouldn't be private actors that make those decisions; at the same time we are expecting them to uphold human rights and secure these rights.

We had a discussion yesterday evening, and a very interesting question is what impact a country or even a company can have beyond its borders.  According to the principle of sovereignty, you're not supposed to have an impact beyond your borders when you exercise your sovereignty in your country.  And it seems completely okay to want to prevent an impact that reduces the enjoyment of human rights in another country.  But then there's a very tricky question: what should we do if the behavior of a company or a state has a transnational impact that opposes human rights in one place but improves the situation in other countries?  I just want to highlight here that the question of implementation really needs due attention to procedures, due process, and cooperation mechanisms.

>> MODERATOR:  One more speaker, please.

>> FRIDA ORRING:  My name is Frida Orring.  I am from the European External Action Service.  Thanks for choosing this very broad topic for today's discussion.  Since we have very little time, I'll jump quickly into some aspects, touching upon what David Kaye started with.  In the EEAS, I deal in a team with cyber issues broadly, from cyber defense to human rights online; we are four people doing it.  So bear with me.  I think what we have seen in the developments of the past years, and the recent developments with terrorist activities being more prominent and closer to Europe as well, is that there are two silos here.  It's a very good discussion to have here, but I think we're mostly preaching to the choir when it comes to human rights. 

On the other side, there are cybersecurity conferences and counterterrorism communities which do not have a single human rights expert in the room, and I think it would probably be more efficient if we could also be part of those discussions, rather than hoping that they will travel to Brazil for the IGF, even though I thought it was worthwhile. 

So this is one remark I would like to make.  Because you see now, in terms of encryption, and I read Professor Kaye's report, that the discussion in the counterterrorism community is still about: how can we get around this?  Can we talk about back doors, etc., etc.?  While if you, perhaps, would say encryption is important for the end user and for societies as a whole, then what other options are there for counterterrorism, to make sure that the focus is in the right place in those communities as well?  That was just a small point.  Thank you. 

>> MODERATOR:  Thank you very much.  I think the contribution about procedure will be a nice bridge to Professor Cannataci's work.  In order to give him a little more time, and to give him a flavor of what we have been discussing, I would like to invite our panelists to briefly comment on what they have heard.  Any new ideas, any replies you want to give before we give him the floor?  Who would like to start on our side?  Niels?  Go ahead. 

>> NIELS TEN OEVER:  There is so much great input that I find it hard to choose.  I'll definitely try to keep it brief.  Markus, I completely agree that it seems that the early developers of the Internet had this mindset hardwired and tried to get it into the code.  Unfortunately, they didn't make it really explicit, write it down, and operationalize it.  We need to get into the nitty‑gritty: the proof of the pudding is in the eating. 

That brings me to the point that you put quite a lot of responsibility on governments.  Even though it is the responsibility of governments to protect human rights, there is also the responsibility of businesses to respect human rights.  We need to maintain the multi‑stakeholder environment and not put all the responsibility on governments, because that can have quite adverse implications.  So I think we should indeed, as the representative from the External Action Service said, help build capacity with the private sector, of which we indeed do not have many representatives at the table, which is really a pity, to increase the capacity to come up with these standards, so that they can really respect rights and we do not need government interference in that. 

On the point on consent: it's incredibly problematic.  We are still in the puberty of the Internet, even though we've already been going for a while.  I know what I would like to share with you when I'm on the panel.  I can also navigate what I would like to share with you when we have a drink in the hallways, and that might change when we have a drink in a bar.  But when I get on the Internet, it's much harder to navigate what I share with whom.  Those variables are still things that we need to define and get accustomed to.  It will take years or perhaps generations to negotiate. 

Eileen, lastly, indeed there are a lot of potential tensions between privacy and security, even though in my humble opinion maybe fewer than is sometimes presented.  But when we make these choices, I think it really is crucial that we have tools and standards to make explicit what choices we make.  When we make such a difficult choice and we make the reasoning explicit, then if our reasoning changes, we can also make that explicit.  That is really crucial and furthers our thinking as well. 

There, again, I think we have a partner in the technical community, because RFC 1958 is a technical document that comes from the early technical community.  So I think these communities might be closer than we think.  We just need the language and the practices to bridge them. 

Sorry for taking so long.

>> MODERATOR:  Thank you very much.  Making things explicit allows accountability.  That's the point.  Who is next?  Whoever feels like it?

>> NANJIRA SAMBULI:  I had a whole other thing to say, but maybe I'll pass it back to Niels.  When we talk about governments' role, it sometimes seems we are trying to pass on a very technical responsibility to them, rather than what they bring to the table, which is political enforcement.  And so when we say we should or should not have government at the table, or we are passing things on to the government, I think it's important to distinguish: are we passing on the technical enforcement solely, which means they have to be more techie in that sense, or are we talking about the fact that nothing, again, will really come to have life breathed into it without the political environment enforcing it?  Back to my point about freedom of speech: you can have it, but after that, what happens? 

So what she said is actually really true.  But my question then becomes: we always point out that we need to have these multi‑stakeholder approaches and conversations.  So where are they happening?  Are we going to have the cybersecurity guys locked up with the techies?  Summarizing what Mishi said: start upholding those principles and embedding them in what you're doing.  I should give a shout‑out to the free and open source software developers across the world who are embodying these things without having read RFC number something‑or‑other; they're doing this in practice.  At the end of the day, who is going to read these documents to actually enforce them?  How do I pass them on to the techies and the new coders who are just getting into this thing?  What is RFC 1958, what does that actually mean to them, and how do we pass this on so they're thinking privacy by design and, if and when necessary, how they implement these core principles in practice?  So I'm really interested in seeing what the ideas around that are as well.

>> MODERATOR:  Thanks a lot. 

>> JOANA VARON:  I'll come back with two points.  I completely agree that even though we say that human rights apply offline and online, there is a translation needed.  That's what we're trying to do in Brazil when we talk about anonymity, because the constitution forbids anonymity in the exercise of freedom of expression.  But what is anonymity online?  Anonymous to whom?  Mishi mentioned anonymity about who is speaking, but not about the content; but perhaps in the chain of communication other intermediaries, or some of the intermediaries, know who is speaking.  So it's not complete anonymity.  Most of the time we are dealing with some sort of anonymity.  So I think this kind of exercise is needed.  That's how we are trying to address that terrible provision in our constitution, so as not to block fundamental tools that we need online for freedom of expression. 

So that's one point.  The other point, on cybersecurity and the balance: I want to call the attention of all of you to the process of the WSIS review, particularly the session on cybersecurity.  It's very heavily based on state intervention for security and has little consideration for human rights.  And that's something that has changed a lot if you consider the WSIS documents from ten years ago.  And I'm sure that if you go down that path, then you'll have more governments interfering in fundamental rights and framing it as counterterrorism.  Those are the two points I wanted to make about setting the rules for implementation.

>> MODERATOR:  A quick reply.

>> DAVID KAYE:  I think those were all valuable comments.  In a way I want to take off from Emilar's point about the norms being the same.  The same norms; and it goes back to what my co‑panelists asked, which I think is the fundamental question: who enforces them?  ICANN is doing work in this space.  I think to the extent there is a huge amount of agreement around the room, which I suspect there is, as we said, there is some preaching to the choir in the room.  I do think there's a significant amount of uncertainty as to how the rules apply in the corporate sector, and this question of whether it's consumer rights or human rights is really essential.  Because if we're really just talking about Terms of Service and a contractual relationship, it's one thing.  But if we're talking about community standards, that's another thing. 

And so we're actually going to launch a project that focuses on how it's possible to extend and refine the principles in the ICT space.  That is something that is on our agenda to do specifically with freedom of expression.  I do think that that's really the biggest challenge. 

I didn't mean any disrespect to the resolutions that lay down the principle that rights online and offline are the same.  I think Emilar articulated it better than I did:  the implementation is the key point.  Because of the multiplicity of actors who are responsible for it, we really need to turn to that specificity on the technology side, on the corporate side, and even on the user side as well.

>> MODERATOR:  Thank you very much for outlining the way ahead a little bit.  Then we will turn to the professor.

>> SHEETAL KUMAR:  I'll try to be very brief.  I wanted to pick up on the question of whether we need a new set of principles.  I think perhaps yes.  But I think we also need a much more nuanced understanding of the implications of advances in new technologies for our rights as they already exist and as we understand them.  To give a particular example, Mishi made a really important point about how, at a more granular level, you can see privacy as secrecy, autonomy, and anonymity.  What has been concerning me about anonymity, the way we understand it ‑‑ perhaps this is just an issue amongst certain people ‑‑ is that some people see anonymity as quite a clear, black‑and‑white thing.  So if you don't give up your name, or you don't give up certain address details or something, you're anonymous.  But actually, with the sheer amount of data that is being collected, aggregated, stored, and analyzed, there are ways to triangulate data now.  The advances in technology are incredible, and the technologies can actually bring up patterns in this amount of data that pinpoint one individual among others.  So that is a point we need to be careful of.  Having a more nuanced understanding of the complex effects of technologies on our rights ‑‑ I'm not sure if that just opens up a can of worms as opposed to actually responding to anything in a constructive way, perhaps.  But there we go. 
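
To make the triangulation point concrete, here is a minimal sketch in Python of re‑identification by linking quasi‑identifiers across two datasets.  The records, field names, and datasets are entirely hypothetical illustrations, not drawn from any case discussed in the session:

```python
# A minimal sketch of re-identification by linking quasi-identifiers.
# All records and field names below are hypothetical.
from collections import Counter

# "Anonymized" records: names removed, quasi-identifiers kept.
health_records = [
    {"zip": "58000", "birth_year": 1984, "gender": "F", "diagnosis": "asthma"},
    {"zip": "58000", "birth_year": 1991, "gender": "M", "diagnosis": "diabetes"},
    {"zip": "58039", "birth_year": 1984, "gender": "F", "diagnosis": "hypertension"},
]

# A public, named dataset (e.g. a voter roll) sharing the same attributes.
public_records = [
    {"name": "Ana", "zip": "58000", "birth_year": 1984, "gender": "F"},
    {"name": "Bruno", "zip": "58000", "birth_year": 1991, "gender": "M"},
]

def quasi_id(record):
    """The combination of attributes used to link the two datasets."""
    return (record["zip"], record["birth_year"], record["gender"])

# Count how many health records share each quasi-identifier: a count of 1
# means that combination pinpoints a single individual.
counts = Counter(quasi_id(r) for r in health_records)

for person in public_records:
    key = quasi_id(person)
    if counts.get(key) == 1:
        match = next(r for r in health_records if quasi_id(r) == key)
        print(f"{person['name']} re-identified: {match['diagnosis']}")
```

Even though no names appear in the first dataset, any combination of attributes unique to one person functions as an identifier once a second, named dataset shares those attributes.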

But on the point of how we need to break down silos between the cybersecurity discourse and human rights:  absolutely.  For me, democracy is not just about the ballot box.  I understand it as a system where those who are affected by the indirect consequences of a decision or action actually find a way to participate in those decisions so they can affect the outcome.  Cybersecurity, and all of the decision‑making that is occurring in relation to it, is affecting all of us, and we should be able to make sure that we have a say and that at least we are present.  How that's going to happen, I think, is by us continuing to push for it.  That's really important. 

Sorry, I just really want to make a last point about some of the issues that have come about in terms of the gathering of large amounts of data, and big data.  We need to be aware in policymaking of the disproportionate harm that can be caused to marginalized and at‑risk groups as opposed to other groups.  Those of us who are privileged may not always understand the effects of things like profiling, how certain technologies have actually helped with that, and how that can perpetuate existing inequalities.  It might be the poor, the young, people of a certain affiliation, and we need to be aware of that as well.  Where policy exacerbates those inequalities, I think it's a bad policy.  We need to be more sensitive to that.

>> MODERATOR:  Thank you very much.  Thanks to all for this first round of responses.  It's our particular pleasure to welcome Professor Cannataci.  As you know, he is the Special Rapporteur on the right to privacy and needs no further introduction.  So I give you the floor right away.  Please excuse us if we eat a little bit into the lunchtime, but I hope the food for thought is much better than the food you'll be served outside. 

The floor is yours.

>> PROFESSOR CANNATACI:  This is a difficult one, in the sense that the last 30 seconds have been about lunch and about delaying lunch.  But over the past half an hour that I've been listening to you, it has been very interesting to see how much agreement there is, and yet at the same time how much realization that we've come this far and yet, to a certain extent, can perhaps get no further.  So if you're asking me to be a bit provocative, here I come. 

So let's see ‑‑ I'll be speaking more about this this afternoon, but when I was listening earlier, one speaker was complimenting the governments on having gone through the hoops, as he put it.  He was saying there are procedures and those procedures have got to be respected.  And I think that is a very important point.  While I was listening to Niels speaking about what different actors can bring to the table ‑‑ that's fine, but I think one should accept that there are some things which only governments can do, right?  And that, I think, is one of the key realizations where we are, ten years after the first IGF.  Ten years down the line, we have been experimenting with the IGF and with self‑regulation and co‑regulation. 

Frankly, ladies and gentlemen, it takes me back to some of the publications I worked on 15 years ago about self‑regulation on the Internet, and also to some doctoral dissertations that I supervised.  When I look at that work, there is a realization that self‑regulation and co‑regulation, but especially self‑regulation, can only go so far.  There comes a point when you sit down and say, "Does it work?  Or doesn't it work?" 

This is, to a certain extent, the point which I understood David Kaye to be making.  Fine, you have the norms, but then how do you enforce them?  Because the minute that you decide to go across a border, this is when you get into some sticky territory.  And those of you who have been following me around the rooms at this IGF will know that I am trying to keep my message simple.  But the results can be quite complex.

What is my message?  My message is that when I talk to citizens out there, when I look at the research that we have carried out over the past five to ten years with citizens around the world ‑‑ asking them what they want of the Internet, what they expect given their concerns, what they expect regarding surveillance, how they look at surveillance ‑‑ the answer boils down to, you know, I want two things:  I want safeguards and I want remedies. 

Quite simply ‑‑ and this will be familiar territory to anybody in the room who has worked either as a lawyer or indeed as an IT person ‑‑ my first reaction to this is that if we are working in an Internet without borders, and that is clearly the case, then citizens expect safeguards without borders.  And if something goes wrong, they want remedies across borders. 

Now, we just have to sit down and say responsibly, sensibly, can we deliver those safeguards without borders and especially those remedies across borders without involving governments.  With all due respect, my answer is clear, and my answer is no. 

With all due respect, this does not leave out, of course, the corporations or the other stakeholders.  Of course it doesn't.  But there are some things which governments can do which corporations cannot do, and there are some things which corporations can do which governments cannot do.  So the first thing that I would ask everybody to think about is that if what I'm saying is right, then we have to go to the next set of hoops, as Bertrand was referring to. 

The next step is that we have to face facts and say, "Listen, guys, there have been other difficult subjects on the planet.  Internet Governance was not necessarily the most difficult one.  Did we shrink back from saying that the only way forward is an international agreement?" 

Have we tried to negotiate an international agreement on climate change?  It's taken a long time.  It's gone in fits and starts.  But now we're seeing some improvement and some impact.  Of course, we are.  Is everybody on board yet?  No.  But does that mean that we stop doing things?  Of course not. 

Because in the end, you can only get so far with self‑regulation.  In the end, you can only get so far with the Internet Governance Forum.  In the end, you have to stop talking and start acting.  And I think that this is where we are.  The next subject that was raised ‑‑ the lady from the External Advisory Service has left us, but she talked about forming part of the cybersecurity world.  Well, by an accident of history, I have been part of the security world for the past 20 years.  I've been trying to build bridges between the privacy communities and the data protection communities and the law enforcement communities and the security and intelligence services.  And all of these have an important part to play in life. 

They have a legitimate role to carry out, but that doesn't mean that, because they have a legitimate role to carry out, we should shrink back and say there is nothing more we can do to improve things.  When we look at the history of policing, when we look at the history of the intelligence services, it has been one long attempt at refining things.  And refining things is very important.  Did we get it immediately right when we set up our police forces?  Of course not. 

Did we immediately get it right when we set up our secret services and intelligence services?  Of course not.  Does it mean we have nothing left to do in fine‑tuning the way we have oversight of them?  No.  And what I'm also saying is that I resist talking about privacy versus security.  I think it's privacy and security.  I think that privacy and security should complement each other, as freedom and transparency do. 

In the room next door, which I just came from, I had something to say about this almost‑obsession with balance.  I don't know how much you guys like cooking, but, man, do I love eating.  And when it comes to making a good dish, you don't necessarily say, okay, I have now put in 50 percent of one thing, so I'll put in 50 percent paprika.  That's not the way you cook.  You put in things appropriate to what you are cooking. 

I don't see why we don't apply similar approaches to the Internet.  Take the motor car as an analogy.  We've spent years making the car more efficient, faster, more comfortable, more convenient, but did we invent safety belts very early on?  No.  We only started bringing in safety belts in the 60s and 70s, some 75 years after the car was invented. 

Did we bring in airbags immediately?  No, we didn't.  Did we start removing the wood from cars and putting in padded materials immediately?  No.  It has come in slowly but surely.  At first we tried other checks, but they weren't working.  As a matter of public policy ‑‑ and if you look at the mandate of the IGF, the first line is about public policy ‑‑ we intervened with the manufacturers of the motor car to try and make the car a safer piece of technology to use. 

To my mind, when we're looking at the Internet, we are more or less at the same place where the motor car was, let's say, 50 years ago.  It's got brakes ‑‑ the first cars had brakes, too.  Now they have got much better brakes, and they've got ABS and a whole load of safety features.  And it seems to me we have been incapable of putting in, on our own, all the safety features which we desire. 

Don't get me wrong.  I'm not saying that the law is the only solution.  Far from it.  I'm also saying we should work with our colleagues on the technology side to make sure that we have the right technical safeguards, including encryption, which should be there, encouraged and promoted, because encryption will help make the Internet a safer place. 
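
As a concrete illustration of encryption as a technical safeguard, here is a minimal sketch in Python.  It assumes the widely used third‑party "cryptography" package; the key handling and the message are illustrative only, not a recommendation for any particular deployment:

```python
# A minimal sketch of symmetric encryption with the "cryptography" package
# (pip install cryptography).  Fernet provides authenticated encryption:
# ciphertext cannot be read or undetectably altered without the key.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # secret key; whoever holds it can read messages
cipher = Fernet(key)

token = cipher.encrypt(b"safeguards without borders")
print(token)                  # ciphertext: opaque to anyone without the key

recovered = cipher.decrypt(token)
print(recovered.decode())     # the original message, recovered with the key
```

The point of the sketch is the asymmetry it creates:  interception of the token alone reveals nothing, which is exactly the kind of technical safeguard that complements, rather than replaces, legal remedies.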

That being said, you can't rely on encryption alone and you can't rely on the law alone.  When people ask me, "What kind of law do you envisage?" my reply tends to be, "Well, all kinds of laws." 

You have soft law, you have harder law, and we've had a history in data protection of soft law becoming harder law.  But here is where I go back to what I started with:  We have to face facts.  There are some things where we have failed so far because we were always destined to fail.  This afternoon I'll be talking about why we're destined to fail:  because there are some things the people in this room cannot do.  That is where, if there are some people from governments in the room, we need to turn around and tell them:  we can't do it alone.  We need you to help us. 

And as much as you might hate to do it, you need to sit down with other governments and come up with a sensible solution.  Because under international law, if you want remedies ‑‑ a word which David Kaye did not use just now, but which he implied ‑‑ something must be justiciable.  Something must be actionable.  If you have a norm, then you have to go across and look at the norms.  I don't have enough time, but I understand you were discussing the need for new norms.  Sometimes you need brand‑new norms ‑‑ with the cybercrime convention, we had to say that hacking is an offense.  Sometimes what you need to do is take a principle and develop things from that principle. 

This is an example, I think, of where we should be looking at good practices ‑‑ and the IGF is supposed to enable us to look at good practices and exchange good practices, right?  Just look at the difference, and I'm not saying this because I happen by an accident of history to come from Europe; it's true.  The Universal Declaration came up with Article 12 on the right to privacy.  Yet in 1950, the Europeans came along and said, we're going to have a European convention on human rights, and in our convention it's not Article 12, it's Article 8, the right to private and family life. 

What is the difference?  The difference is that the Europeans did not stop there, because the context changed.  The changes in context included the arrival of computers and the arrival of the Internet.  So when computers came along, they asked:  have we recognized this already, or should we develop it further to provide better guidance and more protection for citizens, for businesses, for governments?  That's what has happened. 

What we have seen in Europe is a methodical approach, led by the Council of Europe and then emulated elsewhere, which developed more rights.  And I think that the way European law developed there, with all its imperfections, ladies and gentlemen, serves as a lovely case study of what you can do well and successfully, and of which mistakes to avoid when you are trying to develop more detailed regulations. 

And I'm also saying that Europe has got to learn from other people, too.  But at this stage, what I'm trying to point out is that you have a regional approach which seems to have worked for some.  And there was a very interesting point made by one of the speakers ‑‑ I'm trying to remember who ‑‑ about what happens when you actually export a good practice or a law.  Now, this is a very important point.  Because what we're saying is that if you have exported a practice or a law, and it's a good practice, my point here is:  a good idea is a good idea is a good idea, no matter where it comes from. 

So who cares whether a good idea comes from the United States or Europe or China or Russia?  Why can't we just sit down, get together, and advance things?  And I'm saying this, ladies and gentlemen, as a veteran of Internet negotiations ‑‑ look at the gray hair and you'll understand how many times I've been around the block, including negotiating several legal instruments.  What I'm saying here, too, is yes, it's difficult, but it can be done.  And the fact that we have done it in the past encourages me that we can do it again in the future. 

Mr. Chairman, you've been generous with your time.  I could go on about this subject for the next 30 years, if you please.  But we don't have 30 years.  We have less than 30 minutes if we want to go and get some lunch.  So I'm going to stop here.  And I'll be very happy to take questions and elaborate further on any ideas which I've put forward.  Thank you. 

>> MODERATOR:  Thank you very much for this extensive package of thoughts and proposals.  When I opened the panel, I asked all of my panelists to please give us three to five takeaways ‑‑ things that we can reflect upon elsewhere and that we can maybe develop and continue.  You have certainly added a big chunk of food for thought. 

I'm afraid the audience is already diminishing; of course, the program is packed at lunchtime, and for many it is not lunch but another meeting.  As you already said, you will be around for a couple of days, so we will have more opportunities to listen to you, to discuss, and to take things up.  I would propose that we conclude our debate now, but you have all heard the professor's invitation to continue the debate ‑‑ which is necessary, not for the next 15 minutes, nor for the next 30 years, but somewhere in between ‑‑ so that we can continue the interaction.  I think that is the common denominator:  all of our panelists say we need to advance the issues, be practical, and look at the political context, at the economic context and economic interests, and at the way our societies are dealing with this.  So I would refrain from trying to sum up anything else. 

Thank you very much for coming.  And first of all, please, a round of applause for our panelists. 

(Applause)

>> MODERATOR: Of course, many thanks to our Brazilian host who made this possible, and to everyone else who was involved in preparing it, in particular, from my team.  So I hope you take home a lot of ideas.  Thank you very much for coming.  And the session is closed.  Thanks a lot. 

(Adjourned)