IGF 2019 – Day 4 – Raum V – Concluding Breakout Session: Security, Safety, Stability & Resilience

The following are the outputs of the real-time captioning taken during the Fourteenth Annual Meeting of the Internet Governance Forum (IGF) in Berlin, Germany, from 25 to 29 November 2019. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 



>> SYLVIA CADENA: Good morning, everybody.  My name is Sylvia Cadena, I'm with the MAG and with the APNIC Foundation.  I'm happy to see you here for the concluding session of the safety, security, stability, and resilience track.  My colleague Ramesh Charia, also a member of the MAG, and I organized the sessions for this track and now this concluding session, trying to capture the richness of the contributions from the community to the community of the IGF, and to see how this idea of organizing the program into the three main buckets, among them the inclusion one and the safety, security, resilience one, actually worked out.

We have a small group today.  The idea for the session today is more about a conversation: dividing the group into sections, into areas, with the idea of discussing what the key takeaways are from the sessions on the safety, security, stability, resilience track, the sessions that you attended or organized in this track, so that we can collect them.

As you may have seen on the IGF website, there is an effort to capture the outcomes of the IGF on one single page: the messages for all the sessions in the tracks.  You can see them on the main website of the IGF, a link to the Berlin messages.  It is a compilation of all of the conversations.

So whatever you discuss in the groups that we are going to set up in a minute will be part of how the final document for the Berlin messages on the safety, security, stability and resilience track is organized.  That will be published three weeks after the event.  Right?

So we have some time to refine those.  I have shared a few Google documents with the organizers of the sessions, trying to capture some of that information.  And the IGF Secretariat is trying to help us capture what is coming in the reports from the sessions.  It is painstaking to actually capture the messages.

Okay.  Okay.

So I'm not going to ‑‑ when I hold it like this, it feels like I am going to sing.  I'm not going to sing.  Anyway.  The idea is to structure the messages that way.  It is important to take notes.  It is important to take time to consider how the messages are coming.  We will do a lot of promotion and it will be input for the next IGF and how the MAG designs the program and if the three buckets work, maybe we will use a similar approach for the program next year.

We need to capture that richness and that diversity from the different sessions that were happening in this track.  I will pass the microphone to the moderators so you can introduce yourselves.  We have two amazing men that will help us with the discussions.

So we are going to divide the group in three.  I will give you a rundown of how we are going to divide those.  I will give a few minutes to Amir and Alexander to share, from their experience of the IGF and how they lived the safety, security, stability and resilience track as facilitators of this conversation, what messages they have seen, and hopefully with the help of the group we will be able to capture more of that.

>> Thank you, Sylvia.  Good morning, everyone.  First of all, safety and security cannot really be concluded.  We cannot conclude them once and for all.  We have to keep discussing and continue with this safety, security, and stability.  That is the main goal of our network.

The day before yesterday, we held this session, there were other groups, and Sylvia confirmed we're creating two groups ‑‑

>> SYLVIA CADENA: Three groups.  I'm sorry.

>> Okay.  There are three groups that will be managed by different people. Alexander.  And ‑‑ should I go through the distribution?

>> (Off microphone)

>> SYLVIA CADENA: So there are going to be three groups.  Alexander is going to facilitate the one on security and safety.  Amir ‑‑ I'm sorry if I pronounce your surname wrong.  Amir.  Right?  Okay ‑‑ will facilitate the one on stability and resilience and technology trends.  And I will take the one on Internet ethics and human rights.  I will pass the microphone, or the two of you can come up here, so you can explain what you have seen in those tracks and introduce yourselves to the groups, and then we will divide the group in three and follow up.  Amir, do you want to go first?

>> AMIR:  Thank you.  So what we have seen this week is the need to create more pragmatic bridges between the communities.  There is a diplomatic language, there is a technical language, there is a policy language.  So this is one issue that we see recurring: many of the discussions reached a sort of terminological uncertainty.  One thing is to recognize the different professions of the stakeholders.

The other thing is to look at the different roles that stakeholders can have in some of these discussions: states, companies, and Civil Society.  There is something a bit missing in the description of Civil Society in the formal nonstate actor world, right?  And we need to look at this issue in a more comprehensive way.  I think the third thing that we have seen is that when we look at the issues from a very practical point of view, as we have done in several areas, there is more common ground than one would think at first blush.  When you look at the big issues through concrete cases, you can get a better understanding moving forward.

We have just finished our discussion in this area, where we have shown that creating common terminology about a specific issue, in our case information sharing, can help not only domestic policymakers, but also interoperability between jurisdictions, without having to go through the bigger, more complex discussions that we may have.

The IGF is a great place for that.  It includes Civil Society and corporations, which are basically the players here.

>> SYLVIA CADENA: So Amir, will sit on this side of the room.  Alexander at the back.  And we will be up here.  So now, Alexander.

>> ALEXANDER:  I would like to continue Amir's point about uncertainty, but I would like to look at it from a different angle.  Technologies are being developed, and their applications have come into our everyday life.  Now we want our everyday life to be really safe.  So safety is really important; at least the feeling of safety is very important.  We want our security measures to be effective, not only from a technical point of view, but in how they affect our society.  So we definitely need to continue the discussions; we cannot stop and think that our documents, the outcomes of the whole IGF, are a final document.  We need to follow technology development, follow society's development, and follow all these interactions between technology and society, and Government also somehow wants to participate in this.  So let's discuss it.  Actually, I think we will be only facilitators, so we will not bring our own opinions.

So please actively participate.  Say what you have seen during this whole week.  We will try to write it down.  And participants who are not here just now can comment on this during the next three-week period.  Please be active, because your opinion could be the one that someone from another country or region doesn't even think about.

So it is important for us to share our findings, outcomes, and possible ways of development in these areas with others, to make their lives easier and maybe even safer.

>> SYLVIA CADENA: Thank you, Alexander.  So on Internet ethics and human rights, two of the subthemes under this track, we will be looking at the sessions that touched on the contradictions, basically, between human rights frameworks and the level of cybersecurity implementation.

During the week we saw a lot of sessions where there was a difficult approach, let's say, to trying to find the middle ground and the compromise in how we move forward, not only with policy frameworks but with policy implementation, around issues that really look at the different angles of safety and security.

There are several examples of how these contradictions play out.  And we can see that in many of the sessions that were presented.

So hopefully we will have an interesting discussion about what we saw in the subthemes of Internet ethics and human rights.  Safety and security is in the back with Alexander.  Stability and resilience, Internet technology and trends, is on this side with Amir.  We can break out into groups, and hopefully someone will very kindly volunteer to take some notes that can be passed to us.  Then we will work from there and reconvene in about 30 minutes to look at what the groups said and discuss together how we can take this effort forward, if that is okay with the group.  Okay?

So should we break into the groups?  And yes, please.

(Small group discussions)

>> MODERATOR: Excuse me.  If the group leaders are ready to give the details of their discussions over here ‑‑ Mr. Alexander, Mr. Amir, Sylvia.  All three groups, please join me on the dais.

You can take five minutes each for the briefing of your group.  I want to give more time to the participants for questions and answers, for the interactive session.  So please join.

Yeah, I will now request Amir to please start.

>> AMIR:  Thank you.  So first of all, a word of caution: all of the participants here know that we are not sure that we accurately collected all of the discussions.  I will tell you the themes I was collecting; the group members are still here, so if I get anything wrong, they will correct me.

One recurring theme is the need to understand the norms processes, which were discussed during this week, and the impact and meaning of the norms discussion.  While recognizing that norms are important for consensus and have been an achievement, there is a continuous discussion as to their effect and the need for more norms.

The other issue is that norms, the word itself and the words used in the norms, have different meanings to different stakeholders.  And we should pay more attention to the relevant context in which they were developed so they can be better used as a means of communication between different stakeholders.

And the last issue regarding this point is that it appears that sometimes there are duplicative discussions about norms, and this can be a challenge.  This leads directly to the next point, which is the need to enable effective Civil Society participation, and the role of the IGF in this context.  We heard that Civil Society has, at the end of the day, limited resources for meaningful participation in these processes, and duplicative processes may strain the way they bring their messages to those that need to hear them.

And at least in the group today, we felt that the IGF is a good place to coordinate these types of Civil Society participation.

The next point is the need to create basic common information about the way problems and challenges are described, so as to enable even better communication.  We have seen that different professions look at the issues differently, and we need to include all of the professions which are relevant to the problem-solving discussion here: not only diplomats and technologists, but also economists and lawyers.  This is a recurring theme because the Internet is relevant in a lot of different social contexts.

The last point is the need to look at capacity-building measures which are not only technical but also policy and legally oriented, because this is the way we can create a common language to cooperate in problem solving in this area.  It will help us promote pragmatic measures which will have a positive effect on stability and security.

>> MODERATOR: Yeah.  I will request Alexander.

>> ALEXANDER:  Yeah, security and safety.  Actually, the previous speaker nearly stole my whole speech, because the discussion, even from a different angle, had nearly the same outcomes.  I will try to combine points.  Again, I warned my group that if they don't like what I say, they can come and comment on the document.

So we started from understanding what actually makes people unsafe, what the threats are, and how to measure the actions of the state.  Can we trade the freedom of speech for security from being harmed by terrorists?

Also, a question was raised, exactly the same as during the introductory session: that the qualifications of law enforcers and security bodies do not keep up with the development of technologies.  This undermines the possibility of enforcing the law, especially enforcing the law globally, and leads to the privatization of law enforcement by private companies that have much more qualification.

Close to this are the topics of education and information sharing, because not everyone knows the modern technologies and modern approaches.  And the most difficult part of this: not all information is available in local languages, or people just do not know where to obtain fresh, updated information.

Emerging technologies were also mentioned: technologies are rising and developing much faster than the human relations through which communication and information sharing happen.  And this brings us to systemic threats, which require new systemic approaches to mitigation.

Then the difficult question of building new trust between companies, countries, and societies rises up, because this is also a source of the difficulty of enforcing security on a global scale.  Participants also mentioned that we need to change our traditional habits and traditional approaches to be sure that we are safe and that we are ensuring the safety of people around us.  The question of norms was also raised, but I think my previous colleague already talked about this.

Also, we spent a lot of time on what we really want to discuss at the next IGF.  And again, the matter of trust, of building trust between states, between stakeholders, of building trust on many different levels, was raised.  As maybe a special note for the MAG, the participants asked about continuity of work: to make sure the next IGF will definitely start from something higher, building on this IGF.

>> MODERATOR: Okay.  What is the name?

>> SYLVIA CADENA: Because I was facilitating this group, someone from the group is in a better position to present the discussion.  My friend here is going to join us and share the conversation from the ethics and human rights group while I take notes.

>> AUDIENCE: Okay.  Obviously, correct me or add any information I may miss.  I'm Christopher (?) (audio skipping) we're a political technology consultancy that has helped several political organizations and think tanks, most recently in Thailand and in Hong Kong on the pro-democracy side.

So we discussed a lot from a human rights and ethics perspective.  And we had a very interesting panel here.

We had at least three people with experience of online persecution, harassment, or other things, (audio skipping) different angles, different experiences on the Internet.

It is really difficult to try to find a balance between all these issues.  So on the one hand, we had someone persecuted by the Thailand Government on Twitter, with Twitter not being able to do anything about it back in the day.  However, we discussed that all these kinds of developments require (audio skipping) and a lot of engagement with the organizations.

And it can be really hard, because on the one hand we want to try to build a completely free space, but on the other hand we have all of these kinds of issues.

We also saw the issue of impersonation (audio skipping) theft used to basically steal your identity, pretend to be you, and say many bad things, harassment.  There seem to be many difficulties in addressing all that.

We also had a very interesting contribution from a teacher here who works with young people, who asked: okay, if digital platforms have all these issues, why don't we find a better product, look for better designs, a better competitor?  And there we faced the conversation about the market forces at play, the bandwagon effect, the social effect.  It is indeed hard to strike a balance.  That is what we found.

In order to continue finding that balance, we need to engage in more conversations and find solutions with all stakeholders to these issues.  Am I right?  Okay.  (Chuckling)

>> SYLVIA CADENA: So I guess now we can open the floor to more questions or comments from the participants in the different breakout groups, and to any messages in particular from sessions that you attended that you would like to carry forward for the Berlin messages.  Remember, the purpose of this session is to gather the insight from the people in the room about the messages they want included in the Berlin messages that will be published in a couple of weeks.  I will walk around with the microphone and I hope we can have a nice conversation among all of us.

If you don't jump, I will give you the microphone.  (Chuckling)  So ...

Okay.  People are thinking.

>> (Off microphone)

>> SYLVIA CADENA: Just to break the ice.  I'm sure people will follow.  People complain sometimes, the IGF, there is not much time for participation from the audience.  So here we are.  We have all the rest of the session for participation from the audience.  So can someone please say something?

>> (Off microphone)

>> SYLVIA CADENA: If you attended any sessions or participated in any of the discussions, and there are things that the facilitators didn't capture that you would like to raise for the reports on the safety, security, stability, resilience track.

>> ALEXANDER: Sylvia, we had breakouts in many different areas.  If you want ‑‑ we had discussions that there cannot be a strict distinction between Governmental, Civil Society, and human rights concerns.  We need to coordinate between different stakeholders.  So be that stakeholder; cooperate.

>> SYLVIA CADENA: Or conversations you had in the corridors that are not captured in the sessions, things the IGF should take a look at.  See if people are worrying or wondering about a particular issue that was not considered in the program.  So it is also a space for those kinds of issues.

>> AUDIENCE: Thanks.  So actually, I really liked the conversation we had in our group.  The idea was raised about security based on blockchain, something like that, and the future of cybersecurity, of security in cyberspace.  Before that, we had a discussion about the role of rights nowadays, (audio skipping) because it seems that they matter much more, and how to keep cyberspace secure.  From a broader perspective, I think the IGF should be focused on how to fill in the gaps in international law.  That is a huge problem, and we didn't solve it.  And actually, the United Nations is the correct platform to do it, in my opinion.

>> AUDIENCE: Okay.  Yeah, my concern is ‑‑ I was going to mention emerging technologies.  Technology will accelerate so fast.  When I look at the way digital supply chains are being created, I think technology will not wait for, or even seriously consider, whatever Governments say or do.

I think data will flow across borders, and it will be a million times faster, with a million times more data.  It will lead to autonomous solutions using AI, not machine learning any more, but autonomous solutions.  And the law will fall behind.

The way society will be impacted by technological advancement will be so traumatic in the next 10, 15 years we need to look at systemic answers to the systemic changes that go beyond traditional approaches, including Government or companies.

>> SYLVIA CADENA: Anyone here want to share thoughts?  Here, you can have that one.

>> AUDIENCE: Should I?  Okay.  So my name is David.  I have ‑‑ I don't know if it is related; I think it is.  There is a lot of criticism toward social platforms, and I think part of it is right.  But I also think there is a big possibility for those platforms to do some good, and they are doing good in parts.  You may have seen that if there is some kind of natural disaster or war, there is a post about it on Facebook.  I think that is very good.  So my idea, my thought, is: why can't we encourage Facebook or Twitter, through the algorithms that understand which topic is trending or being discussed in a group, to work with researchers and show facts about global warming, for instance.

Show facts in those spaces.  I know that is problematic, because it perhaps intrudes on some communities' discussions, but I think the possibility could be discussed.  I would like to know what you think about it.

>> AUDIENCE: My name is William Neville, I work for the Australian Government.

One of the big takeaways this week, across the range of panels and discussions, is the discussion of values and ethics.  You know, we want online platforms to consider the values and ethics of society in implementing new technology.  We want future or critical technologies to be developed in line with values and ethics, and companies to take account of the social contract in their operations.  But I'm not sure it is clear or consistent what those values are across the world, or even across regions.  So I think it would be good next year to have a small, focused discussion on what people mean by particular values: things like privacy, and how important that is to different parts of the world, different people, different parts of society.  And to this multistakeholder group.

>> SYLVIA CADENA: I think the conversation around definitions and a clearer, narrower scope is something that has come up across all three tracks.

So it is good to see some of the correlations between the concerns of the people working on inclusion, the people working on data governance, and our track.  People have clearly indicated what the challenges are in arriving at more definitions, but at the same time they are pushing: okay, we know they're not perfect, but we need some sort of agreement.  That is something that the IGF should work harder on.

>> AUDIENCE: Thank you.  My name is Hans Beckman, I'm a teacher in Germany.  This is the first time for me to take part in one of the IGF meetings, and I have to confess, at the beginning I had to understand the format of the program and who has a say in what: am I allowed to say something?  Do I know enough to give any statement or to take part in this?

But to me, the most important thing, also today and within this group, this room, is that it gave me an impression of the Internet community.  That is to say, I very often do research on the Internet; I use it, as you do, and often I do not think about the dangers and limits or these things.  Now I have come to meet people here from all around the world.  We had people from India, Thailand, Pakistan, I don't know, Sweden.  It makes me really feel part of it, to come together with people who, in the virtual world of the Internet, are not present in that way.  To me it is important.

It is interesting to hear about the problems people have in different parts of the world, and people reported things which are not my problems.

I'm lucky enough not to have been a victim on the Internet so far, but it is very interesting to hear the reports from other people and to think about how we can prevent these things.  I'm very much interested in visiting the next meeting in Poland.  Thank you very much.

>> ALEXANDER:  May I respond briefly.  Please share this information with your community, your stakeholder group, and different stakeholders in your country.  The IGF is not just a Governmental meeting: go to the umbrella organizations, share this information, invite more people, share with them the final documents, share with them the findings of your sessions, and share with them the feeling that you are part of the global community.  Having these discussions is a really good thing for building a safe, secure, stable, resilient, and rights-protecting Internet in the future.

>> MODERATOR: You are a stakeholder, and you have full rights here.  You can speak anywhere, in any session.  Don't forget that.

What I want to say ‑‑ the time is there.

>> SYLVIA CADENA: Just to remind everybody, there is an open microphone at the end, in the closing plenary.  So if you are interested in making a statement, that is also a good opportunity to do it, in the plenary.

>> MODERATOR: Yeah.  What I am able to gather is that, no doubt, Internet communication is a strong medium.  Technology advances at high speed, but ethics, rights, and regulations are not moving at the same speed.  That's the reason someone does a mischievous thing on the Internet and policymakers do not understand how to resolve the issue in a timely way.  This time factor is the biggest factor.  It destroys everything: by the time we are able to recover through policy, the damage has been done.

What I feel is that as Internet users, we should also practice self-regulation.  While doing anything on the Internet, we must control ourselves.

If you put anything on the net and then find that your privacy is infringed, sorry: when you put anything on the Internet, your privacy is compromised.  Don't expect that nobody will misuse it.  What is my suggestion?  Only put those things on the Internet that you feel are for the public domain, not for the private domain.  Along with the responsibility of the Government, our responsibility as Internet users is a bit higher: to protect ourselves from any mischievous thing.  Thank you.

>> AUDIENCE: Thank you, Ramesh, that is an important point.  We all have a responsibility to understand the issues on the Internet.  I'm curious: in this room here, how many people work representing tech platforms?  Anyone from Facebook, Amazon, the big tech companies?  Not in this room.  In the other room.

So I think, personally, what I would like to see is getting them more involved here, whether it is through the audience or through a panel, through discussions.  Because before we can hold them accountable, before we regulate them, we need the conversation with them, too.  Maybe some of them are not intentionally evil, right, or intentionally laissez-faire, not wanting to do anything, but they need to be involved in the discussion.  That is a quick remark as well.

>> MODERATOR: Amir?  Anybody wants to speak, anything?  Yeah.

>> AUDIENCE: My question is open to everyone on the panel.  I'm speaking from an Indian perspective: what are your views on how we can reach the grassroots in terms of security and safety?  One example in India is that there is a huge surge in smartphone use by people in grassroots-level communities.  A lot of them don't have an idea of how to behave online, and these are stakeholders that are creating ripples, while intermediaries are held liable for their activities.  How do we work on reaching out to them?  Also, what is your take on using art as a medium, you know, to reach out to them?

>> ALEXANDER:  I think the IGF is not actually intended to answer such kinds of questions.  It is an exchange of ideas, so you can involve colleagues from different countries.

Understand that the problem exists elsewhere as well, not only in India.  You also need to take back what you heard here and tell it to your Government or Civil Society: the way everybody is taking on education and daily behavior, because compared to the centuries, even decades, behind us, the world has completely changed.  The intention of the IGF is for you to bring up possible issues, discuss them in your community, and next year bring your outcomes back to us, and do this again.

As for the how: for sure, you have the right to ask this question, but I'm not sure you will get the one and only right answer to it.  Listen to everyone, communicate, and provide your own answers along the way.

>> SYLVIA CADENA: If I may jump in to answer the question: in terms of security, one of the concerns of the technical community that I represent is that awareness campaigns should be based on solid technical knowledge.  Awareness is not about raising fear of the technology, but about empowering the community to actually use platforms, protocols, and tools in an effective way.

The emergency response teams around the world, that community does a lot on security and safety awareness.  There are also efforts from some of the platforms and content providers that are here, for example.

You know, these are just examples, as was said; it is about building your own answers and making your own use of whatever it is you find here.

But one example that comes to mind from the companies that are here is Microsoft.  Microsoft's chief safety officer, Courtney Gregoire, is here.  Because of the work Microsoft has done on a driver's license for schools to use its software and Cloud tools for, you know, kids in schools, they have created quite a lot of modules that go from primary school all the way to university: the same issues but for different audiences, using art and animation as a way to explain how things work and what you can do about it.

There are also campaigns from the World Bank and the Council of Europe doing awareness raising on cybersecurity and safety.  And there is, let's say, the digital safety foundation; they have a booth down there, with their manual and all the materials they're using for awareness.  There are a lot of examples that can be used.  From a technical community perspective, what we strive for is that the people preparing the materials, especially for young people, refer to accurate technical information.

>> MODERATOR: I'm from India.  I am the president of the ISP association.  And we are doing this awareness program in our country, especially in the schools and in the rural areas, understanding that 65% of our population is rural.  Digitally, they are connected through the smartphone, but they are illiterate.  How to use it in an authentic way is the main concern.  We are doing that.

All of the service providers in the small districts are doing awareness programs in their respective areas so that, first, people do not misuse the Internet, and second, they do not become victims.  That is the responsibility we have, and we are doing that.  But apart from that, sometimes in our country, I'm using the word (?), it cannot be acceptable now.  We have to control ourselves while using the Internet.  So that is what I have to say to you.

>> AMIR:  I will add just one thing.  Awareness is very important, and this has been rightly pointed out.  But sometimes you need to use the law as well.  What we have seen in Israel is that we needed to impose specific criminal sanctions on the distribution of videos that were taken privately and distributed publicly, even though it was already a criminal offense.  This signified to the users that it is unacceptable and put a specific legal warning against it.  It is part of awareness raising.

Sometimes you need to use the legal tool to signify that this is unacceptable, and that the Internet and the smartphone are not a no-man's land.

The other thing that is also very important is that sometimes the platforms can do more.  This is an ongoing discussion, and it depends on what country you are in.  As users, sometimes, as you have rightly described, we seek quick gratification and do not always think about the future and what the next step is.  That is the human being; we're not always rational.

So sometimes policymakers should think not only about the user with the most awareness ‑‑ everyone on this panel is of course aware ‑‑ but also about the ordinary user, and policymakers can help them not get into a difficult situation.

We see this most importantly with children.  Children, by definition, have less capability to understand the risks and the choices they are given.  You can give them a lot of choices, but they will not use them correctly.  Therefore, we need greater responsibility on the side of the platforms when they're dealing with children.  It is a combined effort.

>> MODERATOR: Thank you very much, Alexander, Amir, and Christopher, for stepping into the session at short notice.  And thank you, everyone, for participating in this session.  Feel free to write anything on the board, any issue of yours.  If you remember something later on, you can put that on the website.  Thank you very much, Sylvia, for this successful session.

I think all the things that we have discussed will go out to all the relevant sites very soon.  Thank you very much.