IGF 2021 – Day 4 – Main Session BPF Cybersecurity

The following are the outputs of the captioning taken during an IGF virtual intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> HARINIOMBONANA ANDRIAMPIONONA:  Ladies and gentlemen, greetings, everyone!  Thank you for joining this session on Best Practice Forum on cybersecurity from wherever you are in the world, online or onsite here in Katowice.

I'm a MAG member and facilitator of the BPF on cybersecurity.  Special greetings to the several hubs with us, particularly the hub of Madagascar from the university.  First of all, I would like to remind you that our session is fully hybrid, so do not hesitate to share comments.  For those who are with us here, you will be able to have the mic and take the floor.  That said, I'm honoured to introduce this session today.

Now, I would like to give the floor to Markus Kummer who is co‑facilitating the BPF with me.

>> MARKUS KUMMER: Good morning, everyone, good afternoon wherever you are.  It's a great pleasure to be here. I would like to just say a few introductory words to put the IGF BPF on cybersecurity in the IGF perspective.

As you will recall, the BPFs were reintroduced in 2014 with the objective of producing more tangible outcomes.  A frequent criticism of the IGF has been that it did not produce outcomes.  And the BPF on cybersecurity clearly has produced outcomes.  It was started in 2016 with an evolving focus, and Maarten will tell us more about that.  It has produced outputs every year and has also shared them with the broader community and with intergovernmental processes such as the Open-Ended Working Group in the United Nations.

And right now I think there is much talk on the future of digital cooperation.  We have the Secretary‑General's Roadmap on Digital Cooperation.  We have the Global Digital Compact on the horizon, and the IGF plus which is supposed to fit into that.

The Secretary-General has called for a more inclusive, focused, relevant and outcome-oriented IGF.  The BPFs in general fit into this scheme, and this BPF on cybersecurity in particular, as it has been very focused and very relevant.  And with that I hand back to Maarten, who is moderating the session.  Thank you.

>> MODERATOR: Thank you very much, Markus, and thank you for attending this Best Practices Forum on cybersecurity, our final meeting of the year.  As Markus has mentioned, this BPF has a wealth of history.  We have been at it for several years, but in the last few years we have focused on the topic of cyber norms, and I will briefly share with you what our history looks like.

I'm hoping you can see this in the room and virtually.  It is December 10th, 2021, and we will be covering a few topics today.  First of all, I will introduce you to the work we have done over the last year.  We will have an overview of the work that happened in one of the two work streams, focused on assessing international cybersecurity norms, and we will have a deep dive into a second topic, where we tested cyber norms against historical cybersecurity events.

And finally, we will have a wide panel session on several of these topics in more detail.  In 2018 the BPF started looking at the idea of norms, and we did that not so much rooted in the ongoing discussion at the time around cyber norms, but we really stepped back to seeing what cultural norms and values meant to different stakeholders.

We looked at the different norms development mechanisms that existed and the places where cyber norms were being developed.  We found a lot of those were state driven, but that there was actually also a lot of emerging development happening in the technical community and civil society that we should not discard.

In 2019 we then looked at norms operationalization. How do norms actually get put into practice and how do organisations that sign up to these individual agreements actually internalize the discussion there to make sure that the norms actually hold value as opposed to simply words?

And that year we also started taking a look at cross-stakeholder agreements that existed in the norm space.  We started looking at normative agreements that have participants from many different stakeholder groups, and how they work together, how they convene, and what they ended up writing down as they put words to the common understandings that were cascading through the community.

In 2020 we took a look at normative principles in global governance.  So what we really did that year was we looked at the very wide history of social norms, and how they actually affected different types of governance outside of cyber.  So we looked at norms around nuclear materials, we looked at various different areas to understand how norms had been effective in other parts of the global community and how we could learn from that when we develop and think through cyber norms.

We discovered as norms were developing, there started to be commonalities between the different normative agreements and we started wondering where do those commonalities come from?  What are the drivers?  And in particular, we have a thesis that norms are inspired by real life cybersecurity events, and we use the word event instead of incident because not necessarily every particular event that may have resulted in a norm developing would have been a cybersecurity incident.

And we wanted to leave it open that some of these events that led to impacts could actually be disclosures of vulnerabilities, discussions, things that wouldn't always fit everyone's definition of an incident, but that would have fairly widespread impact on the community, or at least require us to change our ways in certain respects.

So we brought together three different work stream groups.  First of all, we focused on understanding the different agreements that were in place.  This work, as I mentioned earlier, we had actually been doing for three years.  We simply continued that initiative, led by John Hering and Pablo Hinojosa; John is in another session and Pablo is here representing this work stream.  They made a selection of the agreements that were most impactful from the perspective of cybersecurity events and started analyzing what some of the patterns and commonalities between these agreements are, and, most interestingly, also looked at the things that were less common, more unique or less mentioned across these different agreements.

Over the years we have been increasing the number of agreements we cover, from around 20 to the 36 normative documents looked at this year.  A second work stream, led by Mallory Knodel, looked at historical cybersecurity events.

They started by compiling a list of major cybersecurity events over the last 20 to 30 years and then started diving deep into a smaller number through qualitative questioning, trying to answer the thesis question: how would a specific norm have been effective in mitigating this cybersecurity event?

Both of these groups have published papers, which are in draft and available from the BPF on cybersecurity website.  You can find the documents on the website, accompanied by a short two-pager that I recommend reading if you don't have time for the full documents.

Finally, we have a work stream led by Sheetal Kumar and Markus Kummer, who have been working with other bodies to understand how we can bring what we are learning here to the wider community and get greater value from it, and how we can bring new participants into the Best Practices Forum to work together on these issues and produce higher-quality analysis of the situation that is emerging in this very complex cybersecurity world.

So if I would have to summarize the work that we have focused on this year, it is really about identifying how the rules of the road actually meet the rubber of the road.  So where is it that these rules are truly taking effect, and where are they having value?  Where are the challenges?  This is what our team spent this year analyzing.

Now, with that, I mentioned that we have a really interesting session coming up.  I will hand it over shortly to Pablo, and he will talk about the overlapping analysis of international cybersecurity norms agreements.  We will then go into that interesting analysis of how these norm concepts have held up against cybersecurity events, and what we learn if we take new norms and apply them to older events.

And finally, we will go into a panel and enrich the conversation about these topics, with the goal of taking what we learned from the esteemed panelists, from the different volunteers who worked on the effort, and from all of you, and bringing that back into the final papers we will publish after the conclusion of the Internet Governance Forum.

So thank you so much for being here today.  It is you who makes this effort worthwhile.  It is you who helps us make this as effective as it can be.  So we really greatly appreciate your time, and we hope that you find what we have to share here today interesting.  Thank you for joining, and Pablo, I will hand it to you for the next part of the session.

>> PABLO HINOJOSA:  It would be nice to be in Katowice close to many of you.  So I am going to present some of the findings from work stream one, and I want to just make sure that you can see my presentation; can I confirm that you are seeing the proper slide?  So it's Eneken and myself talking about the work of work stream 1 that Maarten introduced.  I will talk about the quantitative part of the study, and Eneken has done a good job adding a qualitative layer to this work.

This work has been mostly led by John Hering, who unfortunately has a scheduling conflict at this IGF with another session he is organising, probably in a room close to this one.  So I will try to do justice to the work that he led.  However, there was a very good number of volunteers who pulled together most of the work for this work stream, and this was a best-effort, voluntary type of work which was crowdsourced and well organized by John.

Recognition to him here.  So as Maarten said, we continued to unpack a collection of norms, and work stream one was tasked with the mapping and analysis of these norms and agreements.  Something that is worth remembering is how the Best Practice Forum collected these agreements.  There were specific criteria: we thought, well, these have to be, firstly, international in scope.

They have to have a mission to improve the overall state of cybersecurity, and their recommendations should apply to all groups signing these agreements.  Something that is very important: this collection of agreements does not include treaties or conventions or other legally binding agreements between countries.

We only focused on voluntary, non-binding norms for cybersecurity.  So this is the collection of norms that we have been analyzing so far, and I think it's a very good collection.  We increased the number from the previous report: in 2020 there were 22 agreements, and nowadays we have 36 agreements to work on.  These agreements are either multilateral, and these are the UN agreements, such as the 2015 report of the Group of Governmental Experts or the recent Open-Ended Working Group agreements.  The other big category is single stakeholder, mostly governmental, but there are also industry agreements such as the Cybersecurity Tech Accord or the Freedom Online Coalition's recommendations.

And importantly, there are cyber norm efforts that have been agreed in a multi-stakeholder setting.  They are not the majority, but there are important efforts there, such as the Paris Call for Trust and Security in Cyberspace, the Siemens Charter of Trust, the Global Commission on the Stability of Cyberspace norms, and we included here the Contract for the Web of the World Wide Web Foundation.  So this is the collection of norms.

This is our data set, and we tried to make progress on analyzing it.  In the previous report from 2020, which I hope you have read, the idea was to map most of these agreements to the eleven most famous norms from the UN Group of Governmental Experts in their 2015 report, but this time we improved the analytical framework and came up with basically six categories of norms.

So we divided them into rights and freedoms, information security and resilience, reliability of products, cooperation and assistance, restraint on development and use of cyber capabilities, and technical and operational norms.

From these six categories, there were up to 26 more detailed, specific norm elements.  Volunteers looked at each of the agreements and tried to sort paragraphs of those agreements into each of these categories.  So this was a crowdsourced exercise, and obviously there is the caveat of subjectivity on the part of the evaluator.

So we tried to promote a shared understanding of what goes into each of the categories, but there is an element of subjectivity here.  The findings are not intended to be authoritative; they are trying to aid the conversation and to improve our understanding of this.

This is John's magic heat map table, and the work of the volunteers went into categorizing paragraphs of each of those agreements into these 26 categories.  This is the heat map.  Of course, you cannot read all of the elements here.  I'm just going to highlight a couple of things.  The elements with the most mentions, or the most paragraphs, are these two: rights and freedoms, human rights in particular, which is quite an interesting finding, and the non-surprising finding that most of these deal with general cooperation.

John ran frequency tables with these percentages, and we can see basically two things that are relevant: again, 86% of the agreements have some general cooperation element, and, interestingly, 69% of them touch on human rights.  This frequency was consistent with the findings of the 2020 Best Practice Forum report.
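As a purely illustrative aside for readers who want to reproduce this kind of tabulation: the sketch below shows one way a frequency table like John's could be computed from a crowdsourced coding of agreements to norm elements.  The agreement names and element labels are hypothetical placeholders, and this is not the BPF's actual tooling, just a minimal sketch of the approach.

```python
# Minimal sketch (not the BPF's actual tooling) of computing a norm-element
# frequency table from a crowdsourced coding of agreements to norm elements.
# Agreement names and element labels below are hypothetical placeholders.
from collections import defaultdict

# Each agreement maps to the set of norm elements volunteers found in it.
coding = {
    "Agreement A": {"general cooperation", "human rights"},
    "Agreement B": {"general cooperation", "supply chain"},
    "Agreement C": {"human rights", "restraint: non-state actors"},
}

def element_frequencies(coding):
    """Return, for each norm element, the share of agreements mentioning it."""
    counts = defaultdict(int)
    for elements in coding.values():
        for element in elements:
            counts[element] += 1
    total = len(coding)
    return {element: count / total for element, count in counts.items()}

for element, share in sorted(element_frequencies(coding).items(),
                             key=lambda kv: kv[1], reverse=True):
    print(f"{element}: {share:.0%} of agreements")
```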

And while general cooperation can be unsurprising, because international norms tend to be about cooperation, the human rights element is an interesting finding.  Another part to mention in this table is obviously the lower frequencies: what were the things that were less frequent?

And this had to do mostly with restraint on the use of cyber capabilities.  For example, botnets appeared in only 8%, and election infrastructure-related norms in only 11%, of the agreements.

Something that is interesting in this general category of restraint is how it refers to non-state actors: of those agreements that include restraint, only 33% relate to restraint of non-state actors.

So this is frequency over time.  It's a lot to visualize, but this graph is a little bit more friendly, and it shows how these frequencies evolve over time.  We separated the agreements into year groups: 2008, 2011, 2012, 2015, et cetera.

You can see here how human rights considerations have consistently been growing over time, as has cooperation.  Another part of this exercise was a little bit of text analysis, and what we did was to put into the mixer only the operative clauses of these 36 agreements; basically, we cut all of the introductory paragraphs, we cut all of the whereas and the background information, and we only left the recommendations per se.  We aggregated these operative clauses from all of the agreements, ran them through a sort of machine text analysis, and it produced interesting things.

So here is the simple word cloud of those operative clauses, and you can see the most frequent words are data, information, security, cybersecurity, and cyber.  Next are shall, state, access and computer.  These are all reflected by the size of the words here.

This is a computer-generated linkage of those most frequent words, and there is a report that we have showing how each of these words relates to the others, which is very interesting, and there is much more analysis that we could do, but we could not finish it in time.

These are a couple of tables that I find relevant, around the term "shall not", so the negative side, on the right: shall not carry out such actions, shall not mandate the party, shall not warrant transmitting, et cetera.  And on the more positive side there are sentences like: shall be used, shall be an integral part, shall cooperate, et cetera.
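As another illustrative aside: the word cloud and the "shall" / "shall not" tables come from simple text analysis of the aggregated operative clauses.  A minimal sketch of that kind of analysis is below; the sample text is a made-up placeholder, and this is not the tooling used for the report, just an indication of the approach.

```python
# Minimal sketch (not the report's actual tooling) of word-frequency counting
# and "shall" / "shall not" phrase extraction over aggregated operative clauses.
# The sample text below is a made-up placeholder.
import re
from collections import Counter

clauses = (
    "States shall cooperate on incident response. "
    "States shall not knowingly allow their territory to be used for "
    "malicious activity. Security updates shall be an integral part of "
    "the product lifecycle."
)

# Rough word-cloud input: frequency of each word across the clauses.
words = re.findall(r"[a-z]+", clauses.lower())
print(Counter(words).most_common(5))

# Short phrases beginning with "shall not" (restraints) vs "shall" (obligations).
negative = re.findall(r"shall not \w+(?: \w+){0,3}", clauses.lower())
positive = re.findall(r"shall (?!not)\w+(?: \w+){0,3}", clauses.lower())
print("shall not ...:", negative)
print("shall ...:", positive)
```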

So this is the quantitative analysis, but obviously these descriptive elements need to be considered together with the more qualitative analysis that Eneken kindly contributed to work stream one, and I will leave it to her to talk about her findings on each of these.

>> ENEKEN TIKK: Thanks, Pablo.  Just mindful of the time, and also the fact that I will be part of the panel discussion later, I just wanted to say two sentences about the qualitative analysis.  First, the quantitative numbers don't necessarily tell us about the quality of the norms, and one thing I think we need to keep in mind is that those norms that are more frequently cited or repeated actually tend to be the most general ones.

The idea is that without keeping that in mind, we don't necessarily understand that consensus about a topic does not necessarily lead to consensus or unity in how the norms are implemented.  But I will cut my remarks very short at this point.  Thank you, Pablo, for giving the overview, and I'm sure we will get to discuss the content down the road.

>> MAARTEN VAN HORENBEECK: We will go over to Mallory Knodel who is live at the event.  Over to you.

>> MALLORY KNODEL: Thanks, Maarten.  Hi, everybody!  Wish you were all here with us in Katowice, but nice to see you online.  So I'm going to talk about the second work stream, whose report is titled testing norms concepts against cybersecurity events.  As Maarten said, our main research question was how would specific norms have been effective at mitigating adverse cybersecurity events, and from there we planned research with a group of folks, similar to the way we approached our research last year.  We had some returning volunteer researchers in our group and I want to thank them.

 

It would have been difficult to do all of this work in the last year without their help, so I just want to acknowledge them.  We had an ambitious plan to look at some of the most significant cybersecurity events of the last couple of decades, not only to do a general amount of desk research to understand them better, but we also tasked ourselves with doing more qualitative analysis.

We sought to interview and understand better the impacts on the folks who were most affected by these cybersecurity events, and then those who mitigated them as well.  That, as you can imagine, is time intensive but incredibly rewarding, and it allows us to really lift up the voices of those most affected onto a platform like the IGF.  Hopefully we will have their ongoing involvement as these reports and findings make their way into treaty-level discussions, where we can start building a story bank of how cybersecurity really impacts people, and a people-centred approach to cybersecurity, so it's an important exercise.

Before I get into some of what we found, because we will also talk about some of it during the panel, I just wanted to talk about our methods and how we approached this problem.  We first had to determine, out of the vast choice of cybersecurity incidents that had a significant impact, which we were actually going to study.  As you can imagine, even the first step was a bit difficult.

We wanted to make sure that out of all of them we had some spread, that we were able to glean a large scope of results and findings.  So we made sure that the representative cybersecurity incidents we chose were spread across geography, of course, to make sure that we were looking at things that affected all parts of the world; that they either were successfully mitigated; and that some of them were actually attributed.

We wanted to make sure the outcome was compelling, that they had proper scale, that they really did affect lots of people and/or changed the conversation as well.  That's another version of scale of impact: that things changed as a result.

And that there was just proper knowledge, documentation and acknowledgment of it in the larger collective imagination.  Essentially, the press reported on it; there was some public record of what happened.  Let's see what else.  We also wanted different types, so, as you can imagine, a cybersecurity event, as Maarten said, doesn't mean there was a breach or something like that.  We wanted to take a broader view, so we included types such as malware, DDoS attacks, persistent threats, and technique disclosure.

So that is sometimes a version of a threat or a breach, along with vulnerabilities and supply chain issues.  So let's see, I want to make sure I have covered everything.  We ended up with nine.  From taking a sort of global spread across different kinds of incidents, we ended up with nine events that we wanted to learn more about, just based on what was available online and the various reports that have been published in the past.

So most of our time, I think, was spent in this phase, really looking through the literature, trying to answer the larger question, which turned into a few sub-questions.  The larger question was: would cyber norms have affected the outcomes?  We first had to answer which existing norms would apply, which norms did apply depending on when the incident happened, and which norms could have been helpful, to try to expand that further.

I could also list them out, and you can read the report as well, but one that we looked at, and we did this sort of in linear time, was the CIH virus from 1999, which targeted Windows machines specifically.  It certainly precipitated a lot of response from Microsoft at the time, which has been fantastic.  As you will note, Microsoft is still involved in this work today and driving a lot of it, which is excellent.

We looked at the Estonian DDoS attacks in 2007.  Just after the country had digitized almost every aspect of governance, the DDoS attacks took a lot of that offline.  A really interesting one, as folks on this stage today have already talked about NSO Group and targeted attacks, was one we looked at back in 2009: GhostNet.  GhostNet was targeted at activists, if you remember, uncovered by The Citizen Lab, and one of the first widely reported examples of these targeted attacks against civil society, journalists and so on.

That is one where we went beyond desk research and did more qualitative interviews and hopefully we have someone from Tibet Action who will be joining us in a little while on the panel to talk about how they were affected.

We looked at Stuxnet.  A lot of you will recognize that name, a case where infrastructure was targeted.  We also looked at, because of its impact on the conversation, maybe the most significant one on our list, and that was the Snowden disclosures.  We surmised that a lot of the norms packages that work stream 1 looked at were inspired by, or had a lot to say about, what Snowden leaked in 2013, and while we didn't see obvious evidence of that, for a variety of reasons it wouldn't necessarily be documented, we did feel that was an important one to include.

Anastasiya looked at Heartbleed, which happened in 2014; this affected websites around the world, a massive vulnerability in transport layer security.  Similar to the Estonian attacks is 2017, when the Aadhaar data breach in India occurred.  There are not a whole lot of similarities, other than it being government-level infrastructure for delivering government-level services through a unique ID.

And then lastly, no, sorry, not lastly: recently, in 2020, we covered the SolarWinds attack, because that was an interesting one from a supply chain perspective, and that also had a qualitative review, so thanks to Alice Wild and her team.  And NSO Group's Pegasus: that is an ongoing persistent threat, software being sold to nation-state and non-nation-state actors to target individuals, and it is still making headlines pretty much every day.

So it's good that we were able to also more deeply investigate that through a qualitative review.  So I don't think I have a whole lot of time left, but I just want to say that we gleaned a variety of findings.  I'm actually going to encourage you to read the report to get the findings.  I didn't want to present them because we want feedback on our process.

I think that for me there was so much that I got curious about throughout this research that I would like to have continued it.  I think we should actually continue it.  This is a draft report.  We could take comments, we could continue to do this research on additional events, we can do more qualitative work.

What I would like to see, for the purposes of continued advocacy and engagement in other norm-setting and standardizing processes, is actually to gather real stories, almost like a story bank, where we could keep these for future discussion and presentation when they matter, when we need to make the case that, you know, there are real people at the end of these attacks.

And so I'm hoping that that could be a direction we go in.  And, of course, the Best Practice Forum is an open multi‑stakeholder process at the IGF.  And anyone can join us.  If you find the work compelling, if you would like to give feedback on prior work that we have done or join us for future work, I welcome you to do that.  With that I will end and hand it back to you, because I think we will get into more interesting discussion about what we actually found in the panel and what it all means.  Thanks.

>> MAARTEN VAN HORENBEECK: Thank you for the introduction of the work you did.  It was great work from a large team, and we are thankful to everyone who contributed.  Maybe I will reiterate one thing that Mallory said at the beginning of the session, which is probably the most important takeaway from all of the work we are doing here.

Cybersecurity events and incidents are all about impact.  It is almost always completely impossible to prevent the incident or the event outright.  It is very, very hard to effectively prevent all possible impact, but as a community it is definitely plausible to reduce impact and make things lighter on the individuals, the organisations, the social structures that are burdened by those events.  And so the idea of actually introducing what the impact is on individuals, how they are affected and how norms can help them, is for many of the participants in this group a core objective of having better norms conversations.

So I'm grateful for the work that the team did with live interviews, and I really encourage everyone to read through the document.  Next up, we will go into a panel session, and the goal of the panel session is to really think about how the work that was done, which is available in the two documents that we shared here with you today, can become more effective, and what is actually the value, and what are the drawbacks, of relying on some of these cyber normative structures.

We have a great group of panelists.  In order to introduce them, I will hand it over to our three moderators, the three work stream leads who put so much effort into the work that we are presenting here: Mallory Knodel from the Centre for Democracy and Technology, Pablo Hinojosa with APNIC, and Sheetal Kumar, Global Partners Digital.

So I will hand it to the three of them to introduce panelists and get started with what I expect will be 30, 40 minutes of a great and productive conversation.

>> PABLO HINOJOSA:  This will be tricky when you hand the task to three separate people in three separate parts.  I think I would prefer Mallory to lead this.  I think the importance of this panel is to analyze the value of these reports with the experts, and to contribute as well to the future work of the BPF and the role of the IGF in continuing to discuss cyber norms.  I will leave my moderating task there.

>> MALLORY KNODEL: Sure, I can do my best, although you are all there and I'm just all by myself over here, but that's fine.  It's nice to see all of your faces.  I am going to give you the opportunity to introduce yourselves when you make your first comment, rather than doing that for you, because I have lost the list in front of me.  So please, each of you take a bit of time, remembering that we have about four questions to get through and there are six of you.

The first question is what trends do you see in cyber norms development across agreements and over time and where do you think these trends lead concretely?  If you are not specifically speaking from the perspective of somebody engaged in norms, also talk about that, and your perception of where norms are going and how they affect you.  Thanks.  Why don't you in Zoom, I imagine you have some listing and you can just go from top to bottom.  I am unfortunately unable to see that.

>> CHRIS PAINTER: I can't see the listing, but I'm happy to go first.  Will we go through each of the four questions?

>> MALLORY KNODEL: Introduce yourself and let's go ahead and address that question first and you can do it briefly if you have plans to address another question more at length.

>> CHRIS PAINTER: I'm Chris Painter, President of the Global Forum on Cyber Expertise Foundation.  The GFCE is a worldwide capacity-building group, a community of about 150 multi-stakeholder members, including many states, the private sector and civil society, and I think this ties in.  I will speak mostly from the perspective of the GFCE, but my personal experience of dealing with these areas goes back 30 years; my last Government job was as the top cyber diplomat in the U.S.  First of all, I want to commend the work of this group.

I think it's very valuable looking at how, you know, what the trends are, what we have seen, how these have actually been used.  I think that, as I think about the trends, certainly, we have seen a greater attention to these issues over the years as we have seen more and more destructive cyber-attacks, cyber intrusions.  And recently as everyone knows in the UN there was quite a bit of attention with both the group of governmental experts report and the open‑ended Working Group report which I think are very valuable.

So I like the rubber-meets-the-road sort of framing for this, and indeed what we try to do with the GFCE is not negotiate new norms but take the ones that have been agreed and put them into practice.  There is a lot of good language in the GGE report talking about how capacity building is an important part of implementing these norms.

If you are trying to implement these norms, I think the work you have done is helpful for doing that.  Wearing my non-GFCE hat, I would note one of the things that comes up when you talk about norms and their impact.  Norms have an impact in terms of just being norms that are out there.  There is also an issue of accountability, whether there is accountability for violations of norms, and if not, then I think one of the questions, and I think this is subsumed in the research, is what effect the norms will have if there is no accountability.  If there are no consequences, if there is no calling out of norms violations, do those norms continue to have any force?

And I think one that would bear even more study now is the norm from the UN Group of Governmental Experts on due diligence, on countries controlling malicious conduct coming from their borders, and, in light of ransomware, which is of course the hot topic now, how that plays in, how you have seen the norm used, and indeed I think we have seen it used over time.

I will have something to say on the other questions, but I commend this work, and I think an important pillar of it, and something I want to pursue, is how we can translate this work into some of the capacity-building work around the world to get more countries and more actors involved in this discussion, and also involved in actually implementing the norms, to give them more real value and meaning going forward.

>> MALLORY KNODEL: Thank you, Chris.

>> SHERIF HASHEM: It's a pleasure being on this lively panel.  I would like to highlight that I am a professor at a university, and I am a board member of FIRST.

>> PABLO HINOJOSA:  Very big echo of sound.

>> SHERIF HASHEM: I will try to use a headset, maybe that would make things a bit easier.  Okay.  Maybe that's better now?  Excellent!  So I have been a member of the UN GGE, in 2020 and 2015, and I have been part of the UN GGE as well as the Open-Ended Working Group processes over the past nine years.  I'm currently in the U.S., but in my home country, Egypt, I have led various national initiatives.  I have been internationally engaged with the ITU and international fora when it comes to cybersecurity, developing norms, and implementing them.

With regard to the work done within the Best Practice Forum, it is very important to look at norms and at the requirement of, you know, building trust and confidence ahead of a cyber incident.  Confidence-building measures are an integral part of the process, as you can see in the UN GGE reports, and they are complementary to what the norms are about.  Multi-stakeholderism is really important, involving key stakeholders: Government, non-government, industry, academia, civil society.

But if we talk about the challenges, here also I would like to highlight that it's very important to look at best practices when it comes to applying those norms.  Applying norms is not just a compliance checklist, we have done this or that and things are fine.  It is a process that involves everyone, all of the time, in different aspects of life.

To give you an example, you mentioned botnet activities and DDoS attacks.  In my home country, we had this challenge: at the beginning of any major incident, you get a report from, you know, a foreign country that there is botnet activity involving Egyptian infrastructure.

When it comes to the norms that address what countries should do, the first step is to take steps to mitigate that risk and hopefully eliminate it, which involves, you know, applying this within the framework of the country, meaning dealing with the infrastructure operators, the private sector.

So you go to the private sector and now you have to, you know, deal with the dynamics, security versus privacy.  We had very interesting feedback and challenging feedback, I would say, at the beginning.  Some of the private sector said this is against the privacy of our clients.

You are talking about botnets coming from certain IPs.  Those IPs are actually allocated to, it could be, banks, educational institutions or others, and now the ISP is hesitant to communicate with the customers for fear that this is, you know, going to be seen as if they are spying on the customers' traffic.

It requires a lot more than just saying that we have this, you know, multi-stakeholder approach and that is enough to deal with it.  So I really would like to highlight that it's important to study incidents and how they play out, but also the dynamics that need to be in place within a country, so that when you say a country is responsible for this, or should or shouldn't do this, we have best practices for how to apply it.

We had a challenge with reporting and exchanging information at the beginning.  I'm talking about a process that started over 12 years ago.  Initially there are certain law enforcement agencies, then there is diplomacy and treaties and agreements, and sometimes when you have an incident and you are talking about the technical community, the incident responders who are involved in dealing with botnet activities or other cyber incidents, you have to realize that there are other partners that you need to coordinate with.

So dealing with escalation and building confidence and trust within your circle of collaborators in the country is important.  It is also important to have this multiplied at the regional level.  I'm currently a member of the cybersecurity expert group at the African Union, and we share the same type of challenges, and it would be nice to recognize best practices now to deal with them.

I would like to end my intervention and maybe you will have other opportunities to have comments on the coming questions.  Thank you.

>> SHEETAL KUMAR: Coming in as one of the moderators, just to ask Art, and perhaps come back to you as well if you have something to say on this.  One of the findings in the analysis, and I think there was a session about this topic as well at the IGF earlier in the week, having assessed the Heartbleed and GhostNet incidents, was about the importance of promoting the neutrality of the technical community, incident responders and vulnerability analysts, and the importance of that in ensuring timely responses.

Just wondering what you think about that finding and also just perhaps more generally about this methodology of research that was used of engaging front line defenders and incident responders in better understanding incidents and relevant norms.  Just wondering if you have thoughts about that, either Art or Sherif.

>> ART MANION: Is the audio okay?

>> SHEETAL KUMAR: Yes.

>> ART MANION: I'm sorry, I didn't have the speaking powers a moment ago.  So thanks very much for the opportunity.  Very briefly, I'm Art Manion; my day job is at the CERT Coordination Centre.  We are effectively a CSIRT or a CERT team.  I work very much with cybersecurity vulnerabilities, so these are security bugs or defects in products or services, and the reporting of vulnerabilities and the disclosure of them are the things I work with the most.  We are also a member of FIRST, the Forum of Incident Response and Security Teams, again with the cybersecurity vulnerability angle.

In terms of the norms, this, I think, falls pretty heavily into the reliability-of-products area, so reporting vulnerabilities, supply chain, and some of the response norms.  Anyway, that's where my expertise is, so my comments are based on those areas.

Certainly those are not the only areas you are covering here.  In terms of the methodology, as you asked, I personally am a big fan or a proponent of the testing: taking the cybersecurity events and cases and testing them against the norms.  I think that's a very practical approach.  You have got a list of events to look at and a list of the norms; do they align?  Do they not align?

Since those events are things that have actually happened, again, it is a very practical approach.  I do like that approach.  One comment would be, if this work continues: the methodology is reasonably sound, but keep an eye on what is changing, what's new, are there new norms?  There will always be new events.

Some will be the same old event ten years later with new players and new things, but I think there are actually changes over time and new events, so keep up a bit with whatever is happening.  The norms, the vulnerability response and supply chain norms, I think are on an upward trend.  They are more common.  When I speak to people these days, they already know what vulnerability disclosure and reporting are, which is great.

So I think there is growth and normalization of the norms, which is a very good thing.  I'm not sure there is enough yet.  We have stories of organisations being attacked and exploited and compromised with known vulnerabilities, so vulnerabilities for which there is a patch or update from the vendor and a public disclosure has already happened, and yet the vulnerability management or patching race does not always seem to be won by defenders.

I'm honestly not sure what to do there.  I have spent 20 years advising people to patch Internet-facing bad vulnerabilities as quickly as possible.  It still seems to be a hard problem to solve at scale.  This was already touched on, but I do want to call attention to, perhaps it's not a written-down norm, but in the land of cybersecurity vulnerabilities there is very much a market for the offensive use of vulnerabilities.  Even some of the events listed here, the NSO Pegasus software and SolarWinds, all involved at different points in time zero-day vulnerabilities where the offensive actor knew about the vulnerability.

The vendor or provider did not.  The users and the rest of the world did not.  And despite our efforts to close these vulnerabilities, get them patched and fixed and get them mitigated, again, there are active markets for finding vulnerabilities, selling them privately, and then exploiting them.

So I try to keep in mind that that is happening and that is one of the factors our norms are trying to work against.  I will stop there.  Thank you.

>> SHEETAL KUMAR: Thanks so much, Art.  Really useful and helpful points.  And I would also love to hear from others if you have any reactions to what Art was saying, including on the whole issue of the market for vulnerabilities and the wider ecosystem challenge.  So if we could go to Lobsang Gyatso now, I hope you are on the line with us, because in the research and the findings it's very clear that the direct impact of the incidents on people and on humanitarian groups can be very stark, and I understand you were involved in some of those findings, and indirectly impacted, of course, through your work at Tibet Action Institute.

I was wondering if you would like to reflect on some of these findings, perhaps also on the methodology, and perhaps on Mallory's idea as well for a story bank for how this impacts others.  And we absolutely will not forget Eneken and Pablo; we can go directly to your reflections as well if Lobsang Gyatso is not here at the moment.

>> LOBSANG GYATSO: Either way is fine.

>> SHEETAL KUMAR: Shall we go with you for a couple of minutes and then we will go to Eneken Tikk.  Thanks so much.

>> LOBSANG GYATSO: Hi, everyone.  I'm Lobsang Gyatso and I work on technology.  In some ways, in terms of the norms, the organisation that I work for actually works on reducing the impact, as well as looking at capacity building within the community in terms of reducing that impact.

So when the whole question or discussion was turning to cyber norms, one thing I wanted to say was that during the time of the 2008 or 2009 GhostNet report, one of the major issues was attribution.  In some ways, whether or not it's a norm that's already written down, there are some other norms that have been set around attribution.

So if you look at the U.S., SolarWinds, or the disinformation campaigns, which are not directly related to cybersecurity incidents, there are some norms being set.  I'm not sure whether those norms are recognized at the UN level, but they are being set.  So I think there is a question about how some of these norms are being set and how some of the norms are being used by different states.

At the same time, I wanted to echo what Christopher said about accountability.  You can attribute as much as you want, but if there is no acknowledgment or accountability, then it kind of comes back to a he-said-she-said situation.  And that, I think, is an issue in terms of the development of norms.

So I think that's something I wanted to say in terms of the norm development that I see.  And for us, if you look at the 2009 attacks, one thing we always really wanted to know was how we could attribute it to a state, because in some ways we know a lot of the information that was used during the attack, a lot of the information that was stolen during the attacks, was used by the Chinese Government at that point.  However, I think the question was how do you attribute at a level that is accepted?

So I think that was a question we didn't really have an answer to.  Maybe that's something that's being developed right now.  Another thing, and I can speak more on the other questions, but one thing I wanted to raise: during the interview, one of the things that made me question the norms a bit was the question of infrastructure.

What is critical infrastructure?  That's a question that I tend to think about a bit more, because I think most of us understand what physical infrastructure is for a Government, and if you look at a CERT, they understand what critical infrastructure is, but if you look at the civil society space, critical infrastructure for us may be our website.  So if you look at our organisation, tibetaction.net, that's critical infrastructure for us because it allows us to do the work that we do.

If that is taken down, that actually harms the work that we do, and it has some real impact on the ground.  So I think the question is how do you define critical infrastructure, and if you want cyber norms to be multi-stakeholder norms, I think critical infrastructure does need to be redefined.  Thanks.

>> SHEETAL KUMAR: Thanks so much.  It's great to hear both from you and from Art, and I think, just more generally, it sounds like while we are seeing a lot of challenges and increasing incidents recently, we are also seeing progress, and progress in understanding what we can do.  One of the, I think, key takeaways or findings from the research overall was that norms are helpful, and that the norms we have today would have been helpful in mitigating some of the incidents from the past.

So there is a lot to do still, and you point there to where it would be helpful to think through how critical infrastructure is defined, including from a humanitarian, human-centric point of view.  So that's key work to do, but it does sound like progress is being made, which is great to hear.

So just turning over to Eneken now for your reflections on the findings of the report, based on your very long-standing expertise on the issues around norms.  Over to you.

>> ENEKEN TIKK: Thanks.  In my mind, I will wrap my answer around Mallory's first question, which is the trends and maybe where we are headed.  I am an academic.  I am part of a small institute, the Cyber Policy Institute, that looks into the interstate conversations about, well, both the development and now the implementation of norms.

And the reason I say interstate is that the conversation has been increasingly between Governments, because I think in this panel we have already identified some of the parallel trends that are yet to merge with the UN and intergovernmental conversations about how best to implement the norms.

So, some of the trends I think we should be mindful of.  I think it's too early to celebrate cyber norms, and one of the reasons is that I guess we are operating a little bit in a bubble, meaning that, within our cyber bubble, we have to be mindful that, first of all, we feel the need to enforce something that we label voluntary and non-binding.  And mainly or largely that enforcement comes from the like-minded Governments.

And I think in this forum we need to think about how to broaden that community, so that whatever form of enforcement or guidance goes into implementing the norms would be much more diverse.

And the other thing that I think is notable about cyber norms is that most of them are not restraint norms.  When you look at the UNGGE recommendations and then what has been suggested by industry and by other stakeholders, in these other stakeholder propositions there is much more exploration towards restraint than what the GGE has been able to agree, while acknowledging that the GGE, or for that matter the Open-Ended Working Group, only has so much time and so much effort to put into this.  We have to be aware of norms being abstract, and of what that allows in terms of different implementations and the divergence that exists between stakeholders in how to implement them.

Supply chain is a good example, where Governments and industry, which both prioritize the issue, don't necessarily come to the same picture or the same idea of how exactly, or who exactly, has the key responsibility for implementation.  So, just touching upon one of the questions that came up in the chat about informal peer pressure, I think these trends should make us pay attention to where we find this peer pressure that comes from other communities, first responders or civil society or groups specializing in cybersecurity, and Governments, because where that synergy emerges we really see that those particular cyber norms are to be celebrated, whereas with all of the others we need to think about how to take them further.

And I hope I will have another moment to take this further, because I started my remarks with the cyber bubble.  I think one option for the BPF to take this research further is to compare these cyber norms and our findings about them with what came pre-cyber: everything we did for the Information Society, and why not those binding rules that we have stayed away from, and to see how much synergy we can find if we broaden our scope of cyber norm implementation.  Thank you.

>> SHEETAL KUMAR: Thanks, Eneken, great points, a lot to think about there.  Thanks for the pointer as well to what we could be looking at or focusing on in the future as well.

So I would like to come now to Siena Anstis, who I hope is on the line.  It's great to see so many of us, so it's easy to get lost, but I think you are here with us, which is wonderful.

Those of you who have seen the report will know that one of the incidents looked at is the Pegasus revelations, and that is where, perhaps, there is quite a lack of enforcement of norms, particularly those relating to human rights.  There are some recommendations in there drawing on that, especially those that have been made elsewhere by human rights experts, on how to address that.

So I would like to turn it over to you, Siena, in case you want to react to those findings and recommendations in the report, and also if you have any other reflections, including on the research methodology or the proposals that others have made so far.  You can't seem to unmute yourself.  Okay.

>> SIENA ANSTIS: Hi, everybody, sorry about that.  Thanks for the invitation to be here, and it's great to hear about the work that's been done and presented.  It's really interesting.  I will say a few brief words and maybe there will be time for discussion after.

In short, I'm a senior legal advisor at the Citizen Lab.  We are an interdisciplinary research lab at the Munk School of Global Affairs at the University of Toronto in Canada, and I work on targeted surveillance of human rights defenders, so my comments will be informed by that work.  As you may know, the Lab has been working on uncovering cyber surveillance campaigns, and since 2016 in particular we have been looking at and tracking the deployment of NSO Group's Pegasus spyware.

And, of course, NSO Group is one among many other companies we study in the cyber surveillance industry, which seems to be growing year by year.  So I think one particular thing that arises from this research is a sort of absence of norms; that's what I wanted to underscore in my remarks, particularly norms that are translated into an actionable legal framework that is sufficient to constrain the activities of the surveillance industry and its impacts on human rights defenders.

I think what is important here is not just norms as a framework, but norms that translate into something that can be relied on by human rights defenders to seek accountability.  I could talk a bit more about the substance of the norms and corresponding legal principles it would be nice to see, but a number of other speakers mentioned pertinent issues, so I will leave it at that.

One thing I did want to underline is I think we might be at a particularly critical juncture for the development of norms and rules around the regulation of the cyber surveillance industry and the use of these technologies and that this momentum is something that needs to be taken advantage of.

I think, as many people have seen, there has been a lot of movement or momentum in the past weeks.  We saw the U.S. take regulatory action against NSO Group, the surveillance company in Israel, an effort the Biden administration advanced.  We also have new regulation in the EU specifically starting to integrate a human rights perspective into how we manage the export of technologies, which is a positive and promising step that hopefully will be followed more broadly.

Further, private companies, which have significantly more resources at their disposal than human rights defenders, Facebook, I guess now Meta, and Apple, have sued NSO Group.  And once again, NSO Group is just one company in the growing global marketplace for cyber surveillance technologies.  I think the litigation is helpful in the sense that it will identify what legal deficits exist for holding companies to account, and that is something we should be paying attention to in how we understand what norms are missing and where norms need to go.

I will leave it at that, and then if there is more time, I can get more concrete.  Thank you.

>> SHEETAL KUMAR: That's great.  Thanks so much.  That was really helpful, and I was hoping we could perhaps turn to, well, anyone who wants to respond to any of those points, but Mallory, I know you are alone there in the room, perhaps you are not alone now, hopefully some have joined you.  Do you want to react to any of those points and any of those recommendations about what we might want to look at in the future, and leveraging momentum, for example, around this critical juncture on the regulation of spyware, or the issue of whether we are still operating in a bubble?  Do we need to broaden out the discussion?

Or the whole concern around the market in zero days and vulnerabilities, which is still very challenging to address because of that?  I don't know if you want to respond to any of those points or ask further questions of any of the analysts.

>> MALLORY KNODEL: Thanks for the opportunity.  I have been taking notes on some of the high points, and I can go over them now, and then we can move into the last couple of questions we have for the panel.  One thing that I picked up on that a couple of folks were saying is around attribution.  Speaking from the perspective of civil society, in my career working with and witnessing a lot of the ways journalists, human rights defenders and activists have to think about this, often the attribution has felt like the accountability.  If you can just somehow lift the veil on, or uncover, the fact that someone is targeting you as a human rights defender, that has felt in the past like the first step to accountability, because so many groups don't even always achieve that.  But also, taking Lobsang's point based on experience, there actually should be more steps than that.

If attribution is achieved, then there has got to be somebody or some mechanism to hold that group accountable, so I think that's an important connection to make.  Another point is around what Art mentioned, that maybe scale and impact are not always the same.  I'm reflecting also on the way that we approached the work stream research to look at events.  We were thinking about scale as a way of expressing impact, but if you take into consideration the most recent part of this discussion on zero days, on targeted attacks and so on, massive scale is really not going to capture events that are potentially very impactful but maybe didn't impact a large number of people.  It has different qualities.

So we need to take those into consideration.  I think we did do that, but the question is how to then, in an ongoing way, talk about scale and real impact.  It seems like if we look at the zero days being used for the persistent threat software, or if we look at ransomware, there are a few examples where I think there are industries cropping up around this, and that maybe is the way to express it.  So rather than focusing on individual events, maybe somewhere else we focus on the vector of attack, the method, the industry that's cropping up around some of these.

That may be a separate approach that may lead to different sub questions and so on.  I'm already thinking of the research plan.

And then as well, I'm trying to remember how this was mentioned, I have it in my notes, but the idea of baseline assurance, security as infrastructure: just knowing that there are boundaries beyond which states and other actors will not go.  I'm thinking this is generally the purpose of norms, but we can maybe think of ways to take harder rather than softer approaches to creating those boundaries, whether that's a commitment not to weaken encryption, a commitment not to hold onto zero-days, or other things like that, which may again amount to a commitment to creating this sort of baseline security as infrastructure for all of us.

Those are my reflections.  I'm sure others have their own; please feel free to bring them up when you get another turn to speak.  The question we have for you next, and you can all answer it in turn, is how can the folks, the groups, the Governments developing global cyber norms include those most affected by cybersecurity incidents?  So if you are part of a group that has been a victim or target of an attack, or if you have been a first responder mitigating attacks, how could we best include those groups?  How could we best include you?

I will start with Chris because I know you have thoughts on building the capacity necessary for those groups to engage meaningfully in norm-setting processes.

>> CHRIS PAINTER: Thank you very much.  That's one of the challenges: it's not just that a lot of countries don't have the capacity or background to engage in these discussions, which has improved to some extent over the last couple of years but is still an issue and is why capacity building is important; it's also about bringing the other multi-stakeholder members into this club.

When Governments are talking about norms, even if those are norms of restraint for Governments or norms of cooperation between Governments, the other stakeholders have a role in that.  They have knowledge of how the ecosystem works that Governments don't.  They understand second-order effects that maybe Governments don't.

An example I give is that a few years ago I was giving a keynote at the annual FIRST meeting, and I was talking about the UN-developed norms, including the one about not attacking CERTs or CSIRTs, and the audience had never heard of that norm before.

So there was not a connection between these important efforts in the UN and the community they were meant to protect.  One thing we could collectively do is draw the communities that are either being protected or have a stake in this issue closer together with the people who are debating these issues, especially the Government people.

One other comment I would like to make, because I know Sheetal Kumar just asked where we would like to see the work go: I think it has been very valuable so far, but also reaching out to other communities and making sure that others get the benefit of the research would be helpful.

And although some of the norms that have been developed are non-binding or voluntary, they are still political commitments.  You can still hold folks accountable for violating them, and another interesting thing to see from the future of this work would be to measure, when there have been accountability efforts, whether those have been effective, and when they have not, what that has meant for the development of norms.  So go from what you have done now to also looking at how the norms have been implemented, whether accountability is there or not, and how we might be able to improve that.

>> SHEETAL KUMAR: Great.  Thanks so much, Chris.  Lobsang Gyatso, you have indicated you would like to say a few words on this.

>> LOBSANG GYATSO: Thanks.  One thing we have done as a community is start our own kind of CERT.  Technically it's not a CERT; it's more like a civil society CERT, and we are also part of the global community of civil society CERTs, which is called CiviCERT.  On the capacity building aspect of how you develop cyber norms: if you look at the GGE reports, they do mention capacity building, but the focus is at the Government level.

So the question is how you integrate those other groups and initiatives led by civil society into the more traditional governmental roles and structures.  I think there is space to do that.  Maybe it's a collaboration between these two aspects: being able to build off the traditional CERT structures, to have the civil society versions of them, and being able to collaborate on that while at the same time collaborating on certain aspects with industry or with Government as well.

>> SHEETAL KUMAR: Thanks so much, Lobsang.  I think that's a great example of how you can practically broaden engagement and collaboration with other communities by setting up an institution or mechanism like that and thereby expanding the range of actors you engage and collaborate with.  Thanks for sharing that.

So, the Dhaka hub would like to come in, if you are able to.  Thanks for highlighting that to us.

Okay, it might be an issue with audio; I will come back to you.  There are ten minutes left, so whether it is Sherif or Siena next, I will go to Sherif Hashem; you are next on my screen, so over to you.

>> SHERIF HASHEM: Thank you.  I would like to highlight a couple of points that were raised: the cyber bubble, and what Chris mentioned about the importance of trying to bridge different silos.  People working in Government, industry, civil society, the general public, and infrastructure don't usually meet or work together unless there is a special, targeted effort to bring them together when it comes to applying cyber norms.

Cyber norms were developed by Governments as part of the UN mandate, so they always talk about state responsibilities: states should do this or that.  But when it comes to reality, it has to be a partnership, and there are some norms that highlight the importance of a multi-stakeholder approach when it comes to implementing norms; that is the part about the roles and responsibilities of various stakeholders.

I have held different positions, working for Government, in academia, and with industry, and it's really important to bring them all together.  As I said earlier, in a cyber incident you need to have this trust relationship built ahead of time so you can roll out the activities that need to take place to face that incident.

It's important for the group to recognize best practices and how to make this work in reality.  This is a very important link to attribution.  When a cyber incident is ongoing, you don't know who is behind it.  You don't know what type of actor it is: a state actor or a non-state actor?  Is this ransomware by a group, or part of an APT?

Again, is it the state?  You only find out when you have partners working together, and there are several incidents that we can explore further in research.  The Colonial Pipeline incident in the eastern part of the U.S. affected the energy sector for several days, and that was a group in Russia.  Again, collaboration across different states, and within the same state, needs to be there ahead of time.

So focusing on examples like this is really important.  I would also like to highlight the new UN open-ended Working Group that has just started: it's very important to mobilize resources and build capacity within different stakeholders, not just states, not just Governments, but across different disciplines, so that we know how to engage and where our roles and responsibilities are.  That would have an effective impact on the outputs.

>> SHEETAL KUMAR: Thanks so much, really great points there.

Our remote hub in Dhaka, I will go to you now.

>> DHAKA REMOTE HUB:  Hello.  Thank you so much.  My question is this: the younger generations are among the most vulnerable in cyberspace.  How can Internet Governance work to protect them on social media and websites?  How can Internet Governance work with the companies that provide these services?  Thank you so much.

>> SHEETAL KUMAR: Thanks so much for the question.  I would invite anyone who wants to respond to do so in the chat, and we can highlight the responses in the report.  Unfortunately, we are running out of time.

Maarten, I will take my cue from you as to next steps.

>> MAARTEN VAN HORENBEECK: If you could wrap us up with the conclusions we have been discussing and taking notes on during the session, I think that would be a great way to close the discussion.

>> SHEETAL KUMAR: Great.  I will try and do that in a minute.  So, what we heard is that the research is very commendable and useful, and that the methodology of engaging with those affected is practical.  It's very important to study incidents and how they play out, especially at country level, dealing with escalation and trust between collaborators in country.  These are all real issues on the ground, and perhaps gathering best practices in that regard would be helpful.  I think we also heard that expanding beyond the usual suspects, diversifying, and operating outside the bubble will be helpful, and Eneken shared a resource developed to try to do that, showcasing examples at the national level.

We heard that there is progress happening, perhaps on socializing the importance of disclosing vulnerabilities and on attribution, where we have seen more movement, but there is still a long way to go, and there are continuing concerns around certain markets, including in zero-days.

And so the recommendation is that we keep an eye on what's changing, what's new and what's happening, and, where attribution has been achieved, on what has worked, because we do need a way to hold actors to account when these incidents happen.  What has worked and how could it be improved?  We could analyze that going forward.  We also heard that there is still an absence of norms in some spaces, particularly around spyware, but there are opportunities, and we could be at a critical juncture to leverage those opportunities to address the norms and translate them into more effective legal frameworks.

So the four key takeaways we will be highlighting are that the cyber norms we have today could have been helpful in mitigating some of the events of the past; that it's really important to bring in those affected; that we need greater stakeholder participation; and that this includes broadening out those we have engaged with in the creation and enforcement of norms.

And I think that's where I'm going to leave it for now, because we need to wrap up.  Back to you.

>> MAARTEN VAN HORENBEECK: I will leave everyone with a few links to where you can find more information about the discussion today.  I also want to say a big thank you to all of the different individuals that you saw on stage and via Zoom today, and especially the many people in the background who helped with the research this year.  They will all be credited and listed in the report, so please do read it and have a look at the wide group of experts we have here.  We are blessed to have all of you participate, and I do invite you to look at these documents at the first link on your screen.

They are really worth reading, and we very much invite your input.  With that, I will hand it over to Hariniombonana Andriampionona, one of our BPF on cybersecurity conveners, who has done great work bringing us together, to share closing words.  Thank you, everyone.

>> HARINIOMBONANA ANDRIAMPIONONA: Hello, Maarten.  This Best Practices Forum has been organized since 2016 and has brought together a group of experts and contributors to investigate the topic of cybersecurity.  When the IGF launched the call for participation at the beginning of the year, experts, researchers, and students from all over the world replied to the call.  As a closing remark, I would like to give special thanks to those who have contributed to this BPF on cybersecurity.

We know that you volunteered for this work on top of what you are usually doing, and we are very grateful for that.  To close this session, we expect that the outcomes of the session will be helpful for countries, researchers, and state and non-state actors.  The documents and research papers are on the IGF website and have been shared by Maarten on the Zoom.  Thanks to the audience.  The session is now closed.