IGF 2021 – Day 3 – OF #13 Christchurch Call & GIFCT: multi-stakeholder efforts for CVE

The following are the outputs of the captioning taken during an IGF virtual intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> We all live in a digital world.  We all need it to be open and safe.  We all want to trust. 

>> And to be trusted. 

>> We all despise control. 

>> And desire freedom. 

>> We are all united. 

>> DAVID GREENE: Welcome, everybody, to this session.  This is a session regarding multistakeholder initiatives such as the Christchurch Call and the GIFCT, and we're happy to be here.  I believe this is the first time either of these groups has made an appearance at IGF, and we really welcome being involved in this as well.  So just very briefly, my name's David Greene, I'm the Civil Liberties Director at the Electronic Frontier Foundation, and I am a member of the Christchurch Call Advisory Network as well as some of the GIFCT working groups.  And we'll talk more about what these bodies are as we go along, how they're working, and, most importantly, what we've learned as we've been trying to get these efforts rolling and effective.  So basically both of these bodies are opportunities to give stakeholders a chance to ask questions of each other and offer their own views, really trying to get some common understanding and identify some shared values and shared goals.  And also, I think, really track how these efforts evolve as they go along. 

We have speakers today representing the NGO sector, Civil Society.  Representing the government sector, the Governments of Ghana and New Zealand are with us today, as well as the tech companies; Facebook is represented on this panel. 

I do want to note that Dina Hussein from Facebook will be our online moderator and will help manage the Q&A from the online questions through the chat, and then I will be the in‑person moderator.  If anyone in the room would like to ask a question, I'd just invite you to come to the table and do so, and I will include you in the discussion.  And then Jordan Carter from Internet NZ will be our rapporteur and will do a summary at the end of the session that goes over our conclusions. 

I think our main goal is to keep everyone on time and make sure that we stay on schedule.  So with that in mind, I am going to keep to the schedule and start the presentations.  I'm going to ask each presenter to introduce themselves briefly at the beginning, mostly to save me the burden of having to do that.  So we're going to start with Paul Ash. 

>> PAUL ASH: Thanks, David.  (Speaking non‑English language).  Hello, everyone.  Thanks for being here today for this discussion on the Christchurch Call and the Global Internet Forum to Counter Terrorism.  My name's Paul Ash.  I'm the Prime Minister's Special Representative on Cyber and Digital and the Christchurch Call Coordinator back here in New Zealand.  I've worked on the Christchurch Call since its inception and have been really pleased both to work with the people on this session, to get to know them well and become friends with them, and to build the sort of confidence needed to make multistakeholderism work.  Both of the initiatives we're covering today are really interesting examples of collective action to solve a shared problem on the Internet, and both have interesting stories to tell when it comes to our shared vision for this IGF of an Internet United, where shared needs and actions shape the future of the Internet as a force for the collective good.  Fellow panelists Erin, Anjum, and Dina have all had significant experience with both the Call and the Global Internet Forum to Counter Terrorism from a range of perspectives.  Special thanks from me as we kick off go to Courtney Radsch, who helped pull together this session and unfortunately couldn't be here because of MAG obligations, and to David Greene for stepping in at the last minute when Courtney's duties required her to be elsewhere. 

I'll focus my 5 minutes on setting the scene on the Christchurch Call, if I may.  Many of you will be familiar with what happened in Christchurch on the 15th of March, 2019.  It will forever stay with me, the horror and the shock of what had happened, the massacre of innocent people, children and adults alike, while at prayer.  Sadly, this wasn't the first or the last incident of racially motivated violence, but it was unprecedented in how the Internet was weaponized and abused by the terrorist.  This time around the violence was livestreamed, and in terms of the online impact it would have been seen as successful by the perpetrator, because it was specifically designed to be broadcast on the Internet. 

The sheer scale of its reach was staggering.  That original footage was viewed 4,000 times before Facebook was able to remove it.  Within the first 24 hours it really spread and proliferated.  1.5 million copies of the video were taken down from Facebook over that subsequent weekend.  There was one upload per second to YouTube in the first 24 hours alone as it started to move from one provider and platform to another.  And we know that the proliferation of the video was aided actively by a few hundred sympathizers of the Christchurch terrorist.  They worked very, very hard to exploit and to game the safety, moderation, and recommendation systems of the online platforms for maximum impact.  It's one of the reasons that there were over 800 variants of the video made in those first 48 hours, as those looking to perpetrate this abuse reshaped the video in order to get through hashing systems and other moderation systems. 

It's hard to quantify the impact of something like this.  We know in New Zealand, mental health help lines received thousands of calls in the days afterwards from people who had seen the video.  I've seen colleagues traumatized by it, and we've had to put particular health and safety efforts in place to support them.  But I've also spoken with families and friends of families for whom this livestream is actually the last memory of their child or their parent or their closest friend.  And the trauma inflicted upon the individuals in the video, their relatives and friends, the city of Christchurch, the country of New Zealand, the whole of the Internet, and all of those people affected by this was like ripples in a pond, waves of harm spreading out across the world.  And while we can't ever quantify that, we know it had a huge impact. 

And we also know that the video and the manifesto that formed part of that calculated campaign have unfortunately been a motivation for others to commit similar attacks since.  The immediate issue for us here in New Zealand was how we responded to something of this sort, and what sort of tools we reached for, from ignoring the problem all the way through to quite draconian responses.  For us, multistakeholderism and an approach that tried to work with others on solving the problem that we saw, sticking to the presenting problem, was where we landed, and it was probably always going to be where we landed.  It led to the Christchurch Call, which emerged in the days after March 15, when it was quite clear that the systems in place to address this kind of attack weren't up to the task anymore and that serious change was needed. 

Importantly, the Christchurch Call represented a strong commitment to address this problem in a way that serves the good things about the Internet: a free, open, secure Internet that promotes human rights and acts as a force for good in the world, not one that should be shut down in order to address content issues.  The approach taken was not a legally binding treaty, nor an institution.  It's an initiative based on voluntary commitments by governments and online service providers, 24 commitments, in fact, including a number which are collective, that is, commitments for governments and online service providers to carry out together, working with Civil Society. 

The supporters who agreed to fulfill those commitments also agreed to do so consistent with a number of really important underlying principles, namely that they'll carry them out consistent with a free, open, and secure Internet, without compromising human rights and fundamental freedoms, including freedom of expression, and that they'll work closely with Civil Society to ensure these principles are upheld.  The Call is based on voluntary participation, and it's growing quite rapidly.  It now has ten companies, with more in the pipeline.  It has 55 governments participating, plus the European Union through the European Commission, two international institutions, and around 40 Civil Society organizations working together on what is now an expansive work program. 

And that work program was refreshed and redefined by leaders from all three sectors working together, Civil Society, governments, and the tech sector, at our second anniversary summit in May, to include four high‑level priorities.  The first is building a self‑sustaining community that's truly multistakeholder, which incorporates a more diverse set of perspectives and expertise, and which allows our work to positively impact a bigger and more diverse group of users.  The second is getting to grips with recommender systems, understanding how they may impact self‑radicalization and how they might drive users towards terrorist content. 

Also part of that second priority was looking at the tools that are available to detect and remove terrorist content and making continuous improvements, including on human rights protections and redress mechanisms. 

The third was improving our crisis response, including the basis on which we take collective action when there's a terrorist incident with online impact.  And the fourth was improving transparency from governments as well as companies.  Transparency has really important aspects for governments as well as the tech sector to deliver on. 

And a couple of quick thoughts, I guess, as we go into the discussion and the Q&A.  As I said earlier, the Call is not a treaty or an institution.  It isn't a multilateral organization either.  It's deliberately intended to be multistakeholder, and it's deliberately designed such that it needs to keep delivering impact and value for the participants to really be able to see the benefits of ongoing participation.  It's able to draw on the particular characteristics that states, industry, Civil Society, the Technical Community, and academic researchers each bring to address a pressing problem. 

It's based on some underlying principles.  So while it's a big tent, it isn't fully inclusive, and it may never be.  We are working out ways to ensure that those who come on board are fully signed up to those underlying principles and that those of us in the tent are all able to encourage each other effectively to hold to them.  There's a great deal of discourse around regulation in this space, and the Call is not something that prescribes regulatory actions; it focuses on outcomes.  It will be an important test for the Call to make a positive impact in creating more effective outcomes, including regulation where that's necessary, but outcomes that commit us to upholding human rights and human rights law on a free, open, and secure Internet. 

And finally, a fourth observation: the Call has forged some really important change already.  It's really important, though, that as a model of multistakeholder action it's able to keep achieving meaningful results and demonstrating those, as a source of both legitimacy and a reason for the whole community to be involved.  That is, we need a bias for action, and we need to keep delivering on that.  I'll close there, but hopefully those are some opening scene setters on the Call and on some of the key elements underpinning the work we do.  And I look forward to the questions and the discussion. 

>> DAVID GREENE: Thank you, Paul.  Our next speaker will be Dr. Erin Saltman from the GIFCT.  

>> ERIN SALTMAN: Thank you so much, and thanks for this convening.  I know many of you on the call, so I'll try not to be too repetitive about what GIFCT is and does, but I think there are some new people on this call as well, and I want to be iterative and build off of some of what Paul raised.  To start, I'm the Director of Programming of the Global Internet Forum to Counter Terrorism, so if you hear GIFCT, that is the acronym for the Global Internet Forum to Counter Terrorism.  I think it's important to see that we're a little bit of a funky and unique type of NGO.  We're a bit unique in that we were founded in 2017 originally as an initiative by tech companies for tech companies.  A lot of NGOs emerge out of potential government funding or private funding, but we were brought together as an idea between tech companies, recognizing that if you look at your phone today, nobody's using just one app.  Nobody's using just one platform.  And so when you were seeing terrorist and violent extremist trends, it didn't matter how much one company did; the signal of bad actors was jumping between platforms.  This already raises a ton of questions about what you can share, what you shouldn't share, privacy, but also how law enforcement can track signal, and we were seeing an increasing number and diversity of platforms being used. 

So GIFCT was founded originally to answer three main questions.  The first question was where you can share technological solutions across platforms in a way that does not abuse privacy or raise other human rights concerns.  How do you share signal without sharing user data in a way that would be completely illegal under ePrivacy rules and the GDPR? 

The second question was how you get better action‑oriented feedback from researchers and experts around the world who really have their finger on the pulse of adversarial shifts.  And that's hard, because with academic work, as a former academic myself, you write about the problem, and if you give that over to the body or the tech company, they say, okay, what do I do about it?  And you say, well, I just told you where the problem is.  And sometimes they might not read your beautiful 100‑page report that you worked so hard on.  So how do we get action‑oriented, smaller, quippier, faster feedback from experts around the world? 

And the third question was really about multistakeholderism, and not just because it's a nice word: how do you get better feedback and share knowledge between the sectors in an appropriate way?  Law enforcement and government have a very particular perspective and knowledge about some of the offline trends and threats.  Civil Society has its finger on the pulse of what this looks like, how hate and extremism manifest in their communities, sometimes more than government or tech companies, often more than both.  And tech companies really understand what data they have available to them and potentially have tools to upscale and optimize solution‑building efforts.  And, again, all of those have different concerns.  So multisector knowledge sharing is not just because it's nice to do.  It's actually because we're all seeing a different aspect of the problem, and only by bringing together those perspectives do you really understand what the threat landscape looks like, or really understand why solution X, developed by tech company Y, might be harmful.  So maybe we don't want every solution ever if it has unintended consequences. 

The fourth question GIFCT has taken on, which is foundational, really did come in the aftermath of the horrific Christchurch terrorist attacks.  I think it's really crucial to recognize that there are biases in a lot of list‑based approaches to understanding what terrorism looks like.  White supremacy and neo‑Nazism have been on the rise for quite a while, and those are attackers that we have seen, and we have case studies of those attackers carrying out horrific atrocities in different parts of the world.  And there are other forms of violent extremism that never make it onto certain lists.  So we always have to take a holistic approach: where is a list‑based approach appropriate?  How can we lean into what governments provide, but also build on top of that to understand violent extremist behavior? 

And so I guess I'll keep it quite quick, but there are some things I want to make sure to land.  One is that when we think of online harms, we usually still think of the same few platforms.  When you look at how our partners at Tech Against Terrorism flag URLs, or when you look at Europol flagging content, you're looking at hundreds of different platforms being abused.  It's not just a couple of platforms.  And so we need a big‑tent approach to bring lots of smaller, medium‑sized, newer, and different forms of technology to the table to understand what the threat landscape looks like.  And I would also say that the problem is transnational and cross‑platform, so our solutions have to be that as well. 

GIFCT can provide tools, it can provide that knowledge‑sharing space, and it can work with the Christchurch Call to Action.  It's tried to build multistakeholderism into its own governance.  That's never easy.  It's not easy to bring a bunch of people to the table from government, Civil Society, and human rights groups and say, what do you think we should do?  Those are not spaces where you want everyone to agree.  It's actually better to raise the diversity of opinion, where people maybe don't agree, and then we can make more educated decisions on where to put our next step forward: how to deploy tooling, how to expand our taxonomies and understandings, how to get better feedback from different parts of the world about how violent extremism looks in Singapore versus India versus France versus the U.S. versus anywhere else in the world. 

So we do need an iterative space that respects the differences in policies, both between different governments and between individual platforms, while pushing for baseline standards in things like transparency, better appeals processes, clearer terms of service, and human rights standards.  So this is not an easy space.  Nobody on this call got into this because they thought we could just solve terrorism and violent extremism tomorrow, especially in newer digital spaces.  But I think, at least between the Christchurch Call and GIFCT, we're trying to build infrastructure for those voices to come to the table to build solutions, recognizing that there won't be a single day we wake up and say, okay, solved it, this is no longer a problem.  We have to evolve with the times each and every day.  I think I'll pause there.  There's a lot we could key off of. 

>> DAVID GREENE: Thank you so much, Erin.  Our next speaker is Dr. Albert Antwi‑Boasiako from Ghana.  Is Albert on the call, or is someone speaking in his stead?  Well, why don't we move on, then, to ‑‑ and he can join us when he joins the call ‑‑ Anjum Raman from Inclusive Aotearoa Tahono. 

>> ANJUM RAMAN: (Speaking non‑English language).  I'm one of the Co‑chairs of the Christchurch Call Advisory Network, of which David is a member.  It is a network of Civil Society organizations who form one part of the Call community, along with governments and tech platforms.  In terms of a multistakeholder forum, the Civil Society voice is crucial.  We bring the perspective of the impact of government and corporate action, or failure to act, on communities.  We bring independent research, expertise, and lived experience.  So CCAN, the Christchurch Call Advisory Network, connects various Civil Society organizations, and we have the ability to discuss issues of concern and learn from each other. 

Our members have expertise on various different topics, and we've called on that expertise for various things, like submissions on domestic legislation in various parts of the world, raising issues publicly, and advocating within the Christchurch Call framework.  Many members of the network also engage in other forums like the OECD and the Freedom Online Coalition. 

The multistakeholder model gives us an opportunity to be heard, and we do that through, for example, monthly meetings with the governments of New Zealand and France, through the second anniversary event and the workstreams that led up to it, and through multiple meetings with the full Call community.  We've had further meetings on particular focus topics with governments and companies, and some of the work has been dealt with through the GIFCT working groups which Dr. Saltman has covered; some of our members are also part of those working groups.  We've held education sessions.  We've engaged with the human rights impact assessment carried out by GIFCT.  And we're hoping to have our very first staff member on board very soon. 

For us, many of the issues of concern where we have focused our advocacy are, first of all, around definitions: what is or isn't terrorism, who gets to decide, and what are the impacts if they get it wrong on dissent, on the right to protest, on democracy and civic participation?  There is no agreed definition of terrorism or violent extremism.  And Dr. Saltman has already covered the issue of the use of designated lists, which don't include a whole range of groups.  What we want to see is equitable treatment of all types of extremism, whereas there has been an overfocus on one type of extremism for many years.  There's a great deal of discrimination in this space that needs to be named and addressed. 

Another area of interest for us is around algorithms and machine learning, which are used for content moderation and for recommendations.  We're particularly keen to see independent audits of the impact of algorithms, not of the code.  And it would be great to have a process that provided assurance that the algorithm is not sending people into ever‑more extreme material.  Issues of transparency have been raised and, of course, that's of vital importance to us: ensuring that the reports being put out by member companies, whether they have signed up to the Call or are just members of GIFCT, show the full impact and provide the data necessary for us to understand what is happening. 

In terms of accountability, if members of the Christchurch Call or GIFCT are not holding up their commitments, what are the consequences going to be?  There has been, and generally is, a diplomatic approach of persuasion, working with countries or companies to persuade them to improve, and that's certainly a useful approach.  At some point, though, we do need to consider whether there needs to be further action when a platform or a country continuously fails to live up to commitments, but how that would happen is a matter of some discussion and debate. 

In terms of domestic legislation, there have been things like the Online Safety Bill in the UK and the EU Digital Services Act.  In New Zealand there's been the Films, Videos and Publications Classification Act, and some of our members have been quite heavily engaged in these processes around domestic legislation.  And, in fact, we did have some success with submissions on New Zealand legislation, which had included upload filters that were a cause of concern for the Christchurch Call Advisory Network, and so that provision was removed from the bill. 

We certainly believe there is urgency to these issues.  The effect of online platforms is significant, and we see that they are being used to significantly marginalize and to promote hatred.  Content removal in the context of the Call is focused on events after an incident of violence occurs, but we are also thinking and talking about and working on how to prevent the spread of content from such an incident inciting even further violence. 

Some of the concerns that we continue to grapple with as a network are the power imbalance between governments, companies, and Civil Society, which certainly affects participation; that imbalance is around the resources, time, and energy that we can put into this work.  A second issue continues to be the lack of representation across the globe and how we bring in representation beyond Europe, North America, and Australasia.  We particularly want those voices from other parts of the world within CCAN, but we know the risk that it carries for some people. 

And, of course, there are only limited levers available to Civil Society organizations.  We can use domestic legal processes and public commentary, but the multistakeholder process allows us another opportunity to try and push for change.  There are some limits on what we can do with this process, especially when we're dealing with different cultures, different legal systems, and the right to self‑determination for the peoples of each nation, including the right to legislation that works for their situations.  We're also dealing with multiple languages and the nuances that arise from that. 

We're looking forward to the year ahead with more capacity through the Secretariat, and I'd like to thank the governments of New Zealand and France for working to secure funding to allow us to have that role available.  We'll be looking to map the network members to see what expertise they have that we can draw on.  We'll be looking to expand the network and particularly to hear more voices from diverse communities and those who have been harmed by violence, more interactions with other Call community members to work on shared solutions, and more outreach through webinars and workshops. 

To conclude, the work isn't always easy.  It isn't always smooth.  But at CCAN, we are committed to educating communities for safety based on a full range of human rights.  I'd like to finish by remembering the victims of the Christchurch mosque attacks and victims of violence everywhere.  Thank you. 

>> DAVID GREENE: Thank you, Anjum.  Our next speaker is Dr. Dina Hussein from Facebook. 

>> DINA HUSSEIN: Thank you, David.  And I'd like to start off by thanking all of you for gathering us here at IGF.  It is very much a point of pride for me that I get to engage with everybody on this call.  Paul, Anjum, Erin, and I have worked together for a very long time.  I think that's the perfect example of why the Christchurch Call to Action and GIFCT have been so central to the success of the work that we are trying to accomplish here.  The fact that these connections have been built and the fact that we are regularly in contact with each other is not a coincidence.  The collegial effort that we've put into this is not a coincidence.  This has come out of a lot of work that has taken place over many years now: through Paul and his team, through Anjum and her efforts to make sure that we have been inclusive, and through the GIFCT and Dr. Saltman's efforts to make sure that tech is at the table and held accountable. 

And so with that tone at the beginning of the conversation, I really do want to make sure that I'm not taking up too much time, but just to highlight the three main things that, from a tech perspective, I have found the most valuable in these exchanges that we've had between the Christchurch Call and the GIFCT. 

So as most of you know, the work that we've done through Facebook, Meta at this point in time, has really been to try and protect the community that we have online.  And it started in 2016, when the threat was from a known adversary that manifested in a very clear way, and that was mainly ISIS, with clear logos, dissemination of propaganda, and a clear command and control structure.  So we understood to a certain extent what we were looking for online.  And the reality of the matter was that the resourcing tech companies had as a whole was going to those main groups. 

Now, it is unfortunate, but I do think the turning point that occurred after the horrific attacks in 2019 was a result of this community's efforts to keep the goals alive and make sure that the community was being heard, and to make sure that the responses from tech companies were not just surface level.  So I do want to very much pause and emphasize that this community is something that I have been very grateful to be a part of.  It has brought immense value to tech companies.  And while it has not been easy to consistently bring everybody to the table and, to Anjum's point, to make sure that there is still stamina for these conversations, and still support and consistent funding for them, I do think it's borne a lot of fruit. 

And a lot of that has come out, again, in the three buckets that I'm hoping to address.  First, you've heard the word "multistakeholderism" time and time again on this call.  And the reason that is incredibly important is that up to the point when GIFCT was created in 2017, really coming into its own in 2019 and 2020 as an independent body, there wasn't a way to bring together this community in a very clear and transparent way that provided equity to the multiple stakeholders that we're trying to bring to the table.  From our perspective as tech companies, it really was us having bilateral conversations with either Paul and his team or with Civil Society, Anjum, and the NGOs that had prioritized this work.  But the GIFCT and the Christchurch Call have really equalized the playing field for a lot of the partners. 

Now, we're not fully there.  I think tech companies need to strive to have more diversity, support more of the Civil Society partners, and make sure more people are able to access this table that we all sit around and have the conversation.  But multistakeholderism has really impacted the way that we get feedback.  There is a feedback loop that we have tried to frame out and build into the way that we develop our policies and the way that we assess the efficacy of our moderation practices and our programmatic efforts.  So when we talk about multistakeholderism, it really is at the core of the work of both GIFCT and the Christchurch Call, at the core of the efforts that we've engaged in, and it's been one of the biggest priorities that we have had from a tech platform perspective. 

The second bucket of deliverables here really is diversifying the feedback loop that I mentioned early on.  Most of the tech companies that I have worked with and that we've partnered with, and that includes Twitter, Microsoft, and YouTube, who were co‑founders along with Facebook of the GIFCT when it was initially starting out, all had their bases in the U.S., in California.  They employed people with backgrounds ranging from engineering to, most likely, fairly baseline policy insights, and that's really evolved over the last few years.  But the turning point was, again, the Christchurch Call emphasizing the need for a diversity that went beyond surface‑level regional diversity that was only accessible to a certain tranche of people able to access those circles and those areas where the conversations were being had. 

So, again, diversifying the conversation has been at the core of the work that GIFCT has provided, at the core of what Paul and the Christchurch Call team have been pushing forward, and we've been on the listening end.  Again, we are not at a place where I could say we are operating perfectly, but looking at the Christchurch Call community, looking at the working groups that Erin has pulled together under GIFCT, and looking at the kinds of conversations being brought to the table when GNET, as a research entity, is brought together, I think we are pushing that line forward. 

The next bucket that I think has been at the forefront of a lot of the conversations we've had recently is transparency.  As tech companies, I think there's been a considerable shift, and I think Paul can really speak from experience here: it has not been very easy to get tech companies to open their doors and provide access not just to data but to information about how decisions are being made within our policy spheres.  And it's been a considerable point of pride for me to see the evolution of how we talk about transparency over the last few years. 

As Erin mentioned, the GIFCT has pushed forward an effort to make sure that all member companies have a clear and transparent report that outlines their practices around countering terrorism and extremism.  And Facebook has really looked at this and said, how can we push this forward?  How are we able to set a baseline best practice around transparency?  And we've turned to the Christchurch Call to Action and to our GIFCT partners to learn a little bit more about where we can get better and where we can set the tone. 

So from my perspective, the multistakeholder approach has been central to moving the conversation forward.  The calls for diversity have moved us beyond surface‑level inclusion to trying to build this feedback loop throughout all of our tooling and our policy‑making processes.  And from the transparency perspective, I don't think we're perfect, but we are trying to move that baseline forward as well.  And it's all been thanks to the people on this call, so I really do want to emphasize the appreciation and the gratitude that I have for them giving us their time and working alongside us in moving this forward.  From the tech perspective, we're committed to continuing to listen and continuing to be at the table. 

And I think the most important portion of this is, again, the four people on this call have known each other for quite a good amount of time now, and that is as a direct result of the efforts of the Christchurch Call to Action, the community, and the GIFCT.  So I appreciate your time, and I look forward to the questions. 

>> DAVID GREENE: Thank you, Dina.  So it is time for Q&A.  However, we will unfortunately not have Dr. Albert from Ghana.  He is not able to join us, so we can move into the Q&A now.  I'm going to see if anyone in the room has any questions.  We don't have questions here, so let me start off with one, and hopefully we can generate some questions in the chat as well.  Based on what Dina was just saying, I think one of the really important things about any multistakeholder process is identifying what you want to get out of it.  Sometimes it might be consensus.  Sometimes it might just be a collective body of knowledge.  Sometimes it might be collegiality.  From my perception of having been involved in both of these groups, and from the Civil Society perspective, it could be a bit of a rocky road trying to form these groups.  So I'm interested in hearing from those involved: what lessons have we learned, and what do you think is the biggest improvement or biggest change since you started your effort?  I don't know who wants to respond to that first.

>> ERIN SALTMAN: I mean, I'm sure Paul or I could write you manifestos about these processes.  Maybe I'll start, and I'm sure Paul can jump in off of that.  You know, this is when sometimes you look at your agenda and your schedule, and you look at all the dialogues you're having, and you think, gosh, fascism would be so much easier, so much easier than multistakeholderism, if you have just one person in the room deciding what you do and you plow forward.  And to some of my original points, you do sometimes think, okay, wrangling cats and getting all these perspectives is really masochistic.  It's time‑zone consuming, or you're trying to bake a pie with way too many ingredients.  But actually, at the end of the day, if you want to be effective, and if you actually want to understand the nature of the problem and build solutions that have a really solid grounding and have the longevity to be iterative and evolve with the threat, having these different perspectives at the table is absolutely crucial.  Otherwise you are blindsided by problems you didn't even know you could have.  And it is those other voices, sometimes the voices in the room that disagree with what we are doing the most, that help us get to a better place. 

For example, we have 18 tech companies that are GIFCT members.  Because of feedback, it is a membership criterion that those companies have at least an annual transparency report.  Now, that transparency report might not have as many data points as you want in it, and that's a space we can evolve, but most tech companies do not have a transparency report.  Actually, we know of at least four companies that developed their first transparency reports, reports that don't just speak to terrorism and violent extremism but are their first baseline transparency reports, just to become GIFCT members.  That was based on real feedback and push from some of the human rights advocates that are a part of our groups to force that as a baseline criterion, which to the average civilian sounds really easy.  Like, well, of course a tech company should have a transparency report.  But actually, in the tech space most platforms don't: the OECD did a really good review of the top 50 social media companies and found that something like 15 or fewer of those top 50 companies had any transparency reporting.  And when we were thinking through things like working groups, this is really hard.  We have five to six topics that we choose every year, which we now try to refresh every year to make sure they have their finger on the pulse.  To form those working groups, we proactively reached out to the Christchurch Call Advisory Network and other global networks and said, spread this application to join working groups as far and wide as you can, because we don't want to just preach to the choir.  Sometimes you get echo chamber effects where you have the same multistakeholder people in the room all the time, and actually we wanted to break out of our own networks.  For this year, we ended up building our working groups around five topics: everything from positive interventions, crisis response, a review of global legal frameworks, technical approaches, and transparency.  Each one of those is a topic that could take your entire lifetime, and the working groups now have over 170 participants from 35 different countries.  That is, like, a five‑dimensional Rubik's Cube to arrange time zones for Zoom meetings.  And feedback from Civil Society also said, don't just cater to Silicon Valley timing or western timing.  So our working groups shift between two times, meaning maybe some participants can only participate half the time, but at least they don't feel shut out and unable to join at all.  Things like that, trying to be empathetic: if you want those perspectives, don't bring them in and then make them wake up at 3:00 in the morning if they want to be part of the dialogue.  That's really hard to coordinate, but it's those little things that go a long way. 

And even with the research insights, we have really refined definitions and scopes for what can be added to tooling, because tooling can get to scale and speed, and then you have a bunch of potential false positives, so we're very wary of that.  But in the research space, when we're trying to look at adversarial shifts, our Global Network on Extremism and Technology, that research network, has in the last year and a half had over 190 insights come out from over 245 different authors from 25 different countries.  So if you want to know what 5G misinformation attacks look like in Europe, or you want to look at far‑right extremism in Singapore (again, neo‑Nazis are not always white supremacists; some of them are actually in the APAC region), we want to look at a wider view of trends so that tech companies have these small, quippy insights to think outside their own biases about what violent extremism looks like and the types of technology being exploited for those purposes.  So that's just a couple of concrete examples of where that input seems tedious sometimes and definitely makes me lose sleep at some points, but actually, in the long run, it means the decisions we make will have a stronger foundation, stable and iterative for the future, so that we're not doing a quick fix that then has to be completely revamped in five or six months. 

>> DAVID GREENE: I want to give Anjum a chance to get in here.  I think Paul might want to get in as well.  But Anjum, I'm interested to hear your response from the Civil Society sector. 

>> ANJUM RAMAN: Yeah.  So I think from Civil Society, it's about motivations and why we are all in here.  For us, it is the hard end of real‑world violence and real‑world harm that we're facing.  And we are seeing the way that it's playing out online and the ways that people are being shifted to more and more extreme positions, which then bring them to places where they not only cause emotional harm online, which is a significant impact, but also take lives or cause other forms of harm offline. 

And so for CCAN, our thoughts are always: how can we reduce the harm?  What can we do, how can we advocate, and how can we make sure that attention is being paid where it needs to be paid, bringing the issues into focus, making sure that people have thought of those issues?  And sometimes it feels like we are the complainers in the room, but that's the job.  That's part of the job, to really point things out.  We don't need people to sit there and say, this is really working well.  That's not what's needed in this space.  What we need is to say, hey, this is causing harm.  How are you going to deal with that?  How are you going to fix it? 

And I think being part of those dialogues and being able to raise those issues and be heard is important, but we also need to see that feedback loop: what has happened with what we're raising?  What are the impacts internally, and can we see those impacts in the real world?  And I know these are not simple and easy and quick solutions, even though the need is urgent.  But we keep pushing in the hope that we will see some change. 

>> DAVID GREENE: Paul, I want to give you a chance to respond. 

>> PAUL ASH: Thanks, David.  And I guess the key thing for us is to acknowledge there has been an awful lot learned in the two years that we've been working on this.  And that's grounded, I guess, in the fact that we won't always agree on everything.  But we can come back to the core of this: we all have a common interest in preventing what happened in Christchurch from happening again.  We have a common interest in understanding that well and thinking about the kinds of tools we can each bring to the table, whether we're governments, whether we're industry, or whether we're Civil Society. 

And that's certainly the philosophy that the Prime Minister has tasked us with bringing to this mix.  And I think the key thing we have learned from that is a fair bit more about how each of us works within the community and the things we can each bring that work well.  We've learned to be comfortable enough with each other to disagree on things, and when we do, to try and circle back and find new ways of delivering on the commitments that were intended to address the problem at the core of the Christchurch Call. 

A couple of other things have proved quite important there, too.  One, we're not trying to bite off everything on the Internet that people are worried about.  We've tried to really stick to the presenting problem of terrorist and violent extremist content, to go as deep on that as we can, and to build the strongest possible trust circles.  But second, we've had to focus on the immediate baseline measures that we needed to put in place to enable some of the longer‑term work, which involved crisis response first, I think, and then starting to think about how we build a constructive community that can be sustainable over the long term. 

And, to the point I made earlier, making progress on our bias for action.  We could spend an enormous amount of time tangled up in process if we wanted to, and that might make us all feel we're doing something valuable, but we might not be making much progress on the substantive issues.  And at times we've all probably stubbed our toes on process points or on ways of engaging, but we've built, I think, enough confidence across the community to be able to circle back when that's happened and find ways through it together. 

I guess a closing point would be that even in the two years since what happened in Christchurch, the community has changed, but the Internet and people's use of it have also changed quite significantly.  COVID and the pandemic have accelerated that.  As we start to think about new ways of engaging online, I think we're going to have to try and get ahead of some things that, after Christchurch, we were only able to get on to after they'd already been exploited or abused.  I think there's a huge challenge and a really extraordinary opportunity for the community to try and get ahead of what terrorist and violent extremist content might look like in a metaverse environment.  There are enough companies working on aspects of that that we need to be thinking about how that content will travel across platforms in that environment. 

So there's quite a bit there that we've learned, and I think it augurs well for the work ahead that we have managed to build that confidence.  The other thing I'll just finish on is that the engagement of leaders is really critical at the right moments.  When we have had blockages, the ability for respective leaders across Civil Society, the tech sector, and governments to pick up a phone and talk with each other, as a consequence of the work that's been pulled together under the Christchurch Call, and say, hey, something's not working here, how do we take it forward, how do we get through this blockage, is really, really important.  And it's probably something that I think characterizes the work of the Christchurch Call and really helps us take work forward in a way that maybe hasn't always been possible when trying to play this out in public with competing regulatory measures or similar tools. 

>> DAVID GREENE: Thank you, Paul.  Dina? 

>> DINA HUSSEIN: Yes.  I quickly wanted to add to a point that both Anjum and Paul made.  I think the way in which we've been able to evolve how feedback gets to the tech companies in particular has been at the core of our success here.  We've built a framework where feedback is not just ad hoc.  There is a system in place now to record this feedback, and the second part of it is that the tech companies and other community members are held accountable for the comments they're making and the promises they're making, and for building out a landscape, a future scoping, for how they're going to respond.  And I think that's been a really noticeable evolution. 

The feedback that we're getting is now being built into our roadmaps as tech companies, our internal roadmaps for how we're going to build tools and how we're going to make sure that we're engaging with these stakeholders on a more regularized basis.  And then the final point I will make is that before the very, very horrific attack in 2019, one of the things that was difficult for tech companies was making the case for a more regularized cadence of engagement around very difficult topics.  And I think that has considerably changed now that we have these two frameworks in place. 

>> DAVID GREENE: Thank you.  We've had some very interesting questions in the chat, and unfortunately we only have about a minute to take them.  But Dina, as the question monitor, is there any point we can bring up in the minute we have before we get to our summary? 

>> DINA HUSSEIN: Well, I think a really good question was posed here by our partner, Jeremy West, and it touches on David Reid's question as well, which is: in the next couple of years, or in the next steps, how are we going to make sure that we are gathering opposing views, especially when we are posed with such large questions?  And I know we don't have very much time, but do we expect that to be one of the biggest challenges?  I'd be keen to hear from Paul, Anjum, and Erin on that. 

>> ERIN SALTMAN: I'll let Paul do the final words perhaps, but going forward, I just want to hit home that this is always iterative, and that you start with multistakeholderism, and actually some of that feedback is that you're doing it wrong, or, to Anjum's point, that you set up a system that works really well for government and tech but not for the CSOs, who usually aren't paid to be there to add their perspective.  It's a big time burden.  We've had to add funding structures to make sure that if certain CSO representatives are taking more of a lead on something, at least that's compensated, whether that's working group output or leading on some dialogues.  And if there are in‑person meetings, making sure we have funding to support CSOs to join, because governments and tech companies have that funding already.  Those are little things, but we also really looked this year to do something painful and tedious but necessary, which was to proactively carry out a human rights impact assessment on ourselves.  And that gives us a lot to think through, both short term and long term.  Again, with human rights it's not just free speech, not just a right to this or a right to that.  It's really trying to see where you draw the line in the sand to be the least harmful and the most effective in your ultimate goal, which for us is preventing terrorist and violent extremist exploitation online.  I'll just send a link to the BSR report on our human rights impact assessment.  It's not going to be easy.  We just gave ourselves a ton more homework, but, again, if you build a strong foundation, what you build on top of it is going to have better longevity and better impact in the long run. 

>> ANJUM RAMAN: If I could add very quickly: for me, it's not necessarily a future thing, but a real concern is the amount of effort and energy there is in the disruption space and the disinformation space, and the intelligence and thought that is being put into secondary systems to circumvent and get around any tools that might be available, and the use of different types of platforms to organize or to influence people towards violence.  So those continue to be the challenges.  And I think what we really want to focus on is how we can deal with those without losing the rights to speak, to debate, to be present, and how to avoid the silencing that happens. 

So I'd certainly think that for the advisory network, those are some of the challenges that we really want to focus on in the coming year or two. 

>> DAVID GREENE: Paul, I'll let you get the last word here, and then we'll move to Jordan for a wrap‑up. 

>> PAUL ASH: Thanks, David.  There are questions that have come in from Jeremy West and from Colin as well.  Colin, that framework is very much what we've tried to do in the core agenda setting: developing the policy around how we implement particular commitments, making decisions on what approach we'll take, and then implementing it.  I'd be happy to have the team walk you through that.  The key is bringing people together.  And to Jeremy's point, and Anjum's one about Karl Popper: we do have to apply judgment to this question of who is in the tent and who isn't.  And I guess that question you put, Jeremy, of who might spike things is something we have to think on every day. 

The key here for us is, I think, trying to build confidence in a multistakeholder community that actually believes that great harm was inflicted through the Internet and its uses in 2019, and that we should be able to find a way through that to a better digital civility, one that actually deals with terrorist and violent extremist content as part of that.  And that does require judgment, it requires regular conversation, it requires confidence building, and that's the commitment we give to members of the community: that we will keep doing that, and we'll keep trying to lift the bits of the community where they really struggle.  Hopefully that's just one minute, David. 

>> DAVID GREENE: You did very well.  And before we turn it over to Jordan to sum up, I'll just say that there is a session this afternoon at IGF, Session 262, which will look at some of the problems with multistakeholder approaches, and at "Civil Society washing" as well, where we often feel like we're given a token role at the table but not actually heard or have our points acted on.  Those are actually things that we've really tried to air in both of the forums we're talking about today. 

Jordan, as our rapporteur, can you sum up what we've learned today? 

>> JORDAN CARTER: I'll do my best.  Thank you, David, and hello, everyone, from New Zealand.  Look, just a few very quick thoughts because we don't have much time.  This is a complicated problem, where even the definition of what the problem is, namely terrorist and violent extremist content, is contested.  We've heard, I think, a common theme that multistakeholder approaches both lead to better outcomes and are harder work than more straightforward corporate or government decision‑making. 

That transparency is central to this work, both as an outcome that is being driven by these processes and as an input to the work these processes need to do to develop understanding about the real impacts of these systems on people.  And there is an argument that's been put that better listening and better action is the result of these dialogues, but that perhaps depends on the point of view of the person sharing that story. 

An underlying theme, too, is that multistakeholderism as a concept requires relationships to be built.  It's the time spent and the understanding that people develop with each other that allows the hard conversations to happen.  And that relationship building requires time, and time is money.  It requires resources for people to be able to engage over long periods to build the relationships and knowledge that allow for effective action. 

And the last thought is that it's important that approaches like this work.  The alternative is a cacophony of inconsistent regulation or companies simply doing what they like without the input of users and the Civil Society voices that are working in the public interest here.  So thank you to all the speakers for the incredibly interesting set of points raised.  I hope I did some justice to a summary, and back to you, David. 

>> DAVID GREENE: And thank you all for attending and participating.  That concludes our session. 

>> Thank you. 

>> Thanks. 

>> PAUL ASH: Thanks, everyone. 

>> Thanks, everyone. 

>> ANJUM RAMAN: Thank you.