IGF 2021 – Day 3 – Main Session BPF Gender and Digital Rights

The following are the outputs of the captioning taken during an IGF virtual intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

We all live in a digital world.  We all need it to be open and safe.

We all want to trust.

>> And to be trusted.

>> We all despise control.

>> And desire freedom.

>> We are all united.

>>  CHOUDHURY AMRITA:  Hello, and welcome.  We will be discussing and exploring gender disinformation.  I'm Amrita Choudhury, a MAG member and one of the facilitators for this group.  We would want this discussion to be absolutely interactive, so you can always raise your hand or post your question in the chat, and we will try to take it.  Before we start, I will pass it over to Wim to explain in brief the BPF Gender and what we do.  Wim, over to you.

>> MODERATOR: Thank you, I don't know if it's possible to let me share my slides.

>> AMRITA CHOUDHURY: Just give me a moment.

>>  WIM:  This looks wonderful.  Good morning and welcome to all people online, welcome from Katowice, Poland.  It's great to be here.  It's great to see people online, and it's a pity for those who are not able to make it here.  I've found it a real surprise how the local host has been able, with additional video on the walls and great video and transcription, to make this type of session different from what you're used to.  You used to go to sessions where there was remote participation.  I must say I'm surprised with how well‑thought‑out adaptations and arrangements in the room were able to give this a completely different sense of the meeting.  So we are way more involved with the people that are online.

Let me explain two things: first, my role, and second, the role of the Best Practice Forum.  I'm Wim, a consultant working with the IGF Secretariat to support intersessional work; specifically, my role is to support the best practice forums.  They are an intersessional activity of the IGF, and intersessional activity means it is work being done, discussions ongoing, in between two IGF meetings.  For that, they also get support from the Secretariat itself.

The idea behind the Best Practice Forums is not to have policy discussions in the sense of developing new recommendations or being focused on coming up with new policy.  Their specific focus is to look into an issue, into a problem, trying to understand that issue, how it is addressed, and how it is addressed in different parts of the world, to be able to learn from each other, to be able to share experiences.  I think it's very important to understand it's a different context than, for example, another stream within the IGF intersessional work, the policy networks, which are more focused on exactly that, policy.  Best practice forums are more focused on looking at a specific topic, at an issue, seeing how the issue is understood in different parts of the world, and also what the answers are that people from different stakeholders, different communities, come up with to address the issue.

Practically, best practice forums start work in the spring of the year, after the MAG agrees on what topics they should focus on.  Then starts the series of online discussions.  I think this Best Practice Forum really had a series of learning discussions with different specialists between April and August, which were very interesting.  Those are partly reflected in the report, but also in a separate section, or a separate document, that is on the website.

That's important because this session is only a limited part of the work of the Best Practice Forum.  Most of the interesting things happen in the period before, where different stakeholders, different people come together to look at a specific issue from different sides.  Like I said, this BPF has held a number of online conversations, probably eight, nine, ten, and the report is just trying to reflect what was discussed as part of the work.

Also, this session is intended to continue the discussion.  The BPF has been working on the draft report, and this session is not intended as us coming to present final results.  No, this session is really crucial for the Best Practice Forum because it gives an intermediate update on the draft report, this is where we stand with our discussion, and then checks with the people here at the IGF and invites additional input.

So part of the discussion today will be reflected in the final report that will be published shortly after the IGF meeting.  I think that's enough as an introduction, because I could talk much longer about the Best Practice Forums.  Maybe one thing: this is not the first time there is a best practice forum on gender related issues.  It's the first time the title is gender and digital rights, but there have been a number of best practice forums on gender and access; check the IGF website, because each of them has produced very useful output reports.  If you're interested in the topic, it's a really interesting resource.  It's also summarized in the annex of the draft report.  So before handing over to Bruna, maybe I can introduce the topic.  This year the MAG selected two topics for best practice forums, one on cybersecurity, focused on cybersecurity norms; that BPF has a session tomorrow morning local time here.  The second is the BPF gender and digital rights.  And the proposal that was put in and agreed by the MAG was to have it focus on the concept of gender disinformation, a relatively new concept.  So we will discuss with you later on what this relatively new concept means.  I have the feeling, working with the people involved in the BPF, that there's still room to discuss the exact definition.  There's still a very big need to exchange experiences with gender disinformation from people around the world, and the last thing, and we will come back to that, is that it might be too early to already talk about best practices, because people across different communities are still trying to answer or give a first reaction to the phenomenon, so there is not yet a clear list of best practices.

But then I would like to hand over to one of the co‑coordinators of the best practice forum, Bruna.

>>  BRUNA SANTOS:  Thank you for joining the session.  We have almost 30 people online, and two more people in the meeting room.  As we were saying, this is a BPF that has been going on since 2015, and we have tried to address the most different aspects of our lives, of how men and women and some parts of these groups live online.  So some of the things we have discussed before were gender‑based violence and other things related to that, and access as well, and we felt in 2021 that moving on to the whole discussion around digital rights, and disinformation as part of campaigns directed towards women online, was kind of the next movement for us.  But as we were saying, we fully understand that this is not, by any means, the final word on this topic, not at all.  We understand gender disinformation as a rather new kind of movement or phenomenon, and our idea here is just to get this conversation started, based on the things we looked at in this past year.

Just looking at the term, the working definition that the BPF has used was: information activities, like creating, sharing or disseminating content, that can result in attacking or undermining people on the basis of their gender, or that weaponize gender narratives to promote a political, social or economic objective.  So that is one of the definitions we have addressed.  Throughout our work, we tried to speak with specialists, with parts of the IGF community, and with many other groups who would be interested in that.  We do understand that it might feel like we are addressing something very similar to gender‑based violence, but we did want to go a little further on disinformation campaigns, and how they would affect men's and women's lives online.  That is one thing.

I think we can go to the next slide.

>>  WIM:  I was typing in the chat.  People listening online, the people in the room, this is intended to be an interactive session, do not hesitate at any moment to raise your hand, ask questions, we will fit them in.  Thank you.

>>  BRUNA SANTOS:  Please raise your hand, we want an open conversation.  Moving on to telling you how we tried to discuss the gender disinformation phenomenon: we tried to divide this situation into five groups.  So we looked into everyday gender disinformation; youth experience as well, because in one of our meetings it was pointed out to us that some disinformation campaigns directed at young women might also be a barrier to their presence in a lot of the political discussion online; journalists as well, because it's clear to everyone that journalists have also been victims of pretty serious campaigns coming from all over the world, and, speaking of presidents, I guess Bolsonaro is one of the main presidents trying to do those things; also politicians, as in the past year there has been a growing use of hate speech as a silencing tool against more diverse candidacies and more diverse politicians; and last but not least, women human rights defenders.  These were the five groups we tried to address in our draft report, and while we were trying to understand and comprehend how gender disinformation worked towards these groups, we tried to check whether or not it was possible for us to highlight best practices, and also to see whether or not platforms or even some groups, civil society and academia, had existing answers to the issues we have highlighted, or even whether there were any emerging best practices on that.

That is part of our report as well.  I think we can go to the next slide.  Do you want to add something, Wim?

>>  WIM:  There was one question or comment in the chat.

>> AMRITA CHOUDHURY: Wim, he was having a problem in speaking; perhaps he can raise his hand and try.  He had a comment.

>>  BRUNA SANTOS:  It's a she.  Jamie, if you can raise your hand, perhaps they can allow you to speak.  Jamie says there is no hand option.

If you want to type your question.

>>  WIM:  The question was to reflect on why it is difficult to gauge the full extent of the issue of gender disinformation.  Is that something you want to do now, or later?

>>  BRUNA SANTOS:  We can do it later.  Yeah, then we can go to the next slide, yeah.

In addressing this definition and situation, we also had some guiding questions.  We wanted to check in and look at how gender disinformation could be defined or understood by different groups, and whether people had different experiences around it.  Also, how far has the world come on the platformization of disinformation in the context of COVID‑19, because it came across in many of our meetings with the specialists that this was something pertaining to stronger and maybe more present content moderation strategies.  That was also something that came up for us.

The third question we had was about who are the actual missing voices in the fight against gender disinformation at the national, regional and global level.  Last but not least, how is trust building manifesting in the multistakeholder process?  Just to speak about trust building: one of the main goals of the IGF in the past year was to have a frank, open and trustful conversation with our members in the IGF community around gender, because we also acknowledge this is not a subject that's truly present in the agendas.  So we also wanted to use this as a safe space to have a conversation around gender disinformation.  So that's that.  We can go to the next slide.

I don't know if anyone has any comments in the chat or if Amrita or any of our co‑coordinators wanted to come in on that note.

>> AMRITA CHOUDHURY: I think we are good.  Perhaps we can go back to the previous slide where we had the questions, and perhaps we could also ask the audience here the questions from the last slide.  As in, how do you define or understand gender disinformation, you know, what are the experiences, and Bruna mentioned the different categories we looked at.  Jamie, you asked whether it is difficult to gauge the full extent of the issue.  Yes, the manifestations are different.  People apparently think that this is the only issue, but the repercussions may be far‑reaching too.  It is very tricky; disinformation is being looked at by everyone, but when it comes to gender‑based disinformation, it's still not being gauged to that extent.  So would anyone want to speak on any of these points?  You know, you have been looking at various reports, et cetera.  Who are the missing links, who should be there, how can trust be built?

Anyone?

>>  WIM:  I wanted to come back on Jamie's question.  The fact that the first question we asked, and one of the first things the BPF started with, was looking into the definition and asking how do you understand gender disinformation, is probably part of the answer.

If you still have to come up with a clear definition, or understand how people see the phenomenon, or raise awareness about the phenomenon of gender disinformation, that's already the first part of the answer to why it is difficult to really measure and see the full extent of the problem, of the issue.  And later on, it makes it difficult to already come up with best practices and measures to deal with it.

>>  BRUNA SANTOS:  One of the two main premises we had was that gender‑based disinformation was being deployed as a strategy against women and diverse gender groups; that was one thing.  We wanted to understand what the negative effects were and how there was some level of spillover over a set of rights, such as political participation, and to what extent this spillover over our digital rights could allow gender disinformation to be used as part of political projects of censorship, of hierarchization of citizenship and rights, because we know this is something going on.  I see somebody has the mic and their hand up.  Introduce yourself, please.

>> Audience: I'm Namalu, a member of parliament from Tanzania.  I'm fortunate I got to attend one or two of the online sessions, and I'm pleased you've put politicians there, because I think one of the groups highly impacted by this issue is women politicians, particularly in developing countries where, you know, our regulations and definitions are not so clear, perhaps, as in the developed countries.  You asked about the impact, and I can speak from experience: when a female parliamentarian shares something online, maybe a contribution made in parliament or a meeting, or I could even post that I'm here in Poland attending this meeting, the reaction of a majority of people will not be based on the content of what I shared but ends up being sexualized.

I can give an example.  Let's say I take a picture with any of the gentlemen here, and post that here we are at the IGF.  Immediately it's: yeah, you women parliamentarians, we know what you do when you go abroad, sleeping around, that kind of thing.  It gets twisted from the actual content to being sexualized, and, unfortunately, you don't get the opportunity, you are in any case not able, to go and defend yourself and say, no, this is not true, because it's already spread, and by the time you go to the community you're leading, even though you're speaking about business, it's: yeah, but this is what she does, even online they said it.

So proving that what was said online is not true becomes difficult.  But when you look at the impact of it, I can just give an example: in Tanzania, we have about 143, 146 female parliamentarians, and for us, it's not easy, it's a horrific experience, we have to try to bear it.  The rest, they just don't want to be online because it's not safe.  Someone is like, I don't want to put myself through that.

I have the experience of being online prior to being a member of parliament and after becoming a member of parliament, and the two are incredibly different.  It's good that you've highlighted the different groups, because the impact on a female politician is, I can say, maybe far worse, because it goes on to affect the kids.  I remember there was a time my kids were teenagers, seeing the posts people were making about me: mom, get offline.  It affects them.  The biggest impact is that most countries, especially developing countries, want to reach 50/50 for women in politics and leadership; there's a lot of effort trying to get political aspirants to come forward, young girls coming up, the youth, women, but for most of them, one of the first things that makes them feel they don't want to get into politics is the fact that once you're a politician, you get subjected to all of this.

This is a very important discussion, but perhaps, when we are talking about trust‑building manifesting in multistakeholder processes, we need to try to include political parties.  Because political parties have their own constitutions, and it's been proven that oftentimes the people attacking are affiliated with one party or the other.  Maybe political parties also have a role and responsibility in curbing the online abuse of women parliamentarians.

>>  BRUNA SANTOS:  There was a comment on how you were a compelling speaker.  Yeah, we agree with what you just pointed out.  When we went through a lot of the meetings with specialists and everything else, we also learned that this was way more common than we understood.  There are some researchers from Brazil, from a think tank called Internet Lab, who, while looking at the regional elections this year, did find out that women were way more subjected to racist, misogynistic kinds of political speech online, and that whole usage of a more complicated language was still very much there, especially against those more diverse candidates, as I was saying.

And also, when we had a chat in one of our calls with Courtney Reg, one of the MAG members, she did, again, point out that this was not just going on with politicians, but also with journalists, with the rising attacks that governments normally direct towards free and open media and whoever is willing to provide more access to information.

So we do understand this is being used as a strategy, and this is why we tried to highlight some of those groups.  We fully understand they are not all the groups who are subject to gender disinformation, but as we started this conversation on this topic, these were the ones we felt were worth highlighting or giving a shoutout, something like that.

>>  WIM:  A comment from Marcella, I don't know from where: even as a woman online, you are already exposed to this kind of disinformation, but when you take up public work, it gets worse.

>> AMRITA CHOUDHURY: We have a hand up from Shilongo.

>>  BRUNA SANTOS:  Someone from‑‑

>> AMRITA CHOUDHURY: Marcelle was from Cameroon.

>>  SHILONGO:  I'm not going to switch on my video because I'll lose you.  I missed a little of the presentation at the beginning.  My name is Kristophina Shilongo.  I wanted to pitch in on who are the missing voices.

So we recently completed a study with the University of Cape Town on misinformation and the actors who are fighting against misinformation on the continent.  We covered the sub‑Saharan region of Africa.  One of the things the actors pointed out was, you know, I don't want to say missing, but people whose voices are not heard.  So we spoke to organizations, for instance, in Ghana, an association for people living with albinism, and various organizations in Ghana who are fighting for women accused of being witches.  For years they've been fighting against, you know, myths, but in present day times you can call them disinformation, because it's really information against them, and it's a violation of their rights.  The same with misinformation against people living with albinism, against LGBTQ groups; in Uganda we spoke to sexual minority groups who have been fighting misinformation.  To recognize those missing voices, we also need to see how this information is linked to a violation of human rights and also to exclusion, the social exclusion of certain groups.

So whether it's women, people living with albinism, or people who are part of the LGBTQ community.

So when we looked at the kind of responses that they have, the way they are countering and protecting their communities against the harms caused by disinformation or myths, as they call them, they take different approaches.  On a national level, for instance in Uganda, in Ghana, an association for albinism, you know, they do community outreach, they do policy advocacy, so they have included a clause in the communications act to protect vulnerable groups like themselves.

They are also looking at, you know, they have ventured into social media and are using radio.  I think there's a disconnect with the global north, for instance, prioritizing digital platforms, but radio is widely used in many countries in sub‑Saharan Africa.  They have extended into radio.  They are talking to church leaders, all sorts of leaders within the region, and have collaborated with various other partner associations; for instance, in Ghana, the association for albinism is working with the association in Malawi, and they are working with partner organizations in the U.K.  They have really coordinated this very multistakeholder approach, and it's very coordinated, and I think if we maybe expand these various definitions of misinformation to include cultural stereotypes, I mean, these cultural stereotypes existed before the fake news era, then I think we can include those missing voices.

>>  BRUNA SANTOS:  We have another comment in the room, please go ahead.

>> Audience: Good morning, I'm a member of the parliament from Pakistan, and being a woman, I've faced a lot of challenges, I must say, specifically when it comes to the social media code of conduct.  I've always tried to raise my voice on this particular subject.  I'm also a member of the standing committee for information technology, and I feel that it is the need of the time that we work specifically on some ethical grounds so that other people do not face the same kinds of situations that I'm facing.  The main issue we faced was fake Twitter accounts or fake Facebook accounts.  They can abuse anybody, they can disinform, and they get away with it, because there wasn't a mechanism to get ahold of them; above all, we did not have a smooth or swift kind of coordination with the social media service providers.

But now, with a lot of effort we have been making on this particular issue, we have seen that after creating the cybercrime wing that exists now in Pakistan, it has become much easier for us to reach those people who misuse this particular social media tool to abuse, to create fake news, or to disinform.  Hence, as a woman, I've always felt the challenges are double.  The double standards are double, and women have to put in extra effort in every walk of life, specifically when it comes to social media and especially when it comes to the political arena.  So I think all the women around the globe, because it's a digital world, not just a matter of Pakistan or any particular part of the world, we all need to work on this to stop these fake accounts, to stop people spreading fake news, but for that, we need immense support from the social media service providers.  Thank you.

>>  BRUNA SANTOS: In one of our meetings, we met with a lot of specialists who tried to address the same situation.  One of them pointed out to us that this rhetoric around fake news has been increasingly translated into legislation and regulation to try, or at least attempt, to reduce misinformation, but sometimes legislators did not fully grasp the size of the issue and have not really come up yet with strategies on how to address the specific problem of gender disinformation, because that was part of our diagnosis as well.  We still need to develop some common sense around what a possible and common strategy for the issue of gender disinformation would be, and that's also in our report.  Some of the things we did think through, looking forward, was that there is still a lot to be done at the political campaigning level, in order to suppress attacks towards candidates, attacks that are responsible for exacerbating gender stereotypes and fostering inequality and oppression, and also introducing some level of gender sensitive standards for political campaigning, for politicians and parties as well.

This is also something that does not necessarily need to be addressed only once you are in the parliament, but also needs to be something built into the very making of the political campaigns and electoral processes as well, just so you're way less subject to those types of attacks than you are in your everyday work.

Last, but not least, we also know that gender disinformation directed towards politicians can often come disguised, or be perceived, as a kind of legitimate political critique, but we do know that in the end it's something that portrays women, who are sometimes appointed to higher public offices and everything else, as unfit, undeserving, incompetent and a lot of those things.  We do know that all of those questions that society has about us are translated into those kinds of campaigns; that was something that's present in our report and that we were very much concerned with.  So it's very nice to be hearing these things from you and to have this conversation.

I don't know if anyone in the chat has any more questions.  I see Jamie also posted a link to an article about toxic culture in parliament houses, and a documentary too, and, yeah, she's also asking for the person who just spoke to talk again, if possible, about how the cybercrime cell/wing worked and how it was funded, and whether or not there was a decision by the government to support that.

>>  WIM:  I wanted to come back on that too.  It's the first time in all our discussions that I heard somebody making the explicit link with all the work going on in cybersecurity and cybercrime, and saying maybe there needs to be a link and a direct channel so that we have access to the people that work on cybercrime, and that this is recognized as a real problem.

>> Audience: This is very interesting.  We have had this cybercrime wing, which we call the cybercrime cell as well.  It's linked with the information technology committee; also, it comes under the federal investigation agency in Pakistan, that is the FIA.  We have dedicated staff and hundreds of trained people to combat this issue, and we have online numbers.  If anyone faces any kind of harassment or any kind of social media related issue, they can simply call the online number, a free number, register their issue, and they'll get a follow-up.  We have apprehended a lot of people, and we have even had legislation passed on this particular subject.  So it's all legalized; in the parliament of Pakistan, we have managed to bring in laws to curb this issue in a better, legal manner.

So it's actually a federal subject, and every province is connected to this particular agency, so anybody, anywhere in any province of Pakistan, if anybody is a victim, can directly, it's very simple, call that toll-free number, and they'll be taken care of.

But the social media service providers actually take a long period of time.  If anyone is facing any kind of issue related to Facebook, it will take much more time, but it's going to be much faster if it is concerned with Twitter.  Twitter accounts can easily be reported and blocked, and the IP addresses can be traced with the help of the social media service providers, and then further action is taken.

>> AMRITA CHOUDHURY: We have Cheryl's hand up.

>>  BRUNA SANTOS:  I see Cheryl, I don't know if Shilongo has a hand as well.

>>  CHERYL:  Thank you very much.  This is an extremely important conversation; I'm delighted to listen.  I'm from Australia, from a particular vintage; ancient, I think, is the classification I go by these days.  With the questions in front of us at the moment, gender disinformation and experiences, this is a generation-by-generation issue.  Almost three generations, very much involved, almost the same stuff happening, but now with the power of very instantaneous communications and all the things that we understand about online platforms, a certain concentration.  It's not new; it's what humans have been doing to each other in a power play, and not only male against female: it's gendered, it goes both ways.

So the tools that are being exploited are far more effective, and I think we need to recognize that, and I believe many of the speakers today have done that.  That's where I think a particular point of the work, in my view, is the development of tools or capabilities of resilience.  Because this stuff is going to happen, ladies and gentlemen, but what matters is how resilient we are, be we politicians or journalists or youth, transgender and emerging, finding ourselves, youngsters, whatever it is.  If there is disinformation and someone is attacking them, people literally live and die in some cases, so I think it's the resilience skills we should look at.  That goes hand in glove with learning to establish what is trusted and untrusted, or validated and unvalidated.  We know it can spread like wildfire if it's associated with someone who speaks convincingly.  So if we can get people to be more discerning about what they accept as truth or not, and help with that resilience package, I think we can at least get the current generations through this bout of what is basically a power play that is gendered.

On the multistakeholder process, and I've gone through all four questions now: having an opportunity to work with each other and learn about each other across norms and platforms and the usual places that we work and play is something rare, something new, and something vital to explore in the solution.

So I'll just stop there; I was in the board room many, many eons ago.  This is the sort of thing where, unless we can find a way of empowering our politicians and our youth, regardless of our choices of gender and how we identify, to help that not be the huge and deliberately intended insult, and instead be something that can be used to display one's own resilience and build one's own power in that dynamic.  I digress; I just wanted to share some of those observations with you all, thank you.

>>  BRUNA SANTOS:  Thank you so much, Cheryl, we agree with you, one of the parts that we highlight in our report also is that maybe the place in which gender disinformation meets the IGF or the Internet Governance community is when we definitely need a proper multistakeholder approach to addressing this challenge, to ensure that we are working and moving towards a balance of rights in particular disinformation and freedom of expression, as well as making sure every single stakeholder is part of this conversation, not just platforms, not just civil society and not just the victims.  So, yeah, we are very much on the same page.  

>> AMRITA CHOUDHURY:  How do we shift from voted laws to their application in the field?  As we see, it is also difficult to ask big tech to remove wrong information circulating about an African citizen.  I think it brings us to the question of content moderation by platforms, and the transparency, or lack of transparency, in this entire exercise.  I guess that's also a question right before us.

>>  BRUNA SANTOS:  Yes.

>>  WIM:  Also, as Cheryl said and then Bruna said, there is the importance of having the multistakeholder approach, the different people at the table.  Because if you manage to have the platforms at the table, the politicians at the table and other organizations, then maybe you don't yet need the voted law, if you have those people recognizing the issue and coming up with clear ways so that, if you point out a problem, we can see how to address it.  It can be an easier way to start working while the discussion on legislation, on how we have to deal with this from a more legal or structural perspective, goes on.  It will never be an easy solution, but it is a quicker solution to have the parties already around the table recognizing the issue and looking for those shortcuts to taking immediate action.

>> AMRITA CHOUDHURY: Absolutely, Wim, I agree the different stakeholders need to be there.  For example, at times there is a concern, when nation states make a rule for content moderation or removing fake accounts, which is prevalent everywhere, that many times it is used for political means too.  So the balancing part matters, where you need the multistakeholder people, you know, the stakeholders there to show that this particular legislation may affect things in this way, or may not be effective for what it was created for.  Bruna, Nema has her hand up.

>> Audience: What I wanted to add on to what my colleague from Pakistan said: in Tanzania we have a cybercrimes unit and cyberlaw, but one of the biggest challenges, and I don't know how it is in other countries, in most African countries, is first to recognize online gender‑based violence as one form of gender‑based violence.  Oftentimes when we are talking about GBV, it is about the physical kind, which is easier to prove, but when you are talking about online gender‑based violence, sometimes you cannot prove it.  When you are talking about the amount of emotional distress it brings on you, the way it affects your mental state, you are not able to prove it.

So perhaps something the best practice forum on gender and digital rights can do is come up with a best practice guiding principle: when we are talking about gender and digital rights, what is the ideal picture, and then every country can, you know, use that as a base and then obviously customize it according to the local context of the respective country.

The second part is, with online, exactly what my colleague said: most people use fake accounts.  So then, even if you have a very good regulation in place, how do you prove it?  How do you know the person that is attacking is Bruna, how do you find Bruna when this person is unknown?  But isn't there a way that the social media platforms can actually know who the person is?  That is something that comes down to the social responsibility of the different social media platforms.  The shortcut you were just talking about: I'm sure they can figure out most of these things; how can they not know this is Nema even though I'm under a fake account?  I'm sure there is a way they can prove that.  Perhaps they have some kind of responsibility towards making online spaces safer for all groups.  But then there's the other issue of capacity building.  Many of us are online, but we are not capacitated on how to protect ourselves online.

I think very recently, maybe about a month ago, is when I got to realize you can report someone's tweet and ask for the person to be blocked, and I've been online for so long.  So there's the issue of people knowing how to protect themselves online.

The last thing I wanted to say is about the differences in what counts as abuse from one country to another.  Twitter, et cetera, have their guidelines and certain words they've put in their algorithms that raise the red flag, but maybe a word in Tanzania seems abusive that in the U.S. is not.  I'll give one very small example.  I think last week, I posted something on Twitter, and someone said, taka taka; it means rubbish.  From a European perspective, someone calling you rubbish is not a big deal because of the interpretation, but for someone in Tanzania, being called taka, that's abusive.  How do you then get Twitter or Instagram or whoever to understand the difference?  In Europe, it's okay, but for us it's not okay.  I think it's those things: going back to the best practice, we can come up with the best practice, what's the ideal world, and work from there, thank you.

>>  BRUNA SANTOS:  Just to come in on the legislation point: I don't know if anyone knows about this, but Brazil has been trying to set up its own legislation on disinformation in the past year.  Whenever we start to discuss how to address this topic in general, it is not directed towards gender disinformation, and the issue we are having right now is that it's very complicated and hard to stop legislators from going against freedom of expression, from going against user rights, and there needs to be a balanced discussion, as you were saying.  Because everyone knows that for many years social media platforms have been aware of this kind of language online, and they have sometimes done very little about it, and I think situations such as the Facebook papers that just came out also highlight how little they have done, how some countries tend to be more important than others, and how problematic this general inertia is that social media companies tend to be in with regard to countries like, I don't know, Brazil, India or any other country that's not at the very top of the list of interests for those companies and their social media responses.

So, yeah, we definitely need more and maybe better collaboration on that topic, just so we are able to fully address this situation.  And not exclusively from the regulatory aspect, because at least to me, the solution needs to be something more than that: some capacity building, as you were saying, a regulatory approach, and more collaboration between stakeholders.  I'm going to stop there because I see Eric has his hand up in the room.  Eric, please take the floor.

>>  ERIC:  Hello, good afternoon, good morning, apologies, I won't be able to open my camera either because I may lose connection.

I would like to dwell on the experiences of the Philippines regarding gender disinformation.  The Philippines has a long history of disinformation, starting from our vaccine hesitancy, which dates back, I guess, to the vaccines introduced around 2019.  And now, during the campaign season in the Philippines, I just want to highlight the role of the Philippine government and their supposed troll farms in the disinformation campaign here.

What's happening is there is a campaign to make certain candidates look good, despite their incompetence in the past.  What's worse, the government is targeting certain women who are making noise in order to combat this disinformation.  For example, our recipient of the Nobel peace prize, Maria Ressa, has been the target of the government for her bravery in speaking up against the drug war.  Recently the only female candidate is also the target of the government, because her track record is far cleaner than the other presidential candidates'.  Now, being seen as the one capable or competent of running the Philippines, she is being discredited as a candidate, because the administration is being threatened by her track record, because the people are leaning towards Robredo now.  This is also an experience of gender disinformation, because I think there are states that are at the center of this disinformation campaign; they are also targeting women who are seemingly threatening or challenging their powers, in order to appear more confident or appear strong in the eyes of the people.  That's all.

>>  BRUNA SANTOS:  Thank you so much, we have a second hand up in the room as well.

>> AMRITA CHOUDHURY: You can unmute yourself, and the person who is speaking should identify himself or herself while speaking.  We can see you all; perhaps you can enable your audio now and try.

>> Audience: Thank you so much for allowing Bangladesh to ask a question.  Myself, I was an IGF member to (?).  The question is about human rights: most women online are facing privacy violations and teasing on the internet.  How can we ensure gender equity and (?) on digital rights?  Thank you so much.

>>  BRUNA SANTOS:  Thank you so much for the question.  I don't know, Amrita, do you want to start taking this question?  Because, yeah, as somebody who has been trying to work with this topic for a while now, gender equity on the internet and being respectful of the digital rights of women and gender diverse groups is, to me, part of a bigger and larger strategy around capacity building and teaching people that some language is not okay and should not be shared around.  Like, a woman should never be questioned on the basis of being a mom, on the basis of not being able to, I don't know, be a parliamentarian because she has two or three kids, something like that.  I think that's part of a gradual and bigger societal change maybe we all need to go through, to stay away from those misogynistic ideas and thoughts that we were all kind of taught at some point in our lives.  To me, it's a bigger strategy, and a very deep question, to be honest.

>> AMRITA CHOUDHURY: I agree with this, Bruna; it's very difficult and a tricky question.  It goes back, as Bruna said, to it being less of a technology issue and more of a social issue, because there are many behaviors people will not engage in offline.  When you see a woman politician, for example, you may not abuse her offline, but when it is online, people do it because they like to hide behind the anonymity.  So I think capacity building, as Bruna said, is very important; there should be some etiquette on what you can do online and what you can't do online.  Especially when I look at emerging nations, people have the tool of the internet in their hands and think they can do everything.  For example, a child is taught to cross the street at a particular point, you know, taking care that they do not get into an accident; these people are not taught the same thing because there is a lack of capacity.  Everything is available in the hand, they can tweet and post anything, but they don't even think before doing it.

So I think that capacity building is very important.  And while I would say that it is important that a person is identifiable on the internet, in many nations anonymity is also required.  So I think there has to be a balance between the two; everyone being identifiable on the internet is a very tricky thing, and while it is good that you can do that, we see the way some people are targeted.  So I think it's a tricky question, but that's what we could say.  Is there any other question?

>> Audience: Thank you so much for the perfect answer.

>> AMRITA CHOUDHURY: If you have more to say, Wim, over to you.

>>  WIM:  I completely agree with everything you said.  Capacity building has been mentioned a couple of times now, and one thing I would like to add is that capacity building is also needed on the side of the receiver.  Because it's not only about what you can do or post online; there is not only gender disinformation, but misinformation and disinformation in general, and the whole attitude that I think Nema mentioned in a certain way: it was online, so it must be true.  That's something people need to be taught, that sometimes that filter needs to be there to just question where does this come from.  I think this is a community‑wide problem; it is the same with certain newspapers, it is the same online, so it goes way further than just online, but I think that's something very valuable to teach people: it's not true just because it's online, even if it would feel like the ideal world where you just switch on your computer and say, well, I found it on my computer.

I personally had an experience when I started working, working for somebody I already knew, working with the internet.  In the beginning, I had to fight to be allowed to look something up on the internet, and after a year of working, I had to do the opposite: I had to fight against the idea that, well, you can open your computer and everything is there already.  It's not true just because it's on the internet.  That's something I wanted to add.

>>  BRUNA SANTOS:  Adding one last thing to this conversation: anonymity is not the main problem here, to be honest.  The fact that some people might hide behind pseudonyms or any other kinds of tricks in order to share disinformation or gender disinformation, that is not the main problem.  The main problem is this culture that needs to be changed.  It is a very complicated line for us to say it's only about criminals hiding behind the internet and doing those things, because we also need to acknowledge at the same moment that sometimes the use of pseudonyms and other privacy-oriented solutions, such as encryption and everything else, is what ensures that a lot of us, a lot of us women activists, can express ourselves online.  I just wanted to make this short highlight because we need to focus on the right things and not make this a conversation against anonymity, against encryption, or all of those tools that are also very relevant tools for our freedom of expression online.  I don't know if anyone has any other points; yeah, please go ahead.

>> Audience: Okay.  So grateful to be a part of this 2021 program, and we get lots of information, and we have very good experience, thank you so much.

>>  BRUNA SANTOS:  Thank you.

>> Audience: There are more approaches that countries like Pakistan have been working on; we exercised one of these a couple of months back in Pakistan, when one of the applications, named TikTok, if you're aware of that, was banned in Pakistan for a couple of months because it was seriously being misused and abused to insult, malign and harass people.  Finally the courts decided that banning an application is not the right idea, and they came up with the idea that we need to have more regulations.  But as my fellow from Africa said, and she's absolutely right, I completely endorse her, it's just not about regulation; it's about the basic ethics and the code of conduct and the filters that need to be applied.  She's absolutely right, and I endorse that particular point, because even in my part of the country, if anybody abuses me, and I have almost more than 2 million social media followers in my country who follow me, it's just not me who knows it; it's public, it gets out to millions of people, and they all get to know that this particular member of the parliament is being abused.  Many of them might attack the other person, but many of them might think that maybe, you know, the person is right, and because of the disinformation somebody is creating about me, my constituents, my voters, my public supporters might get affected.

It's not only them.  I, in particular, was strongly affected; not only me, but my family was traumatized, my husband was in great stress for almost one week, when I switched my political party and joined another political party a year before the election.  It's very common for men in Pakistan to do that, but as a woman, when I decided to leave one political party and join another, I was trolled and abused and there was a campaign that took over on social media, which was a big trauma for my family, which was a big stress for my family.

So it's absolutely right to ask: how does the social media provider compensate for that?  There need to be very strong checks and balances, there need to be strong filters, and in my particular opinion, a piece of advice I would love to give to the social media providers is that until and unless we punish or penalize those people who misuse this particular tool, we cannot curb this issue.  At the least, those people who are creating fake accounts, all those people misusing these tools, need to be punished to a level that they might not be allowed to create any account under their own name using that particular social media tool for a particular period of time, so that they may understand that abusing and misusing a tool is a crime.

>>  BRUNA SANTOS:  There is still a lot to be done on those channels for reporting those kinds of language as well.  Speaking specifically of social media companies, we all know they have improved in the past years in terms of how fast they can address those things, but we still know that it can be very different for an ordinary citizen compared to, I don't know, someone with more space in the media or anything like that.  We need to have the same level of attention for whoever is the victim of these kinds of language and everything else.  Just to go back on how public these attacks can be as well: I mean, I couldn't go through the session without mentioning Bolsonaro and his sons, who are the top guys who have been directing attacks and related disinformation towards journalism, journalists and everyone else.  It can be very public as well and can come from very high up places, and it can often be part of a state sponsored kind of propaganda against groups and everything else.  That's why I say that just hating on or aiming at anonymity or anything like that won't solve the issue; it's a way more complicated one.

>> AMRITA CHOUDHURY: Bruna, we have a question which says: can you please describe work being done, if any, on the intersection of big tech business models and algorithmic amplification in addressing online gender based violence.  That is a question, and there is also a comment stating that while there are recommendations, they are not targeted to government or policy makers, and suggesting that government policy‑makers are often quite keen for reports that make it really clear what regulatory and nonregulatory approaches are recommended to address a problem.  Before Bruna gets into the question: what we as a group are looking at, it's too early to actually make recommendations per se.  We wanted to make recommendations, but, you know, since it's a very new topic and it has various dimensions, perhaps we can cite approaches which various entities have taken and perhaps provide some more insights on them.  And while BPFs do make recommendations, we are not so prescriptive.  Anyone from the team, Bruna or others, feel free to add anything here.

Just to answer that: we are not talking much about gender‑based violence here, and I don't think there's anyone from the platforms who could answer it much better in terms of how their content moderation or governance patterns are working.  If anyone from the audience wants to take it up, or Bruna, if you want to try, you could.  I don't know that we are quite equipped to answer this, right?

>>  BRUNA SANTOS:  I can highlight the work I have been following in the past year; it's more on the advocacy side and highlights how this is a very present issue.  As I mentioned before, there is one very good report from two organizations in Brazil, one called AzMina and the other one called Internet Lab, and they have been analyzing political speech and trying to understand to what extent this gender disinformation and hate speech directed towards women is a compelling and huge strategy.  This is one of the works I can highlight.  I think Chenai has her hand up; if you want to come in, and then we'll go to Nema because she has her hand up as well.

>>  CHENAI CHAIR: I've been following the conversation; it's very lively, and great to see the answers that have come up.  I'll try to answer the question raised by Irene on the role of big tech business models, algorithms and algorithmic amplification in addressing online gender‑based violence.  Thinking about the business models and the role of algorithms and algorithmic amplification, a lot of the work that has been done is on the transparency and accountability aspect.

Last year, I do know the web foundation had what they called the tech lab, where they were trying to create policies targeting big tech companies to think about what are the best ways to actually address online gender‑based violence from the perspective of content moderation, algorithms and algorithmic amplification.

That was some of the work that has been done in terms of recommendations.  A lot of the time the criticism comes back to the point about the nuances of language: how, as Nema pointed out, a term used in one context has a different meaning when used in the African context.  These big techs are located in Europe and the U.S., and there's a trickle down of resolutions and regulations that have an impact on people from the global majority, people based on the African continent or in Southeast Asia or Latin America.

And in terms of thinking about work that has been done, I would look at the work on online gender‑based violence in the women's rights programme at the Association for Progressive Communications, the articles that have been published, and the work that's on GenderIT, where a lot of it is assisting this effort.  There has been work around trying to understand data, not just looking at online gender‑based violence, but at the data business model that currently exists and how exploitative it is.  In a submission I worked on last year, thinking about online gender‑based violence especially within the COVID‑19 pandemic, connected to disinformation, the point was that the prioritization of regulation around disinformation during the pandemic was around health information, making sure people don't have disinformation.  However, that meant that something such as online gender‑based violence, which is a topic we have continuously asked big tech companies to focus on unpacking, hasn't been prioritized.  There are some pieces of work, but there is an opportunity to really unpack how we can influence a shift towards addressing and focusing on gender‑based violence, so it's not just us trying to hold the companies accountable and them trying to provide solutions that would fit into existing business models.  That's my response, thanks for the question.

>> AMRITA CHOUDHURY: We have ten more minutes.

>>  BRUNA SANTOS:  We have a hand up in the room.

>> Audience: I want to echo what Chenai highlighted.  Oftentimes when we are talking about curbing online gender‑based violence or making the online space safe, the issue of freedom of speech comes up: you don't want to do that, because then you're going to block freedom of speech.  In my understanding, freedom of speech goes both ways; we tend to look at one side of the coin, for those who want to be vocal and critical, which is okay, but there is the other side of the coin.  If I'm going to be abused, harassed and pushed out of the online space, is that not hindering my own freedom of speech?  When we are talking about freedom of speech, it's important to have that balance and make sure that a certain group isn't empowered to take away the other group's voice, because if you push me out, you're shutting me up, so in another way, you're making it impossible for me to exercise my own freedom of speech.  There's a difference between being critiqued, being criticized in terms of the political space, development space, et cetera, and when it becomes abusive in an individual context.

The other thing is, and I think it's very important, I think Amrita mentioned you are not giving recommendations and don't want to be prescriptive.  But given that this is about best practice, you know, my call to action to you is to come up with a simplified best practice, which is not prescriptive, because at the end of the day any group can decide how to use it; at least give some guidance on this topic.  Also, maybe there's an opportunity to bring together, you know, the tech companies with yourselves and with representatives from the groups that you have identified: can we have a session to discuss these issues with them and start that brainstorming agenda?  Finally, one way forward, I always find, is that when an item is being championed by one of the U.N. organizations, it changes the space and how people relate to it.  So maybe we have an opportunity through this, maybe one of our take‑aways can be to bring on U.N. Women: how can we make online gender‑based violence be recognized and be part of the 16 days of activism against gender‑based violence?  We are still in the 16 days right now, but online gender‑based violence is not coming out prominently in those discussions.  That was my final contribution, thank you.

>> AMRITA CHOUDHURY: Thank you so much.  I think, being a timekeeper, I would suggest, Bruna, that we ask Chenai to step in and summarize.  We could keep on discussing, there is a lot to discuss, but Chenai, would you like to summarize, based upon everyone's input, how we should go ahead and what the next steps should be?

>>  CHENAI CHAIR:  Thank you so much, Amrita.  First of all, I want to say thank you to Bruna and Wim, who are holding the space in the room and the space for the conversations to happen.  Sometimes, at the IGF, it is about finding that balance of online participants and offline participants, and I think they've done a great job this time around.  To summarize the discussion, there were so many great points and insights; it was especially valuable having parliamentarians and policy makers in the room who can give us guidelines about the process.  There is a need to potentially think about, not just think about, but work on connecting gender disinformation to the generational issues that continue to exist.

Secondly, a lot of the points that came out were about how we engage different stakeholders: social media companies and big tech giants themselves, as well as cybersecurity entities that are in the regulatory and legal space, and really thinking about committing them to solutions that strike a balance of rights, so that at the end of the day we don't have an issue where some people are making regulatory decisions that are going to limit the rights of others in this space.  I think that's always a conversation when we think about ensuring gender digital rights.

Another point that I thought really stood out was thinking about how the definitions that we have connect to existing issues already in play.  The strong point that was raised was that these issues have existed before, and that those who work on human rights violations should tie gender disinformation back to those issues, so that at the end of the day we are talking about the lived realities of people and adding the gender disinformation perspective to them, not saying it's just happening and not connected to any existing issues.

I think there were examples raised that we could potentially include in the report, different case studies.  It would be great if people who have these examples could share them as links; there are some links I copied from the chat, examples raised from the Philippines, and I think some from Pakistan and Tanzania.  It would be great to have these edited into the report so we can document them as links and resources for some of the interventions.  I think lastly, as an action point, there's been a strong call that we give guidance or at least a baseline best practice, and I often like to call them best fit practices.  So setting it up in that perspective, of having a best practice that we can then say can be adapted to different regions and different localities, is something that we would definitely think of working on.  I encourage everyone in the room who might, from this conversation, have best practices they think are important to include, to share them on the e‑mail address, contact agenda at inetgovforum.org; share them with us, and we will put them in the document.  I encourage everyone to join the mailing list for the gender and digital rights best practice forum, if you're interested in being part of the steering committee for 2022 or in continuing this work.  Like we always say around gender, digital rights and access, we continuously push for it in the Internet Governance Forum because we want a place for people to have this conversation; it's an important topic and we want to influence the space within the IGF.  Those are my key takeaways, and I encourage people to write to us; I'll ask them if we have a deadline for when we finalize the report.  Thank you so much for participating in the session, and we really, really look forward to continued collaboration with all of you.  Amrita, I'll hand it back to you.

>> AMRITA CHOUDHURY: Thank you.  We have four minutes.  Wim, if you would like to say something, and then Bruna.

>>  WIM:  A quick reaction on deadlines: for this year's report, we would like to invite quick reactions, because the idea is that it is finalized and included with the rest of the official outputs of this IGF before the end of the year.  So I would say, if you can send reactions in the course of the next week.

Then it is for the new MAG to decide on, or select, the best practice forums for next year.  So concrete suggestions of what a gender BPF could be on this topic, what could be in the proposal to continue this work, would be really welcome for the coordinators to work with, to come up with a new proposal to continue this work into next year.

>>  BRUNA SANTOS:  Just to take the opportunity to thank everyone who has been involved with the BPF this year, all of the specialists we spoke with, like Ellen Judson, Courtney Raj, and a lot of the people who joined our sessions too.  This was all the beginning of a process.  If anyone attending the session has any ideas or suggestions, as we were saying, for how we should continue to develop the topic in the upcoming year, suggestions on how to move the work forward are all very much welcome.  So thank you all for being here, and, I guess, yeah, we can bring the session to a close.

>>  WIM:  We have 1:30 left.  Thank you, everyone, and thank you to the technical team in the room and to the people who follow us here and are connecting to us.  I know for some of you these are very early or very late hours, but still, thank you for joining us.  Thank you.

>>  BRUNA SANTOS:  Thank you and bye.