IGF 2022 Day 1 Networking Session #77 Online Gendered Disinformation from a Global South Perspective – RAW

The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> FERNANDA MARTINS:  My name is Fernanda.  I'm an InternetLab director.  We are based in Brazil.  A couple of years ago we started trying to understand the creation and circulation of disinformation.  We consider gender disinformation to be one more type of gender-based violence.  To understand this social phenomenon, we need to assume that gender-based violence is a structural inequality that marks all of society in different ways, online and offline.  In the Brazilian case, for example, we understood that gender disinformation is associated with narratives that reinforce gender inequality, including when it is propagated in support of the current government.  To provoke us today, I invite our speakers to introduce themselves and make a brief speech to answer the question.  Can I advance the slides, please?  Next.  So I forgot to tell you about our agenda.  I will direct one question to the speakers.  After that we will discuss in breakout groups, and then I will ask one of the facilitators in each group to tell us briefly what was discussed, because we have one hour, so it's very fast. 

The next, please.  Awesome.  I think I forgot the slide.  Okay. 

Please, Danya is first in line.  Thank you, Danya, because Danya is in Mexico City, and there it is midnight, so it's an aha moment.  Danya, could you tell us a little about your work and how you are thinking about gender disinformation?  And please introduce yourself. 

>> DANYA CENTENO:  Hello.  Can you hear me?  Hello?  Is it working now? 

>> FERNANDA MARTINS:  Yes, we can hear you, but we can't see you. 

>> DANYA CENTENO:  Now? 

>> FERNANDA MARTINS:  Yes, now.  Thank you. 

>> DANYA CENTENO:  Wonderful.  Thank you.  Thank you so much for the invitation.  It's such a pleasure to be here.  I'm Danya Centeno.  I'm a lawyer from Mexico, and I specialize in human rights, particularly in the digital sphere.  I used to work with R3-D, a Mexican NGO promoting human rights in Mexico, and I used to work at Twitter.  That's pretty much my background.  At R3-D, and later on at Twitter, I specialized in safety work, particularly on gender-based abuse online from a gender perspective.  At R3-D we brought together a collection of NGOs working on this in Latin America, and then at Twitter I was also in charge of conducting several workshops and sessions to hear directly from NGOs and experts in the field about how they were experiencing the platform, and I must say, of course, gender-based misinformation was always an issue.  It was often mentioned and often flagged because it has many different impacts that disproportionately affect women and LGBTQ communities.  It's a real problem because, of course, there is no one-size-fits-all solution.  There is no clear answer, and understanding the complexities around this issue is a very complicated task. 

So it's very important that we have sessions like this to discuss, especially from a global perspective, how this issue is impacting our region. 

What I can say, at least from my experience working at Twitter and also at R3-D, is that we really need to take our time to understand the root causes of this phenomenon and how it can spread as much as it does, and also the fact that it is often really complicated to link online misinformation to its offline impacts.  That is, I think, one of the main priorities.  I cannot speak on behalf of Twitter anymore, of course, but overall I would say the real job is making sure we can prove that linkage: how misinformation is being spread online and how it is actually targeting and impacting women and non-binary communities offline.  It's not easy, of course. 

There's so much of an emotional impact that it's often not easy to prove, and I think that's one of the main areas of concern.  There is also the question of how this can take so many shapes.  For example, there are now groups dedicated to doing this that might receive some sort of money or payment, and how can you track that?  When it comes to platforms, it's also difficult for them to conduct such an investigation, because there have to be policies in place, including from governments, to make sure this issue is taken into consideration, because platforms do not have the tools to investigate what happens outside of them.  Sometimes the coordination might take place on the platform, but sometimes it might take place outside the platform, and that is what is really tricky to trace.  I wouldn't want to extend further than that, and I think I rambled a little bit, but all that to say that it is, of course, a very prominent issue that needs to be addressed and better understood, and that we need many stakeholders engaged in this conversation to make sure we understand all its different aspects.  Thank you. 

>> FERNANDA MARTINS:  Thank you, Danya.  You bring up some important questions related to the need to understand that the circle of disinformation is not restricted to the online environment. 

In the presidential elections now, we watched a very particular movement, because online gender-based violence, political violence, and violence on the streets were very connected, and I think that is part of the problem we need to think about deeply. 

Now I ask Malavika.  Thank you. 

>> MALAVIKA RAJKUMAR:  Thanks, Fernanda.  Hi.  My name is Malavika Rajkumar.  I'm a lawyer, and I work at IT for Change in India. 

So I guess a lot of the issues that were spoken about just now, and a lot of the gender disinformation practices that happen across the Global South, share many similarities, especially around election time, like the year before an election.  That's when we see a spike in gender disinformation. 

India especially is one of the largest democracies, with one of the largest youth populations and one of the largest bases of internet and social media users across the board.  So you can imagine the sort of violence that women politicians face online, especially around election time. 

One of the main insights I would like to bring to today's session comes from a study we did on abusive and misogynistic trolling directed on Twitter against women in the Indian political and public sphere.  We did the study because we wanted to understand the nature of online hate and patterns of abuse.  Secondly, after understanding this, we wanted to see what sort of regulatory frameworks can actually be put in place, and can be suggested, to address this issue. 

I'm going to go straight to the findings, but we actually did an exercise where we went through 30,460 mentions and annotated them in a painstaking process using 19 codes, which we then consolidated into seven codes.  For instance, if we take the theme of threat, we had intimidation and violent explicit threats as categories.  After the annotation process, and after we focused on abusive, misogynistic, and violent mentions, we were able to get a lot of interesting insights, which I would like to share here. 

First is the pervasiveness: it's so normalized in the Indian Twitter sphere that it's not even surprising anymore.  Even as a user of Twitter, I know that if I put any political message out there, I will face some form of violence.  One thing that is clear is that women involved with the ruling party all faced violence.  Specific targets of violence included women who were left-leaning dissenters and women belonging to opposition parties.  This also included women who were not even on the platform. 

There was a hashtag with the name of a woman belonging to an opposition party who was not on the platform.  The intersection that received a lot of hate was mostly women who don't have party affiliations, especially some political commentators.  As for the way trolls actually attack women, they tend to tag certain women together to deride them, warn them, or intimidate them into leaving the platform.  This specifically included Muslim women and Dalit women; that is the caste-based violence that is seen in India. 

Another interesting finding was the light-hearted trolling directed against these women.  These were not the overtly grave threats, which were also visible and included violence, rape threats, and death threats; these were milder, "fun", abusive sorts of wordplay or memes, and they also create a lot of psychological impact on women who are seeing this all the time. 

So the abuse rarely had anything to do with their work or politics.  That is one of the core research findings.  It was mostly to do with their bodies or their character.  If it had anything remotely to do with politics, it was to attack the woman's credentials, trivialize her role in politics, or talk about her appearance, including "all beauty, no brains", which is an actual example we have, and a sort of derailing where you take a random incident that has nothing to do with them, make it their issue, and perpetrate a lot of violence on that basis. 

Another overarching subtext we found is the role of patriarchy.  First are the angles of shame and honor that kept coming up again and again.  The terms shame, shameless, and honor (speaking Hindi) appeared a lot; there are a lot of Hindi words.  "Hang your head in shame" and "shame on you" were used a lot.  One of the mentions says "poor father, shame on her."  These are the kinds of attacks that women politicians were receiving, and it's very clear: in South Asia, women are treated as subjects of male control.  Their bodies become repositories of community values, sites where family, community, and even national honor play out. 

Another angle is caste-based violence.  Women who were attacked, particularly those from other castes, were attacked on the basis of their purity or honor. 

Secondly, their merit was also called into question, as in Hindi (non-English spoken): "What are you doing here?"  That was usually posed to them. 

Hate was very visible, and one of the main reasons gender disinformation becomes disproportionately high is when there is a religious angle to it.  So the women were attacked for many reasons, including attacks referencing nationality, asserting exclusive claims over national identity, alleging collusion with foreign intelligence, and treason based on hatred for the religion itself, which means distorting the role that Islam plays in India; the term "jihad" was also used quite a bit. 

My last point is that a lot of objectification also happened, where women's bodies were subjected to male desire or hyper-sexualization: for instance, if there is a picture of the woman, a deepfake would circulate, and trolls would come onto her Twitter thread and abuse her. 

There's a lot more, but these were the primary experiences that women are facing in India, and I'm pretty sure that if we did a study on Facebook or any other platform, we would get very similar results.  Thank you. 

>> FERNANDA MARTINS:  Thank you.  It was very interesting to listen to you about the way we are facing the violence and the categories that we find.  There are things that are similar across different cultures in the Global South.  When I was listening to you, I was thinking about the Brazilian context; I could be saying the same or very similar things. 

Moira, can you? 

>> MOIRA WHELAN:  Sure, I'm happy to, and I think that is the perfect point to segue to what I wanted to share with you all today. 

So the National Democratic Institute, as some of you heard me say yesterday, came to this issue, we like to say, because it came to us.  The issue of gender-based violence and online violence against women in politics was the number one reason globally that we understood women decide not to run for office. 

So from that standpoint we've identified it as existential to the future of democracy, but also a game‑changer, meaning if we can make progress in this particular area, we can really change the nature of elected systems around the world. 

But we really came to this, and started from it, because we saw studies like the one in India and the one in Brazil, and we were noticing that in each space it was treated as, oh well, that just happens here, it's the price of doing business, you've chosen to enter the public sphere so this is the cost. 

What we wanted to do is demonstrate the global challenge we're facing and show that these attacks extend beyond threats of violence into gender-based attacks, including the religiously motivated ones we heard about, or, to use a trite but very famous U.S. example, "but her emails" about Hillary Clinton, right?  It's hard for an algorithm to identify that.  It's hard to identify it as a gender-based attack, but in fact, it is. 

We had done a significant amount of research and a significant amount of work in identifying hate speech, including lexicons that we shared with platforms. 

We found ourselves at a crossroads where what we wanted to do was go back to the same political women and ask them what the solutions would be.  So we looked at three different tracks, for tech companies, governance, and civil society, globally convened, I believe, eight different working groups, consulted with about 100 women in the political sphere, and identified 24 interventions that we now look to.  You can find them online at NDI.org; it's the interventions for ending online violence against women in politics, and we view that as sort of a menu, right?  Because what we also learned is that regionally there are different opportunities for success.  In some areas legislation may help.  In other areas legislation is simply not possible, right? 

We view that as a menu, and now what we're doing is splitting our efforts.  One is to look at the issue of platform accountability and getting platforms to come to the table and address this issue.  The other is to look at governance structures.  We feel NDI has a unique contribution to make around things like political party codes of conduct and conduct within parliaments, things through which we can really help change the political discourse on this issue.  What we're looking for is to create a normative framework where this is taken off the table. 

So, hold on one second.  I think there are just two points I wanted to leave you with.  One is identifying one of the challenges we're facing right now with tech companies.  Fernanda joined me, along with a group of other women, I think Irene was here, in visiting with tech companies very recently.  I would point out that I think we are at a crossroads and a major challenge, where the most significant regulation the tech companies have ever seen is coming at them from the EU, but at a time when their industry is facing a lot of financial challenges.  We're also witnessing the challenges with Twitter. 

That poses a big problem, because what we saw in this particular area was that the platforms' desire is not to be blamed for bad things happening, which is ultimately where we are.  Where NDI is, we want to create a positive information space for the full inclusion of all people.  Where they are is very much "how do we not get blamed?"  That then matches very closely with the cultural dynamics we see around the world, of this being the cost women face if they decide to get involved in politics. 

When you match those very nefarious attitudes with budgetary concerns, and with the fact that, as we heard over and over again, this is a cost center for tech companies rather than a value proposition of what a platform can bring to society, we face an uphill battle. 

On the government side, though, I do want to leave us on a point of optimism.  We also talked with other governments that are working with the partnership to end online violence against women in politics; you can Google that as well.  There are eight allied governments forming an alliance to address these challenges, the likes of the United States, Denmark, Australia, and South Korea, really looking at and concentrating on this issue.  That's coming.  It's co-chaired by the United States and Denmark, I believe, but it also comes at a time when UN Women will convene around CSW and use technology as one of the convening issues bringing them together. 

I think one of the things we have to look at as a community is how we mobilize and utilize those unique opportunities, with the stars aligning in that way, to really see real progress on these fronts and to identify which interventions we can push them towards. 

So I would leave us with one point, which is basically that when we look at CSW, when we see those opportunities, we want to make sure we're capturing both sides of this challenge.  One is individual violence, right?  Women facing attacks online, things that I'm quite sure we all in this room have experienced.  The other is where it is weaponized for political purposes.  We want to make sure to capture both elements, because they require different interventions and different approaches, and we would be doing a disservice to what we can achieve as a community if we limit ourselves to just one perspective versus the other.  So I think we should take the broadest possible aperture and recognize that political aspect: not only is it being weaponized, but it's also our opportunity for change.  It's the women that are going to make that change for us.  I can stop there. 

>> FERNANDA MARTINS:  Thank you, Moira.  I think it's important to consider that when we talk about gender disinformation, and consequently about gender-based violence, we are talking about something that impacts all of society: not only women, but democracy.  One of the solutions that appears when we talk about it is content moderation, or on the other side deregulation, but we have so many challenges related to language and to context in different countries, and I think we need to be more creative and imagine other solutions. 

I invite you to divide into four groups, I think, four or five groups.  Please show the slides again.  You will discuss the experience you have with this debate, or the questions you have, or how we can think about this problem in a cross-cutting way; it is a global problem with different actors and stakeholders, and we want to discuss how we can improve or mitigate it.  There are some guiding questions here that you can use, and please introduce yourselves in the discussion.  I will let you know when the time is finished.  Okay?  Thank you. 

So, four groups.  I think we can use the way you are seated. 

(Breakout groups)

(Silence)

>> FERNANDA MARTINS:  Can you hear me like this?  I will bring you to one of the groups then. 

>> DANYA CENTENO:  Thank you.  Can everyone hear me, or just that group?

>> ALICE LANA:  I will bring you to the smallest one.  Is that okay? 

>> DANYA CENTENO:  Thank you. 

(Breakout groups).

>> FERNANDA MARTINS:  We have two more minutes.  Okay? 

(Breakouts)

>> FERNANDA MARTINS:  Sorry, we need to return.  Please, one person from each group present what you discussed.  Hi, Danya.  I think they were discussing both questions at once, from the part that I followed, the ones that were on the slide. 

>> DANYA CENTENO:  Thank you.  There were three questions, no?  So I think I can start by summarizing what we debated in our group.  The discussion was so rich, and the time finished fast.  The first thing that came up in my group was the fact that political gender-based violence and the number of women in parliament and other public spheres are very related, because such a phenomenon limits women.  We talked about the different women that are targeted by political gender violence: not just politicians, but also activists and journalists.  This points toward legislation, but legislation is not enough, because we understood that this is not related just to the online environment, but also to the offline one.  So it's a good thing, but it's not enough, because we have more challenges.  Education also appeared in our group as crucial and essential, because sometimes people know what fake news is, what disinformation is, but they decide to believe the things that make sense for their religion and the groups they belong to.  So it's a huge issue to address.  I think that was it.  I don't know if you want to complement something.

>> AUDIENCE:  I feel like big tech and algorithms should be considered as part of the equation.  These are huge, multi-billion dollar companies, and they have all kinds of human resources, technical expertise, and capacity to end hate, to address the nuances of languages, and to hire people from different countries.  There's a financial model to hate speech that's making money.  It keeps people engaged; it's motivating people to be online and get attention. 

Once we remember that these big tech companies are actually behind all of this hate speech, then we can push forward for, just as you said, platform accountability.  That's something I would like to highlight. 

>> FERNANDA MARTINS:  Thank you.  Maybe the group on this side.

>> AUDIENCE:  I think in our group we were sharing experiences.  I thank everyone for sharing their experiences. 

We were speaking about how different local contexts really play out in developing the narratives that sustain gender disinformation.  We had Sri Lanka, Uganda, and Ethiopia, and we even had an example of someone in our group who works with fact-checking who, as a woman, has been targeted with hate and insults.  This is something we spoke about a lot: working with disinformation while being a woman and having to face those narratives.  I don't know if someone else wants to add anything, but I think local contexts and narratives playing a role is mostly what we discussed.  We didn't really get to discuss platforms or solutions; we were really involved in hearing the stories.  Anything else?  Thank you, group, for sharing. 

>> FERNANDA MARTINS:  Thank you.  You?

>> AUDIENCE:  We had pretty much the same: we shared our experiences.  We had Brazil, Ghana, and Germany in our group; you are from Germany too?  We were talking about the problems in our countries, the solutions, and the effectiveness of the strategies.  We pretty much shared and tried to learn a bit more about the differences and the things our cultures and societies have in common.  That's pretty much it.  If anyone wants to share more. 

>> FERNANDA MARTINS:  Thank you so much for sharing.  Maybe to add on: some of the entry points for action we've been discussing were to think about alliance-building and organizing amongst journalists, civil society actors, activists, and also politicians to tackle this issue, because legislation often takes a lot of time. 

Another point has been raising awareness amongst police officers, because prosecution also needs people who actually follow up on the case, and another point, which has been mentioned before by other groups, is biased algorithms and AI, and how to basically unbias the algorithms that are supposed to detect this.  Thank you. 

>> FERNANDA MARTINS:  Next, please.  Thank you.

>> AUDIENCE:  I think I'll do a quick summary, and then all of you can definitely add on.  It's been a great group here; they've had wonderful insights and also really effective solutions.  So, to summarize quickly: in Tunisia a lot of women candidates are targeted, especially on the basis of their gender roles, but it's very interesting because they already have an online gender-based violence law, and political violence is a specific category that has now been adopted.  So a lot of trainings are required, a lot of SOPs are required, as you had mentioned, but again, it's the victim-perpetrator binary that is being perpetuated again in law. 

Again, you basically have to take the trolls, or whoever is harassing you, to court.  In Kenya, especially during the election period, there are a lot of disinformation campaigns, and creating awareness through multi-stakeholder capacities to connect is one of the solutions that has actually been working. 

Different forms of violence exist on the internet, but it's important to understand the post-trauma experience of women who face the dichotomy where they lose, and they are also trolled.  Online gender-based violence and campaigns are a reflection of the patriarchy perpetuated in the country, so that's very important to understand.  Again, countries like Kenya and India don't have a legal basis, don't have a law, to actually address this; that's also a big gap.  Judicial training is something we all agreed on across the board, so that is also a good solution that's required.  In the Maldives especially, misinformation is quite normalized, like in most South Asian nations, and women don't really engage or talk about it, but it's interesting how the factors include language attacks, like for not knowing a specific regional dialect or regional language.  At the same time, it's important to understand how languages play a big role, because trolls can bypass the artificial intelligence and still perpetrate this sort of violence online. 

It's also an interesting fact that all politicians, regardless of how high they are in the political spectrum or whether they are in opposition, face very similar forms of violence, and media and community-building is a very good solution that's actually being implemented.  Some good practices in Kenya that have also been mentioned include early reporting as an individual, because there is a group called Kenyans on Twitter.  Sorry, this is just so good; I just want to finish, just one minute, and then we're off.  They have this thing called Kenyans on Twitter where they've been invoking expertise, monitoring, and speaking to social media platforms that way.  For instance, it's important that we look into platforms that have risk governance processes and also look at tackling the governance aspect of disinformation, which also includes the way the DSA has been functioning with a participatory element. 

In Germany, for instance, even though the cases were taken to court, the campaigns that came with them actually put the issue in the forefront, which is quite important.  Just two last points.  One is Nepal, again a South Asian perspective, where no one really cares about the issue, but it's important to note the emotional trauma of actually raising the issue, which is, again, visible.  And resources are slow to come.  If you really want to come forward and bring up this issue, it takes a heavy toll, on the woman especially.  Misinformation and disinformation must be defined in our laws, and it's also interesting to note that you have to tie this up with freedom of speech and expression; this is across all jurisdictions.  Fact-checking is a big issue.  But South Africa has a very good solution, and I will end with this; she has actually been working on it.  There's a platform run by civil society where users can come and upload the gender disinformation and online violence issues they face.  There's an expert panel that reviews them, a recommendation is made, and it's taken back to the tech company.  That brings me back to a bigger question, which is how involved should government be, and how much more involved should civil society actually be?  We would need a second half for that. 

It was a really great discussion. 

>> FERNANDA MARTINS:  I can see.  We need to finish because the other session will start, but I really appreciate you sharing and being able to listen to different perspectives from different countries in the Global South.  Thank you, and I hope the session provided you with some context so we can make comparisons in the next months.  Thank you.