IGF 2021 – Day 4 – DC-Gender Tangled like Wool: Gender, Social & Digital Inequalities

The following are the outputs of the captioning taken during an IGF virtual intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.



(Video Playing)

>> We all live in a digital world.  We all need it to be open and safe.  We all want to trust.

>> And to be trusted.

>> We all despise control.

>> And desire freedom.

>> We are all united.

>> SHOHINI BANERJEE:  So thank you, everyone who is here.  I think we'll get started and maybe a few more people will be trickling in.  Some of the panelists, I think, are having a little bit of a tough time joining, but hopefully they will be able to join us soon.

So welcome to Gender Tangled Like Wool.  My name is Shohini.  My pronouns are she/her.  I'm from Point of View, a civil society organization working at the intersection of gender, sexuality, and technology in India.  And I will be the moderator for today.

This session will explore the relationship between digital inequality and social and economic inequality through the lens and lived experiences of gender, especially in the context of the COVID pandemic.

Gender inequality is a driver of digital inequality.  After the session we will have a better understanding of the implications of this for policy, and of how to strengthen gender‑equal rights in forming digital policies and Internet governance systems.  We have a wonderful group of panelists.  I'm excited to get this conversation started.

I'll briefly go through how the session is structured.  We have about, I think, 90 minutes.  I will first go through the gender report card analysis, which is part of the Dynamic Coalition's work to look at how gender is integrated into the IGF's discussions every year.  Then we will have about an hour of discussion.  It would be great if participants who have any discussion points or questions could use the chat feature.

For those of us who are joining in person, I'm not sure how it will work, but flag us.  And we'll include it.  But we'll also have about 10 to 15 minutes for question and answer afterwards.

So really briefly, the gender report card is part of the Dynamic Coalition on Gender and Internet Governance's objectives to ensure that the discussions at the IGF are articulated by different genders, as well as to ensure that gender perspectives, realities, and concerns inform the agenda of Internet Governance.  There are usually two components we look at: representation in the IGF discussions, as well as the content.  Since last year was fully online, getting accurate data on participation was challenging ‑‑ the methodology just wasn't strong enough.  So we will only be presenting the information on the content.

So it's a very brief presentation.  Then we can go into a discussion on what today's session is about.

So I think I will be able to share my screen.  Give me one second.

Okay.  I hope everyone can see this.  So the gender report card is analyzing the sessions of 2020 and looking at what type of content was really covered in these sessions.

So out of the 117 sessions that were there last year, about 78 report cards were filled in, which is about 26% less than the year before.  And from that, when we're looking at the relevance of gender as a topic, only 10% showed that there was direct engagement.  About 30% said there was partial engagement.  And 27% showed that there was no engagement at all.

Which is, again, less than the previous year.  Direct engagement in 2019 was higher, so there was a 14% decrease last year.  Partial engagement of gender in these sessions also decreased from 41% to 34%, a 17% decrease last year.

So this is a very quick summary.  I think the report will be going up on the coalition's website in a little bit.  I'm not exactly sure when, but it will be available there.

But this shows ‑‑ given these numbers, we're glad to have today's session to ensure that we are talking about gender within Internet Governance.  I'll start off with our panelists.  If we can just have them introduce themselves, that would be great.  Starting with Mona.

>> MONA SHTAYA:  Hi, everybody.  I'm happy to be joining you today.  This is Mona Shtaya, advocacy advisor at 7amleh, the Arab Center for the Advancement of Social Media.  I'm joining from London, but we're based in Palestine, and we work there.

>> SHOHINI BANERJEE:  Thanks, we can go to Sadaf next.

>> SADAF KHAN:  I'm Sadaf Khan, co‑founder and managing editor at a digital rights organization in Pakistan, mainly focusing on Internet Governance and digital rights issues.  And very happy to be here.

>> SHOHINI BANERJEE:  Thanks.  Then, Avis.

>> AVIS MOMENI:  Okay.  Thanks very much.  I don't know what happened to the camera ‑‑ I need permission to turn it on.  I'm Avis Momeni, from a civil society organization.  I'm glad to be here and share my own experience with participants.

>> SHOHINI BANERJEE:  Thanks.  I think we have a pretty small group, so maybe we can get a round of introductions from everyone ‑‑ from the in‑person participants and panelists.  I'm not sure who is there.  If we can get an introduction from there, that would be great.

>> MALLORY KNODEL:  Poncelet Ileleji and I are here.  I'm Mallory Knodel from the Center for Democracy and Technology.  I'm based in Washington, but I'm here today in Katowice.

>> PONCELET ILELEJI:  I'm Poncelet Ileleji, based in The Gambia.  I'm the national resource person for The Gambian national IGF, and I'm present on site in Poland.  Thanks.

>> SHOHINI BANERJEE:  Are there any other participants or panelists on site?

>> PONCELET ILELEJI:  There are only two, me and Mallory, yeah.

>> SHOHINI BANERJEE:  I think if we can get John and Riddhi to also introduce themselves, then we can dive straight into the discussion.  John, if you can get started?  Or Riddhi, I guess that's okay as well.

>> RIDDHI MEHTA:  I'm Riddhi Mehta.  I'm from Point of View.  I will be taking notes here today.

>> SHOHINI BANERJEE:  Okay.  And John?  I'm not sure if he can hear me.  I think we can dive in ‑‑ it's a small group, so we can talk.  I think it would be great to start by talking about the relationship between the digital inequalities and social inequalities that exist in the region you're working in, through a gender lens, especially in the context of COVID.

So how has this relationship been impacted ‑‑ has COVID exacerbated it or changed it in some form?  And I just want to remind everyone to keep comments to about three minutes, and then we can get a discussion going on the basis of that.  So anyone who wants to dive in, please feel free to do so.

>> SADAF KHAN:  I can start, if that's okay.  So I'm from Pakistan.  I'm not sure how many of the people present here are aware of the demographics of the country, but Pakistan is a pretty conservative country in some ways, with a pretty patriarchal hierarchy in our culture.  When it comes to access to technology, we actually have great indicators with regard to access ‑‑ we have one of the most affordable internets in the region.  But when it comes to ownership of mobile phones or access to the internet, we also have the biggest gender digital divide in the region.

And obviously the digital gender divide is not just related to financial, affordability, and accessibility issues; it has very present and very stark social dimensions.

So ownership of a mobile phone in Pakistan is linked to your national identity card ‑‑ your mobile phone is linked to your biometric data, et cetera.  And because of the nature of this ID system, and the fact that women feel unsafe giving out data and being in public spaces, oftentimes even if a woman is using a mobile phone or the internet, it will not be listed in her name or under her direct ownership.

What happened during COVID: in normal situations, even if you don't have mobile ownership, it's fine ‑‑ people within families divide online time, et cetera.  They make it work.  But during COVID, obviously everybody was completely dependent on their devices, especially when it comes to education.  And households with single devices, or even multiple devices, prioritized male students over female students.

In multiple regions across Pakistan, different public bodies put up public Wi‑Fi that only boys and male students were allowed to access.  Obviously this is a very discriminatory practice.  They had their justifications regarding protection and safety issues, bla, bla, bla.  It was exacerbated during COVID itself.

A, women didn't have access in their homes.  B, they didn't have access in public spaces.  C, they were cut off from places where they did have access before, such as education, et cetera.  Over the last year we have seen the internet in Pakistan become a very, very toxic place.  It was always toxic, but the level of toxicity and patriarchal hierarchy has increased in the last two years.  We have seen campaigns starting organically, without engagement from the usual organizers.  That's proof that it has worsened.  It's not just activists looking at it and saying it's getting worse.  It's actually getting worse.

On the side of health, food, shelter, police, law enforcement, et cetera, there is a complete lack of data.  A number of government institutions, including the national commission on the status of women, initiated discussions and discourses around it and around data collection.  But obviously, when women are unable to connect, they're unable to participate in online consultations, which were the norm during this whole period.

So I feel that, you know, it's not just that the social divides were obviously interconnected; COVID has made things much worse.  COVID has actually made our fight much more difficult, because of the kind of hate that has been normalized over these couple of years.  We have not seen that kind of hate against women, with such intensity, becoming so normalized before.

So I'm sure there are some more examples in the region.  I'm sure Pakistan can't be the only ‑‑ Pakistani women can't be the only ones facing this extra dose of hatred and withdrawal.

>> SHOHINI BANERJEE:  Yeah.  As you mentioned, I'm sure it's not just happening in Pakistan.  In India we've seen how social inequalities have been exacerbated.  And I'm assuming the same is true in other regions as well, if the other panelists can speak to that.

>> MONA SHTAYA:  Yeah, maybe I can jump in.  When we're talking about Palestine and the Arab region ‑‑ I'll zoom in on Palestine because we are working there ‑‑ there is a kind of gender‑based violence targeting women and LGBTQ communities.  We at 7amleh published a gender‑based violence study in 2017, a bit before the pandemic.  During the pandemic, it was worse.

But before the pandemic, our study showed that one out of three women in Palestine is exposed to gender‑based violence in online spaces ‑‑ we're talking about bullying, harassment, and so on.  One out of these three women leaves the internet because of this gender‑based violence.

Earlier this year, we also published research about hate speech on social media platforms, where we asked people ‑‑ in focus groups, but also in a survey of around 1,200 people ‑‑ about the main forms of hate speech on the platforms.  They focused on the political circumstances.  They mentioned the Israeli occupation, but also the political situation between the West Bank and the Gaza Strip.  They did not mention hate speech against women on social media platforms.

Maybe that's a sign of how people see gender‑based violence and hate speech: they don't see hate speech against women as hate speech.  Maybe there is some kind of reflection there of how people classify hate speech.  And we have seen many memes on social media platforms during these two years of the pandemic which really emphasized, and stressed, the very traditional roles for women in their homes ‑‑ the roles they are supposed by society to be doing.  This kind of content on social media is labeling women and putting women in the specific places they're supposed to be in.

Whether the stereotyping is in a "good" way or a bad way, quote, unquote, stereotyping is stereotyping: giving the strong women in the posters a very radical, let's say, look, or giving the women who are exposed to violence a very traditional look.  In most of the memes and posters we're seeing, the women exposed to violence are brown women, wearing a hijab or something like that.  This is stereotyping women on social media platforms.  We are seeing that.

At the annual forum we at 7amleh organize ‑‑ the next one will be held in May 2022 ‑‑ over the past year we have organized around (Audio breaking up) on social media platforms.  All of our speakers stressed that they have observed and documented this kind of stereotyping content about women on social media during the pandemic.  It increased because most people were sitting at home, thinking that all this housework that women usually do ‑‑ that women should take on extra labor to continue doing it.

So that kind of content on social media also means extra effort for us as digital rights defenders to lift that burden off women, because women are not supposed to have to carry it.  That's very traditional thinking about women.

The third thing I would like to highlight when talking about the gender‑based violence that was specifically exposed during the past two years: we all know that since the pandemic started, governments and regimes around the world, and specifically in the Global South, have been keen to normalize the use of surveillance technologies.  They want to normalize that.  They want people to accept it, and sometimes they want people to start demanding these surveillance technologies, under the pretext of saving or protecting the public.  That happens.  We've seen that, and we've monitored that.

But the unfortunate thing is that this kind of surveillance has been affecting women.  Our latest report about the CCTV cameras in East Jerusalem ‑‑ we conducted interviews with women in Jerusalem who stressed that they can't take off their hijab inside their own homes, because they feel they are surveilled in their homes by the CCTV cameras in Jerusalem.  Also, during the escalation in June over the killing of a political activist by the PA, when we took to the streets asking for justice as political activists, the PA was confiscating and stealing our phones.  They focused on women, specifically women coming from very conservative backgrounds, in order to put shame on them.

So after stealing our phones, they shared private pictures from these phones on social media platforms, in order to intimidate women and prevent them from taking to the streets again to ask for justice, or even from expressing themselves.

As a result of that, we see how women face much more complicated issues on social media platforms, and in digital spaces generally, when it comes to topics like hate speech, gender‑based violence, surveillance, and so on.  Because of that, I'm glad to be here today so we can share experiences of what others are doing and what we at 7amleh are doing to protect women, and think together about what we should do.

That's for Palestine.  But we will definitely continue and talk about the other things we did, and what we are planning to do in the future.  Thank you.

>> SHOHINI BANERJEE:  Thanks, Mona.  We definitely want to get to talking about the types of strategies that are being used, and which ones are effective and rights‑affirming and which ones are not.  You touched upon some of that.

>> SADAF KHAN:  Just one ‑‑ sorry.

>> SHOHINI BANERJEE:  Yes, please.

>> SADAF KHAN:  Something that Mona was saying ‑‑ she was describing these different elements of speech that happen online that are not traditionally considered violent but do inflict violence on women.  I think, given what we have seen in the last couple of years and the way the internet has developed, there's a need to rethink how we define violence against women online: considering an environment that may not be violent or hateful or inciteful in a very direct manner, but where the culture promotes stereotypes and images that inflict violence on women in many different ways.

That's something we don't really think about.  Dealing with that needs a very different strategy than dealing with direct incitement to violence or hate speech.  It's interesting to see it being brought up.  It's something I've been thinking about a lot.

>> SHOHINI BANERJEE:  Yeah, thanks for adding that, Sadaf.  When we're thinking through how the exacerbation of social and digital inequalities during COVID has highlighted certain things for us, the policy implications for speech ‑‑ and what constitutes violence against women online ‑‑ would be a very interesting thing to look at.

I want to just ask if Mallory or Poncelet want to add anything from the regions they're working in.  Mallory?

>> MALLORY KNODEL:  Sure.  I can weigh in.  I'm going to speak less about the direct effects Mona and others have described, which have been super fascinating.  I think they do factor into the work that I'm doing.

I'll start a bit further back.  I'm the chief technologist at the Center for Democracy and Technology.  I started my work as a staff member at the Association for Progressive Communications, which has a robust women's programme.  At that point I came to understand a lot better the intersectional feminist issues at play in Internet Governance.  My role right now, in particular, is to engage the technical community in the internet standards‑setting space.  It's very far removed from direct community engagement.

At the same time, though, the challenge in my role as co‑chair of the Human Rights Protocol Considerations research group of the Internet Research Task Force is that I'm looking for areas of study, or ways to bring in the voices of those most affected, from an intersectional feminist perspective.

That effectively means I'm always doing two things simultaneously.  One is figuring out how to lower the barriers to broader participation from multi‑stakeholder communities ‑‑ involving women, gender diversity, racial diversity, and diversity in geopolitical location.  That can happen through diversity and inclusion efforts.  I have spent a long time, up to the present day, trying to get a draft through the task force that talks about the use of language ‑‑ actually stopping the use of oppressive terms like master/slave and whitelist/blacklist.  It's not published.  That's a different story.

That's one track: lowering barriers and getting people involved, bringing in perspectives that hopefully inspire or influence the engineering that happens.

Then the second track is figuring out what substantive issues overlap between intersectional feminist analysis and protocol development.

So I think there's more exploration needed on that latter piece.  I think this Dynamic Coalition is exactly the kind of community of practice that's needed more in Internet Governance for getting more participation.  I think the report cards could be replicated or shared with other bodies, initiatives, and internet governance organizations that come to the IGF ‑‑ which at the very least need to know this Dynamic Coalition exists ‑‑ and they could pick up some of the methodology.  I came to actually explore that as a separate area of work.

Again, on the substance ‑‑ because it's a much harder one, for me anyway ‑‑ I think about what we can say about the ways the internet is designed, from the hardware to the physical infrastructure to the protocol and application layers.  What are those design choices?  What are the opportunities to improve the internet for the folks who are most disadvantaged and don't have as much access?

We approach this problem similarly with the human rights framework in my research group.  We do have an RFC ‑‑ an RFC is an official publication of the task force ‑‑ on human rights as a kind of standard in its own right: a standard for, obviously, states' obligation to respect human rights, and then protocols as standards, and the areas of overlap between them.  That RFC exists.

I wanted to do the same thing, but instead of using the Universal Declaration of Human Rights, I wanted to use the Feminist Principles of the Internet, which were a bottom‑up effort.  I participated in some of that as a staff member at APC, and now as a member; it's ongoing work ‑‑ going principle by principle, looking at what issue areas are uncovered that might affect protocol design.  It's not as easy a task as it sounds.  There's not always a whole lot there.

That will be an ongoing effort.  But for researchers out there, or folks interested in solving this problem: we keep our research open on GitHub, and we meet three times a year as part of the Internet Engineering Task Force.  There are plenty of opportunities to come and work with us.

Another one that just came up recently ‑‑ this is actually just at the idea stage, but I think it's pretty interesting ‑‑ is that there's another research group, the Privacy Enhancements and Assessments Research Group, that looks at the right to privacy narrowly.  The reason I mention it is that there's a body of work in there that I'm currently responsible for, on censorship methods.

So this research document outlines every single conceivable way, from a technical perspective, that censors are able to block, filter, or throttle content.  It's pretty technical.  Based on that approach ‑‑ and I think it would be useful if it ever gets finalized and published ‑‑ the idea was to take spouseware; I don't know a better term for it.  There's a huge market for these apps: you can download them onto your phone, or your intimate partner's phone, and track her movements and read her texts.  These apps are ubiquitous.  I suspect there are probably only a handful of really robust app platforms responsible for them, and the rest are copycats.  The idea is that we take all of those apps ‑‑ there's plenty of research that breaks down how they work ‑‑ and test them for the sorts of protocols they use.

I think it would be useful to document, in the same way we documented the technical mechanisms of censorship, the internet protocols that are being exploited or abused or leveraged to track people online.  Because one mistake I see made quite often is that we trust endpoints.  When we're designing secure protocols, we assume our home routers and end devices are effectively good user agents ‑‑ proxies for what we want.  That isn't true if you have spyware or spouseware on your phone.  It's not true if someone using the same Wi‑Fi router has higher technical expertise than you and can leverage that power differential to spy and to track.

So ‑‑ there's not been a single word written on that research project yet.  I'm giving you a peek at what might be a useful area of study that could help inspire, and shift the imagination and perspective of, those who are designing internet protocols, so they understand some of the challenges and threat models they hadn't yet considered ‑‑ threat models that are probably not usually brought up in engineering spaces, where there is such a heavy monoculture of privileged, western, white men in mid‑to‑late career working on internet standards.

I wanted to share those, and hopefully that is interesting.  I appreciate hearing the stories of how affected communities experience oppression that is exacerbated by technology.  The theory of change is that if we can make the way the internet works a little bit more secure for those most affected, then everybody benefits.  So I hope we keep having this dialogue.  Thanks very much.

>> SHOHINI BANERJEE:  Thanks so much, Mallory.  Before we move on to Poncelet and others, I wanted to ask ‑‑ you were giving a very different perspective from our first two panelists.  You spoke about design, the way the internet is designed, about apps ‑‑ the technical components of the internet.

So during the pandemic, have there been patterns that have emerged ‑‑ maybe increased usage of these technologies ‑‑ that have highlighted how unequal things are, or what types of barriers already exist?  Or is this just a continuation of what was there before, and COVID hasn't really impacted or changed it as much?  If you can speak to that a little bit, then we'll move on to the other panelists.

>> MALLORY KNODEL:  Certainly, I'll do my best.  Usage has certainly gone up.  The technical community is really patting itself on the back because the internet held up, and I wouldn't disagree ‑‑ I think that is true.  But I think a lot of that is due to the centralization of services.  We've got some really big companies that were able to handle the load and worked together to balance it.  I don't think, for example, we saw a really robust, totally decentralized internet contributing to that in all ways.

We look to reports out of the ITU and ISOC for that sort of health‑of‑the‑internet data and figures on increases in access.  But at the same time, we shouldn't take those high‑level figures at face value.  There must be some detail in them.  If devices are not well distributed, and there's a gender dimension to the ways we have access to devices, then I think that has implications for the ways we design tools for remote education and remote work.  We often imagine that people who are connected have at least one device ‑‑ some of us have more than one.  But in many places you've got children, teenagers, elders ‑‑ a variety of folks who may actually be sharing devices.

So I don't know if enough has been done to really interrogate the differential use of devices, and what that means for security, privacy, and access.  That's another place we could look.  If we were to dig a bit deeper into it, COVID has given us a lot more data on that than we would have had otherwise.  I guess that's a silver lining.

>> SHOHINI BANERJEE:  Thanks, Mallory.  Maybe we can explore those aspects more when we talk about strategy and policy.

I think Poncelet was going to add to the conversation.

>> PONCELET ILELEJI:  Thank you very much.  Thank you, Mona, Sadaf, and Mallory.  I look at it from the perspective of living in a developing country like The Gambia, and at the gender inequalities that emerged during COVID and are still present now, and what we've done about them.

I'll take The Gambia as a case in point of what we have done in terms of getting people, especially women, to be more digitally included in programmes.

A good example to start with: we just had our presidential elections, and more women voters were registered than men.  Out of the 950,000‑plus voters registered, about 55% were women.  That was impressive.  But the sad fact is that a lot of the political parties were using social media to disseminate information, and in The Gambia one gigabyte of data costs $5.  When you look at that, many of the women who were digitally excluded ‑‑ not well informed enough to exercise their voting rights, which are fundamental human rights ‑‑ were excluded because they couldn't afford data.

So you have to choose between affording data and feeding your family.  To hear all these messages from the political parties, most of these women relied on their husbands and the like, going online once in a while.  Cost is a big issue, especially for disadvantaged women.  We had less of that problem with young girls going to university during COVID, because the telecom companies made Google Classroom accessible for free for students at the public university in The Gambia.  That was very good.

Among our population of 2 million, for every three homes, one person has a mobile phone.  Mobile phones have become cheaper, but using them has remained very expensive.  That has excluded a lot of people.

I'll give an example of a project we did earlier in the year with rural women in the horticultural industry, funded by the Association for Progressive Communications.  Most of these women complained about how they struggled to access markets, because everybody was using the web; to take a picture, they were relying on their children.  In carrying out that project and justifying the results, we had to make sure data was provided for the over 600 women in that programme.  That data helped a lot ‑‑ not only for them to look for information, but also for them to check information on health, for their babies, on nutrition, and so on.

So the inequalities that have existed in a developing country like The Gambia could really have been bridged if costs were lower.  We have only one submarine cable, and all the companies that took loans to get that cable are still paying them back, which keeps costs high.  We don't have community networks.  Even though a universal access policy was developed in 2020, just last year, there has been no real initiative to make it work so that all our public institutions are connected.  Our hospitals, during COVID, were crying out that they needed data.  The data wasn't coming forth, though some said we have to provide data for them.

But a plus is that, because mobile phones are cheap, when the government had to make payments to people who were really affected ‑‑ they did an aggregation of statistics by estimated poverty level, and I think close to over 100,000 people were "the poorest of the poor", as the government put it, in different neighborhoods ‑‑ they transferred most of the money through mobile money.  Mobile money was not really popular before COVID; even though it was there, we weren't using it everywhere.  But all of a sudden mobile money became very popular, because people were receiving money from the government to support them through COVID.

And those mobile devices, with their phone numbers, helped the government when distributing the basic commodities that people use for their daily living: rice, sugar, and oil.  The government was able to use the data from the regulatory authority to know where people were located.  It helped a lot.

I will stop by saying that it is very, very important for a developing country to have community networks that will support women.  Even if you look at these women as illiterate, all their children are going to school, and the children are the ones who use the mobile phone.  I'll give a simple example from COVID.  There was a lady who used to supply me fish.  During COVID, she found the best way: she would speak in her own language, record a voice note, and take a picture of the fish.  If I said it was okay, she would give me the price ‑‑ all by voice recording, without typing text or anything.  She used that with several of her customers.  I had to show her how to take the picture; on a particular Sunday I sat with her and went through everything.  She said people forwarding her pictures of fish to order increased her customer base.  It shows most women can use social media.  I think if we get them more included, especially in the Global South, we can make the world a better place ‑‑ looking at the roadmap for digital cooperation which the United Nations has done.  On the third point: those inequalities are important for us to address for vulnerable communities.  Thank you.

>> SHOHINI BANERJEE:  Thank you for sharing that.  I think you highlighted some of the issues about affordability, but also how political participation for women was impacted during COVID due to this lack of affordability.  Yeah, and you also spoke about some of the strategies that we could think about.

Others, I would like to get your input on what has been the impact of COVID on digital inequalities and social inequalities in the regions where you're working.

>> PONCELET ILELEJI:  The impact is that we have now realized we have to be connected, and connectivity has become very, very important for every family in The Gambia.  So we have conference calls, and the use of Google Classroom, which was only popular in private schools ‑‑ public schools are now using it.  Women are now working together to do, what you call, joint usage of a phone with data.  It helps them get information, especially in relation to health, education, and agriculture.  We now have a lot of start‑ups that have come up that deal with delivery services.

So overall, despite the sadness that COVID has brought, it has spurred innovation.  That is great.  Among communities that relied on doing things face‑to‑face, it has really spurred innovation.  That is the plus that will make us ‑‑ not only in The Gambia or any developing country but the whole world ‑‑ resilient for any pandemic that may happen.  Thank you.

>> SHOHINI BANERJEE:  Thank you.  Yes, Avis, if you can just share your inputs on this.

>> AVIS MOMENI:  Thank you very much.  So at this minute I am speaking from the perspective of Civil Society in Cameroon, and the experience we have had during the COVID pandemic.  So regarding Cameroon's situation: Cameroon is located in Central Africa.  We have more than (?) people, of whom about 51% are female, and more than 24% of this population is based in rural areas.  During the COVID pandemic, the rural areas were the most affected, as some areas (?) or specifically areas of supporters.

In many communities, the projects implemented were supposed to reduce the digital divide, but private mobile operators had little reason to invest in the rural areas.  So as a consequence, the rural population ‑‑ specifically women and girls ‑‑ did not have access to COVID information, such as the measures against COVID‑19.

So what was the impact?  People received information from national radio or from community radio, but in most of the rural areas people were not informed about the prevention of COVID‑19.  So due to this lack of information in the rural areas, there was a lot of disinformation among this population.

We observed that the national programme did not reach boys and girls in these areas.  Due to the high cost of digital tools, citizens and rural populations were not able to access information.  We observed that workers who had to work virtually from home were not prepared, and it was even harder for women workers, as the family environment was not adapted.  That was the impact we observed.

So maybe we share again and (?)

>> SHOHINI BANERJEE:  Thank you so much.  I think we're getting a good idea of the different ways that existing social inequalities, and their relationship to digital inequalities, were affected during COVID, and even some of the strategies that have come up due to the needs that the COVID pandemic presented.

We heard Sadaf talk about digital strategies in education, and Mona spoke about narratives and about innovation.  What are the strategies and tools that have been successful in affirming rights?  And what have been the strategies which haven't been?

Really, from the perspective of how stakeholders look at digital inclusion ‑‑ whether you're talking about a strategy that's government‑driven, Civil Society‑driven, or grassroots‑driven with counter‑narratives coming up ‑‑ how has digital inclusion really been looked at when developing these strategies or tools?  Anyone can ‑‑

>> AVIS MOMENI:  In terms of strategy, from the government ‑‑ there were many consultations that took place.  For example, consultations to provide information through national radio and television, and messages were sent to people's phones.  And also (?) sent messages to protect people from disinformation about cures.  From the (?) we observed that most of them reduced the cost of mobile communication and internet for families to communicate.  From Civil Society, some developed websites to provide COVID information and information related to the pandemic for people to access.  And then the data used from the (?) organization.

In my own organization, we provided an online information tool to the community.  That is the strategy we adopted.  Maybe we come back to the (?).  Thank you.

>> SHOHINI BANERJEE:  Sure.  Thank you.

>> SADAF KHAN:  I was looking at your question.  I find it impossible to answer, because with strategy, especially when we're dealing with something like gender‑based violence, everything comes back to context.  And I think ‑‑ in Pakistan we have worked in a number of ways.  We have collectivized, which is not a new strategy.  We have organized things online.  We've made things visible.  We've stayed behind the scenes to look at different cases.  So case by case, engaging with corporations, with government, engaging with allies, engaging with Civil Society, trying cohesive dialogue, et cetera, et cetera.  Sometimes things work.  Sometimes things don't work.

I feel the common thread is, A, persistence.  B, what I think we need to strategize for are ways to bring agency to women's collectives ‑‑ to have agency, somehow, over how speech, expression, and experience of the internet is governed, but also over the process through which these governance procedures are made.

I was hearing all this discussion about affordability and access, et cetera, et cetera.  I think one of the key fundamentals in how we discuss these narratives ‑‑ this is more philosophy‑based ‑‑ is understanding the internet as a utility, as something that's not optional, a utility like electricity.  We understand the right to electricity or gas or water, et cetera.  And then finding ways to make it gender neutral, make it gender positive ‑‑ that would help, especially in terms of how we're governing it, how policy is responding to it.

I'll give you an example.  I've been made a (?) in Pakistan that deals specifically with regulation of social media.  As a freedom of expression activist, for a very long time my stance on content regulation and platform regulation has been straightforward, given the country we live in and the attitudes we have faced.  We have talked against localization and against ways for the government to exercise control over how technology and digital discourse is regulated in the country.

But as someone who now faces the responsibility of building the cases that decisions will be formed by, I have come to compare how the stances we usually take from a principled point of view come into play.  For example, localization: having no way to get data when a woman's pictures are leaked and not being able to hold people accountable, privacy concerns ‑‑ a lot of very genuine, very serious, very valid concerns have created an environment where we've opposed the very policies through which action could be taken.

In the end, I feel the crux of the matter is that as a collective, as Civil Society, as feminist activists, we need to find ways to insert ourselves more prominently ‑‑ in a way that helps shape how governance works and how platforms respond to individual cases of harm in different contexts.  I understand the value of approaching international principles and creating models of governance and responsibilities, but reality is much more complex.  What's true in Pakistan may not be true in the US.  It may not be true in Palestine.  And what's true in one case in Pakistan may not be true in another case.

Unless we find a way to create a truly multi‑stakeholder model that informs platform ‑‑

(Talking in background)

>> SADAF KHAN:  Sorry.  I think, as a strategy, that's a missing link.  We try to hold our governments accountable a lot.  We try to work on policy and governments a lot.  We're not focusing on how our digital corporations behave.  In the end, they hold the key.  They are the key to how our data will be secured, how our data will be shared, and how hate speech and attacks and violence against us will be tolerated or not tolerated.

I don't know if this long answer has anything to do with strategizing.  But one thing we do need to strategize on is having more effective solutions regarding corporate governance ‑‑ finding ways to insert private voices, collective voices, feminist voices into how corporate governance is dealt with.  I don't know how that would be done.

Yeah, I think this is a strategy that we need to start making.

>> SHOHINI BANERJEE:  Yeah.  Thanks for sharing.  Mona, if you want to jump in and then Mallory.

>> MONA SHTAYA:  Thanks, Shohini.  I don't want to repeat what Sadaf already talked about.  I totally agree that we should think about multi‑stakeholder coordination and working in the field.  But there is something that we could build on, which is what some social media companies have started in this regard specifically.

For example, Facebook has adapted some videos for the help center, where we at 7amleh are telling women, if they are facing any kind of gender‑based violence, to report it.  I'll be speaking now about our observatory, but I'll continue with the Facebook point first.  They adapted the video we provided for local women in the Palestinian context, which tells them they should consult with women's organizations to talk about the gender‑based violence they're exposed to and check how these organizations could help.

So Facebook has also developed some tools to help women.  If women are being harassed online, or if there is someone who is pretending to be you on the social media platforms, you can report that.  And social media platforms, such as Facebook, can take action ‑‑ they can block that account and support you in this case technically.  I'm speaking technically here.  But for psychological support, it's better to have a women's organization; in Palestine we have a couple of organizations that are working with women on that.

As for 7amleh ourselves, we have launched the first Palestinian digital rights violation observatory, where we're documenting all kinds of digital rights violations that Palestinians are exposed to.  We're talking about account censorship, content censorship, smear campaigns, and anything else, including gender‑based violence.

So on this platform women can report to us.  And we are coordinating with other feminist organizations, so we can let them know there is a case here that needs, for example, psychosocial support, but we also take action with the social media platforms as a trusted partner to them.

We know that coordination is needed at a higher level, as Sadaf said, at a multi‑stakeholder level.  At least we started from here.  I see that it has been helpful, to be honest, since June, when women were reporting cases to us in which their phones were confiscated by the PA and their private pictures were shared.  At that time we reported and escalated that to the social media platforms to take down those private pictures.

And we are continuing ‑‑ like last month there was a smear campaign against a woman in the Gaza Strip, and we were able to take down that content.  We're dealing with the visual content and also going case by case.  But that kind of digital observatory is really helping us to manage the situation until we find a policy solution.

But to be honest with you, a policy solution won't work without awareness‑raising campaigns ‑‑ without, let's say, digital awareness‑raising campaigns for people, so people can know: what is gender‑based violence?  What is hate speech in online spaces against women?  What is surveillance of women?  And many other things we can work on.  Changing policies without working with the grassroots, without working with the communities, won't be efficient, because we need to work with a theory of change.  We need to tell people where the real problem is, so they can adapt their behavior and raise their children with a different mentality.  That way we can protect women in the digital spaces, instead of them choosing to leave the internet and leave these digital spaces only for males, only for men.

That's the kind of work we're doing.  It's been fruitful up to now.  We know there is escalation and hate speech against women, stereotyping of women, gender‑based violence, and other kinds of violations against women in digital spaces.  But at least we tried to find a starting point to work with women in Palestine.  We would also be really glad if there could be any kind of coordination between us to start working on a different level on that.  Thanks.

>> SHOHINI BANERJEE:  Thanks, Mona.  Yeah, I think you're speaking about how it's really important to have a grassroots‑driven approach ‑‑ that awareness has to come in before we can really get a good multi‑stakeholder strategy in place.

I think Mallory also wanted to speak about either one of the earlier points or what Sadaf was saying.

>> MALLORY KNODEL:  Yeah.  I wanted to bring a few threads together.  Mona added perfectly to what I was going to say in response to Sadaf.  The thing I think is great about what's been said is that, as intersectional feminists working online in the internet space, we actually want it all.  It's something that the Feminist Principles of the Internet talk about a lot: we're not going to choose between these false dichotomies anymore.

We want local access and recourse for harms; at the same time we don't want localized content monitoring tools.  We don't have to choose one or the other.

I think we should always continue to remember that we don't have to, and ask for what we want.  Our position should be complicated.  It's okay to have a complicated position.  We want user‑empowered reporting so that we don't see unwanted explicit imagery; at the same time we want to be able to post our own explicit imagery without censorship.  That's a perfectly acceptable position.  They're not in opposition to each other.

To that end, I think another false dichotomy we're often caught in is online versus offline.  This is starting to come into starker relief.  I'll just explain what I mean.  We started this conversation by talking about digital inclusion.  I always talk about this APC report on the sort of double exclusion that happens with differences in access levels.

On the one hand, you're excluded because you don't have access to the internet, yes.  But it's a double exclusion because the prior analog, real‑life world that you once were able to engage in is rapidly disappearing, and you're doubly disenfranchised.  You don't have access to real people who are accountable and whom you can talk to if you need to apply for a driver's license.  And then you also can't do it online, because you don't have access to the internet to apply for your driver's license, right?

I think there is also a trend emerging toward the right to a world that isn't connected as well.  This is being expressed ‑‑ I think there's a proposed law on the right to disconnect.  We're constantly at work because our workplace is at home, and we don't necessarily want to be.  There is a concept that's coming up here.  I think through a gender lens or an inclusion lens, we need to hold on to a world that is maybe enhanced by connectivity but not replaced by it.

That, I think, is critical to ensure we have continual inclusion and accessibility of daily life for everyone.  We shouldn't have to be online to live our lives, right?  That's a complicated position, but it's okay to take that complicated position.

Then I think the other thing that Mona was just talking about is that maybe filling that gap means we have more organizations.  This is also something that Sadaf said as well: what are the organizations, the social structures, the Civil Society mechanisms that we create and strengthen to fill that gap?  I just want to recognize that this is the cost of digitization.  There's a narrative out there that digitizing the provision of government services and things like that saves money in the long run.  It doesn't.  That cost just gets displaced.  There are now community organizations and other folks that have to fill that gap.  It takes time and it takes resources.  They should also gain access and power with respect to the platforms or the agencies that are responsible for running these services.

Many times we are just met with silence.  There's no one to talk to.  There's no one to go to.  And that's been the role of community services.

I think another trend we can imagine is that in a pre‑internet world, there would, of course, be services to help with language barriers or literacy barriers ‑‑ you need support in applying to university or a variety of other things.  Those, I think, still exist.  They just now exist in the form of cybercafes and women's rights groups giving training on how to be safe on your cell phone.  We've now rolled more digital aspects into these sorts of community‑based structures: digital security training, literacy, access to the internet, and all of that.  None of that is necessarily a bad thing.  I think we just need to recognize it as a really, really important function that Civil Society needs to get better at providing.

But then also, as you've all said and others have said, making the case that these structures are not informal.  They should actually be given access to influence the way the platforms work.  It isn't just end users.  There are structures before that as well.  There are organizations that are there to speak on behalf of the users, on behalf of the public interest, and on behalf of folks who are traditionally excluded from these kinds of spaces.

>> SHOHINI BANERJEE:  Thanks, Mallory.  You made some excellent points.  I saw all the other panelists nodding along, agreeing with what you were saying.

If I can say there's a common understanding ‑‑ as Mallory said, our position can be complicated.  There are various strategies and stakeholders, and really understanding the contextualization is necessary to create a strategy for digital inclusion.  If I can take a step back: if we think about the various other stakeholders who are involved in policy making, is it that there are singular understandings?  Is it that these false dichotomies are taken as actual dichotomies?  And is that where the barrier to inclusivity lies?  If so, how can that change?  Would that be through community‑based structures or some of the strategies you suggested?

So I think ‑‑ we've been speaking from the Civil Society perspective.  In policy making there are various other stakeholders, so it's about understanding how they approach it and what could be changed in that.  If anyone wants to speak to that.


>> SHOHINI BANERJEE:  Any panelists, Avis?  Yes.

>> AVIS MOMENI:  Thank you.  About the policy approach to digital inclusion: in terms of policy approach, we suggest that it is necessary for Civil Society to advocate for national geographic coverage of telecom infrastructure and also to advocate for a revised law on freedom of assembly from a digital perspective.

We also emphasize advocating for the implementation of the African Declaration on Internet Rights and Freedoms at the national level in African countries.  Because we have seen, by adopting enforcement through the use of ICT (?), that's the approach.  Thank you.

>> PONCELET ILELEJI:  For me, I want to look at it from a local perspective.  I think we have to put things into context: one rule doesn't apply to all in terms of digital inclusion.  We have to be able to create programmes that are grassroots‑oriented, because most of our women, most of our girls ‑‑ more and more, they are getting online every day.

Most of the content they consume has nothing to do with their local environment.  When I look at The Gambia, for example, this is one of the shortcomings we have had.  Even though we have high illiteracy rates among women, especially at the grassroots level, we have not made use of technology accordingly.  Across all of Africa, we're either French speaking, English speaking, or Portuguese speaking.  That's the reality of our life.  We have these languages, and the internet is dominated by them.  Why don't we make more use of voice messaging to transmit information to people, so that they can know about their health, their education, and the general things that would improve life?

Because at the end of the day, the internet's main role is as a tool for empowerment.  We have seen how a lot of things have happened since March last year due to COVID, and a lot of women are still excluded from these things because of this notion that they're illiterate and because of how the information comes out.  For me, on my continent we have not created enough content in voice to be able to address that.  I think that is very, very important: how we generate content, how we relate that content, and how we pass that content on through media.

When we had the COVID vaccines coming to Africa, a lot of people had a lot of misinformation.  In The Gambia, for example, we had cases where whole tons of vaccines went to waste.  How was the information being passed?  It was mostly in English, going to a website and so on ‑‑ there was nothing in voice, nothing sending voice messages through.

One thing we know in our continent ‑‑ not only in The Gambia but across Africa ‑‑ is that women have very strong collective groups.  And we have not really used those collective groups well, using digital means to reach them so that we can transmit ‑‑ it could be a voice message: what you should do, this is where you can get vaccinated, the importance of vaccination, all in the local language.  So on that local content aspect we have really failed.

I hope the educated women in Africa will take this up, because they can address it better than the men.  Thank you.

>> SHOHINI BANERJEE:  Thank you so much.  Sorry, Sadaf.  Please, go ahead.

>> SADAF KHAN:  I think local content is essential.  I think it connects to people's ability to use the internet and to meaningful connectivity; it goes beyond access.  Whether it's the language, or the format in which that content is produced, or the ability to access that content ‑‑ I think they all come together to create inclusivity online.

I was thinking ‑‑ I think this actually also speaks to one of the challenges that ‑‑ you phrased your question earlier about strategies and some challenges that are posed to those strategies.  I feel that in my own context, one of the major challenges that has hampered the progress that strong women's collectives can make is their ability to be intersectional in that way.  It's much easier to rely on the resources that are already available online.

It's much easier to write in English.  It's much easier to use tools that are catchy, that will get you engagement.  But it's much more difficult to create content that's accessible to women who are outside that circle.  I do feel that, even as activists, when we do create content that speaks to them, we create it in our own voices.  It's kind of us speaking at them rather than us trying to include them.  It becomes a lot of work, and it is difficult ‑‑ it's very difficult, as collectives, when we're supposed to bring together all these intersectionalities, not just because it's a lot, but because often some of the intersectionalities might be criminalized.  In a lot of countries, sexual minorities are criminalized.  Their existence is criminalized.  Yet this is the question we come back to again and again.  A lot of us get out and demonstrate on women's day and respond to it.  Do you publicly raise your voice for people whose existence is criminalized in the country?  We try to rationalize it, you know ‑‑ but if we did that, we would not be able to do the advocacy or be out in the streets.

So I feel that intersectionality is difficult in physical spaces and much more difficult in digital spaces, because we are being directed by tech ‑‑ by design, by hardware, by algorithms that are made to speak to the majority.  Within that, taking a real feminist approach that's intersectional in terms of content and in terms of enabling access to content is really difficult and challenging.  If someone knows how to successfully navigate it, that would be lovely to hear.

>> SHOHINI BANERJEE:  If someone wants to speak to that ‑‑ we have about eight minutes left, so if any of the participants or panelists have questions for each other, or just want to share some last points about the key factors that we need to think about when moving forward, please do.  We would love to have some cross‑conversations as well.

>> PONCELET ILELEJI:  I will close by urging us to focus on the key areas for action within the Roadmap for Digital Cooperation.  I think we should try to embed it through our national and regional IGF initiatives.  Our various community organizations are very important, because they definitely address everything we are discussing today.  As much as possible we should try to localize content so that it can reach as many people as possible.  That would be my closing statement.  Thank you, all.

>> MALLORY KNODEL:  I just wanted to pick up on what Sadaf said around positive framing, about how to make content more accessible to folks who are traditionally excluded, specifically through a gender lens.  I would really like to start asking that question.  I suspect there's been quite a bit of writing about it, but as a technical person, I find that question really engaging, because there are proposals out there to optimize the internet, to make it censorship resistant, and all those things.

I think all those efforts tend to center the intermediary and the content delivery network, when actually the question is the end point.  And maybe not even the end point as the device ‑‑ the end point as the user of the device.  That presents an even more engaging issue from the perspective of, say, group use or the sharing of devices that are not necessarily owned by one person only.

I'd love to follow that up.  Something that I said before that I'm trying to remember now.  Sorry, I forgot.  Oh, yes.  I'd love to see the Dynamic Coalition grow and also take its work out ‑‑ its scorecard work and other initiatives ‑‑ to other tangential Internet Governance bodies that are grappling with diversity, equity, and inclusion, or at the very least places where its work can land.  I'm thinking of something called the Inclusive Naming Initiative.  There are a few groups out there that are probably not aware that this Dynamic Coalition exists, or that the IGF itself is free to participate in and has intersessional work.  I would love to look at efforts in which we could do a bit more outreach together.

>> SHOHINI BANERJEE:  Thanks for that, Mallory.  Any last thoughts from other participants ‑‑ I'm sorry, other panelists?  I guess not.  I want to thank the panelists for this fruitful discussion.  There was a lot to cover, because we were trying to understand ‑‑ I mean, from a regional perspective, everybody comes from a different region and their work is from that context ‑‑ how social inequalities and digital inequalities interlink and how COVID impacted that.  And from that understanding, to get to a stage of thinking about strategies for digital inclusion and the various stakeholders: what are the various factors we need to think of, and which strategies may be successful for which stakeholders.  That was a lot to cover, but the last few points that were discussed were really crucial, and I think we covered a lot.  I just want to thank all the panelists for their very fruitful input, and all the participants who joined in.

>> SADAF KHAN:  Thanks, everyone.

>> AVIS MOMENI:  Thanks a lot.

>> SHOHINI BANERJEE:  Thank you.

>> PONCELET ILELEJI:  Thank you.

>> MONA SHTAYA:  Have a great day.