The following are the outputs of the captioning taken during an IGF virtual intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.
>> We all live in a digital world. We all need it to be open and safe. We all want to trust.
>> And to be trusted.
>> We all despise control.
>> And desire freedom.
>> We are all united.
>> CATHERINE MUYA: Okay. Thank you, everyone, for joining our session today. My name is Catherine Muya, and I will be one of the speakers today. I'm joined by my co-host. Her name is Esther. I can see that we have people both online and offline. I will give a wave to the people who are joining us from Poland. Thank you so much. I'll ask Esther to introduce herself as well.
>> ESTHER MWEMA: Hi. This is so exciting to be part of the IGF. And to just cohost this with you here in Zambia and all the way there in Kenya. And we have people in Poland. Hi. It's really just a big sign of how the Internet can connect us all.
So I'll introduce myself. My name is Esther. I'm an open Internet leader. I'm very excited to be cohosting this lightning talk with you today.
>> CATHERINE MUYA: Okay. So I will introduce myself. My name is Catherine Muya. I work as a program officer in Eastern Africa. I'm mostly concerned with freedom of expression, access to information, and issues to do with digital inclusion and privacy within the Eastern African region where we work. Before that I served as an open Internet leader, which is where I met Esther. This is a program that empowers people from different jurisdictions around the world, so Africa and some of our colleagues from the Southern ‑‑ SEE region, I think, to engage in Internet leadership.
So I want to speak about what prompted me to undertake this project and what we are going to be discussing today. During my time as an open Internet leader, my in-country project focused on online violence, and I want to begin with why I thought that was important. At the time, when we were making our applications, it was around COVID, and I looked at what was going on around us. The issue of online violence had been identified before, right?
But the impact was not being felt. What we have in Kenya is a culture of people who are really misogynistic and who spread hate speech that is demeaning to women. What happens, and this has been the case for some time now, is that regardless of what happens, it is always somehow blamed on women. Take femicide, for example. I will give the example of one of our medical students, whose name was Ivy. She was unfortunately a victim of femicide: one of her ex-boyfriends came to her school and axed her. Sorry. What followed were all these memes about how Nairobi girls are easy and how they are contributing to their own demise.
Over the COVID period, there was also a victim of online harassment. Once she came out to speak about COVID, people started looking through her social media posts. They attacked her, attacked her family, and released her nudes. That was the sort of environment at the time.
That prompted me to check on how the Kenyan law that we have was addressing online violence. Because when we talk about issues of inclusivity, we want more people to come online, but the experiences of women who are already online are pushing those who are not connected to not want to connect at all.
So this is a place where I was coming from. I checked a lot on what were the most prevalent forms of online violence we had in Kenya. A lot of studies had been done. So what I did was to consolidate that research. And a lot of this had to do with issues of hacking, insults, sharing of personal images. And then after that, I looked at what laws do we have in Kenya and how are they regulating online violence.
When it came to issues of children, there was a lot of reform needed within our Kenyan law that the Government is trying to do, but it has slowed down, because we are almost going into an election season in Kenya in the next year. That process always slows down because new parliamentarians have to come in and pass the law through Parliament, which delays the whole process.
Another thing is that the cybercrime laws we already have are being used to stifle freedom of expression and harass journalists, as opposed to being used to address violence online.
The last and final thing, a lot of efforts were going on from Civil Society and Government as well. They were not known even to the people who were victims. Even journalists that I talk to, the people, they had been victims but they didn't know what they were supposed to do after that. There was a lot of lack of awareness.
At this point I will invite Esther to give her experience.
>> ESTHER MWEMA: I see you. I think we need to deep dive into that after I go a bit into where I'm coming from, because just hearing the stories that you are sharing, it almost seems like the misogynistic practices that are embedded in our society are showing up online. And real life situations like the murder have been turned into a joke. What does that mean for victims of gender‑based violence? How does that affect how we as women can report cases that are affecting us?
Because if we go online and we are turned into a meme, does that just amplify it? That's something we can talk a bit about, and after that I will also invite everyone attending to contribute their perspectives. Just putting that preamble there, and also sharing a bit from my personal, grassroots perspective. We are a youth‑led organization, and we are working to increase digital citizenship in local communities. We do so by trying to equip and build the capacity of youth to engage in Internet governance. We are always doing this with mindfulness about including gender diverse persons and women in our programs. This way we can have women representing their own communities.
And I will also add that there is a push for bridging the gender digital divide. There is a big push to bridge just the digital divide from the north/south aspect and also from the urban and rural aspects.
So the main purpose of this session is to ask the question: as we are bringing more people online, are we bringing women online to be harassed? Are we bringing women online into spaces where they can be safe, or into spaces where the abuse they are facing will be amplified? Digital Grassroots right now is an Action Coalition leader in Generation Equality for technology and innovation for gender equality.
And one key question we're asking is how we can innovate and build technology that is feminist and responsive to the needs of girls, women, and gender diverse people all over the world, from different cultures. This is something that I believe should be central to the conversation on connectivity. We want to bridge the gender digital divide.
But when people become connected online, when they start engaging on social media, are they more likely to face harm, or have we created a space where women's rights can be upheld?
I think this is a conversation that needs to be taken more seriously, especially for us who are advocating for connectivity. From what we are seeing, women and girls come online, face violence, and go offline. And there is a lot of entitlement to women online, from sharing a woman's images without consent to shaming a woman simply because she posted a picture. I feel that has affected the quality of connectivity. We need to have not just connectivity, but meaningful connectivity.
Censoring women reinforces the patriarchal lens, and we have to realize that the big tech companies that allow us to engage online are male dominated. How does that influence the products they are pushing to us?
So that's a key issue we have been working on. At Digital Grassroots, we see that a solution comes when we equip community members to create the solutions.
So instead of a top‑down approach where outsiders decide what communities need, we use a bottom‑up approach where the community itself says, this is what we need, and we respond to the community's needs in that way. This is in line with the original values of an open, free, and decentralized Internet. The more we embody the needs of the communities that we're serving, the more the Internet will be a healthy place where people can connect and build connections without harm, especially for women, girls, and gender diverse people.
I will just conclude with what Digital Grassroots has been doing specifically through that lens. Each year we have an Ambassador program. In this program we have four weeks of training on Internet issues, such as Internet security, the Internet and the economy, and the Internet in social life, followed by four weeks of mentorship, where we match our mentees with industry experts, who then equip these young people to understand how to participate meaningfully in Internet Governance and connect their communities. We also have a community leader program where we connect with other organizations. We have collaborated with the Open Internet for Democracy initiative, and we have also collaborated with the Internet Health Report. This has allowed us to amplify the voices of people on the ground. And we always take an equitable view: we make sure that our programs include 50% or more women, which has allowed us to take a feminist approach to this issue.
There is still a lot to be done. And I wouldn't say Civil Society is the only answer. Every stakeholder has a role to play. So that is it for me.
>> CATHERINE MUYA: Esther, I think you raise amazing points. Maybe before we accept questions or feedback from our participants, I just wanted to ask: you mentioned a lot about the work that you do and the bottom‑up approach that you take.
Take the perspective of a developer. I don't anticipate that this is something developers are thinking about, right? If you are a developer, you are thinking about the best user experience, not really thinking about safety or online violence or anything like that. So is there a need for us to broaden the recipients of the messages that we're giving out? I just want to pick your brain on what you think could be done to increase the number of stakeholders we get from the development community in these discussions, and what their role is in this discussion.
>> ESTHER MWEMA: Thank you for that. Before I answer that, I would like to hear from you on a Human Rights centered approach based on the case study that you shared, which is not a case study. It is a real life story, a very sad story. And what could have been done, what could be done differently.
But in terms of who needs to be in this conversation, I feel like we need to recognize that the social sciences cannot be separate from the more technical aspects of the Internet. Because at the end of the day, the Internet is a communication medium. It allows us to connect, allows us to express ourselves.
And so what we see happening is that there is a technical aspect that has been covered, but the social science part of it has not really been implemented for many reasons. But one of the core reasons is profitability.
The more people use, for example, an app, the more profitable it is. And when we look at that globally, I think this goes into the broader context of the world we live in. Globally, women have less economic power; there is a lot of gender inequality in economic terms. So who is buying at this point? It is mostly men, and products that serve this group. And because they are also the ones who create the technology, it reinforces the gender inequality that's happening globally. So there is a need for a conversation on understanding the social science behind it, and on taking more of a feminist lens. I believe this should be an aspect of each and every technical person's study or journey. Because otherwise it is sort of like building a house without taking into consideration the people who are going to live in it.
And so it is not just about building a product that works or doesn't have many bugs. It is about building for people in a way that creates a space where safety is amplified rather than harm. Which also goes back to how people profit online, and whether we can change things so that harm is no longer profitable. That's an answer I can give off the top of my head. I would invite anyone who has a comment to give it in the chat or to raise their hands in the room. Over to you.
>> CATHERINE MUYA: Thank you. I think those are really important points. I want to answer your question on what I think could have been done differently, right? So imagine, at the start of the pandemic, around March to May, people don't believe, number one, that the pandemic is real. We are all struggling to believe COVID‑19 exists. And then you are the first patient. What happened is she came back from studies abroad, she was the first patient, she was admitted, and she recovered. Now, you want people to believe that Coronavirus is real and to wash their hands. So you appear on national TV and you give your story. The aim of coming out, knowing that you would face that much stigma, is so that people can see you as a real person, someone who has gone through the whole thing and recovered, and now take the measures seriously. But that's not what happened.
What happened is the victim was trolled. People said she was lying, said she was a puppet for the Government. They went and looked through her Facebook posts and brought them to Twitter, people calling themselves directors of investigation on Twitter, saying they were unravelling the truth about her. They got information about her boyfriend. They got her nudes. They made fun of her and shared them everywhere, on WhatsApp and Telegram. It became an issue of national concern, so the Minister of Public Health came out to speak about it, and some journalists came out to speak about it. Then they turned on anyone who came out to support her. They attacked the Minister online and attacked the journalists. And the bad thing about it was that the journalist was also female.
So they started asking questions about the journalist and her husband, and about her wealth. It was the same thing for everyone.
What I think could have been done differently: all those trending hashtags are still online. Anyone who came out to support the victim also fell victim. In that sense it really made it difficult to support her, or to come out and stand for the truth. Even if you saw a post or a tweet that was abusive, you couldn't do anything about it, because you felt that if you did, or if you spoke out against it, they would turn on you next.
This is why I think it is an important aspect of connectivity, because what we're talking about when we talk about meaningful connectivity is that people should have the devices, the speed, and also regular access to the Internet.
But what you have is people who are icons, people who you look up to, like a really respected journalist or a Minister, being trolled online for doing the right thing. Then you keep asking yourself whether you want to do the right thing too, or whether it just costs you less not to. The worst thing about it was that because it was trending, Twitter made it much more visible.
The whole conversation trended for a whole week, I think including the entire weekend. One of the things that we need to do is, number one, build this community of people who are able to report first. But that only happens if people are aware that they can do that. The second thing is how we build people's confidence, because if you are saying that we are taking a bottom‑up approach, then it should be easier for the Internet users themselves to report a post, to post something that they think is important, or just to be an active online bystander. If the platform itself is inaccessible, or the algorithms themselves are promoting content that's insensitive, how can we change that?
So I think one of the things that could have been done differently, number one, is access to the platforms, to say this hashtag is trending, but it is not trending in the right way. It happens every time, like when people were making memes of the unfortunate medical student. So number one, access to the platforms is something we don't have, or don't have the luxury of having.
And then two, emboldening people is something we are not doing enough of. We don't feel safe, and we don't feel like we can participate to create a safer online community.
I think those are my comments for your question.
>> ESTHER MWEMA: Thank you so much. I think those are really great pointers. Firstly, the platforms that we use have to be accessible; it shouldn't be difficult for the user to connect with the platform. The second step is making sure we evaluate how the algorithms work, what they promote, and how to respond to that. And I think it is also very important to see what regulation means, and whether there are any consequences for the platforms that cause harm. Because I feel that at this point, in the current environment we are living in, there aren't really any real consequences for the platforms that we use and for the harms that they cause. It remains on us to think about how to protect ourselves, and I feel that's an insufficient way of addressing the issues.
At the same time I want to hear from the room or from the chat on your thoughts about what we have been discussing. So please feel free. Feel free to raise your hand or unmute.
>> CATHERINE MUYA: I'm not sure how the participants in Poland are able to give us their thoughts. I don't know if they can unmute and speak, or they could also just tag us. Oh, yeah. So they have a comment.
>> ESTHER MWEMA: This is so exciting to see.
>> Hi. Nice to meet you. You talked about what actions we can take as Internet users on Twitter or Facebook or other social apps. I'd like to ask you how to be a good, active bystander on the Internet. What can we do, for example, when we see hashtags which do not promote things that are sensible, as you say? Should we just report them, or ignore them, or should we take active action against them, such as writing a comment against the insensible content?
>> CATHERINE MUYA: Thank you. I don't know if we have another question because we just have a few minutes left and then we can answer both of them. But we have one from Frank. Maybe we can take that and just answer before our time runs out.
>> Yes. Thank you so much. One thing I would want to inquire from you, Catherine: how do you deal with the levels of digital literacy among the grassroots people that you work with? I will share an example. We work with refugees, but you find that they can't understand information about things that happen online, like online sexual harassment, because of low levels of knowledge on digital rights and low levels of digital literacy. Maybe you could share your perspectives on that. Thank you.
>> CATHERINE MUYA: Okay. Because of time I think we have to answer those two questions. I'll take the first one, and then Esther, you can jump in. On how to be an active bystander: as an active bystander you either leave a supportive comment, or you report the comment or the tweet to the platform, and in that way you help create a safe community. The one thing you need to remember as an active bystander is that you also need to be safe. So always do the things that are most comfortable for you, whether that's re‑tweeting something supportive that somebody has already posted, or just using the tools you have at your disposal to promote positive messages. Yes, you can do something as an active bystander. On the final question about reaching online communities, my experience has been that demonstrations work and language works. How you design your campaign is really important.
On the issue of language: if you come to talk to people assuming they understand digital literacy, probably you think they do, or probably it is the way you are presenting it. If you design your campaign to have a lot of visuals, to use different languages, and to be enabling, this has been my experience, when I show a demonstration of what things are and give people examples they can relate with, that makes people feel very connected. And then they get to understand digital rights from that perspective, as we finish.
>> ESTHER MWEMA: Thank you so much, Cate. Thank you so much for the questions from Poland and from our participant Frank. I will say that the Internet, at the bottom line, is a personal issue, and many times we see that discussions online have become politicized. People use it as a political tool. Even as a good bystander, you also have your own vulnerabilities, and even when you comment, someone can turn it into a political view, even if it is just a Human Rights view. That's something we need to be aware of. And Frank, you also mentioned working with refugees, and with people who are not connected. They also have a political voice; it could be a Human Rights view, but when they come online it is seen as a political perspective. So I believe the first thing should be safety by design, security by design. If that means being aware of how to protect your identity online, how to use good and strong passwords, how to use two‑factor authentication, then that is a good place to start.
>> CATHERINE MUYA: Okay. I think we have to wrap up because we are at 4:15 exactly. So I want to thank you all, the participants we had in Poland and online, for joining us in this session. We hope to work together in the future. Thank you so much to the organizers. And please feel free to reach out to us on Twitter, or on the IGF platform, and we can answer your comments from there. Thank you so much. Bye.
>> ESTHER MWEMA: Thank you.