IGF 2022 Day 3 IRPC Access & participation as enablers of digital human rights

The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> RAASHI SAXENA: Good morning, everyone. Thank you for joining us. My name is Raashi Saxena, one of the Co‑Chairs of the IRPC, and welcome to this session on the IRPC Access and Participation as Enablers of Digital Human Rights. I will give the mic to the panelists joining us, who will introduce themselves, and then we will move on to my Co‑Chair, Minda, who is joining us remotely. And then we will have a 15‑minute Q&A session with our panelists today.

>> ROSELYN ODOYO: Good morning. My name is Roselyn Odoyo, a Senior Program Officer at the Mozilla Foundation. More specifically, I engage with our fellowships and awards as well as our Africa Innovation Ready Program. 

>> NEEMA LUGANGIRA: Good morning. I'm a member of Parliament from Tanzania and Chairperson of the African Parliamentary Network on Internet Governance, which at the moment has about 36 members of Parliament from 25 African countries, and I'm happy to be here. Thank you. 

>> JUNE PARRIS: Good morning. I'm June Parris, a member of the Committee of the IRPC. It's very nice to see you here and welcome. 

>> VICTOR NDEDE: Good morning. My name is Victor Ndede. I work with Amnesty International in Kenya, looking largely at data governance and another program on digital rights for children and young people. Thank you. 

>> RAASHI SAXENA: Thanks, Victor. And now I'll pass it on to Minda. Minda, can you hear us? 

>> MINDA MOREIRA: Yes, hello. I'm here, here online. We have also Catherine. So, maybe Catherine would like to say a few words about herself. She will be part of the panel, too. 

>> CATHERINE MUYA: Thank you. Thank you very much. Hello. I'm joining remotely. My name is Catherine Muya. I work as the Program Officer for Digital Rights and Policy in Eastern Africa. I lead our work around digital rights, and of course, policy. Thank you. 

>> MINDA MOREIRA: Thank you, Catherine. And we are still missing Yohannes. He is trying to join but is having some difficulties. So, while he's trying to join, I will just quickly introduce the IRPC that many of you probably already know, but for the ones who are coming for the first time, I would like to welcome you all to this session.

So, the IRPC was formed in 2008, and it is a Dynamic Coalition based here at the Internet Governance Forum. We are an open network of individuals and organizations committed to making human rights work for the Internet. Anyone can join us. The only thing you need to do is to join our mailing list. I will be sharing the link in the chat later. Or you can go to our website, which is Internetrightsandprinciplescoalition.org. 

Our outreach work draws on the Charter of Human Rights and Principles for the Internet. That's the booklet here, the English version. And the Charter articulates existing international human rights law and norms and translates them to the online context. So, the Charter was first published in 2011, and it is now translated into 12 languages. And the Charter also includes ten principles which are available in 27 languages.

So, through the Charter, the IRPC has taken forward important rights‑inspired initiatives, and we have partnered with other Dynamic Coalitions here at the IGF and with all stakeholder groups to raise awareness of the need to implement rights‑based frameworks for the online environment. And since 2014, the IRPC has also been an observer to the CDMSI, the Steering Committee on Media and Information Society at the Council of Europe.

So, another very important aspect of our work is the translations of the Charter, and we are extremely grateful to all the volunteers and translation teams that over the years have made the Charter accessible to so many people around the world. Our latest project is a translation that we have just launched; it was made in partnership with the Digital Rights Nepal team and coordinated by our IRPC member (?). We are always open to new translations of the Charter, and we would really love to have translations into the main African languages that are currently missing from our collection, so if anyone would like to volunteer, just get in touch with us.

So, through the Charter, we also have discussed and highlighted many human rights issues online, from the protection of the rights of minorities, including migrants and refugees, privacy and data protection, online abuse, to the relationship of artificial intelligence and human rights, or issues at the intersection of digital technologies, human rights, and environmental sustainability.

This year, the IRPC Steering Committee has been focusing on Article 1 of the Charter ‑‑ that is the right to access the Internet. Then, Article 5, which is freedom of expression and information on the Internet, and Article 18, right to legal remedy and fair trial for actions involving the Internet. And these articles will be highlighted here today through the discussion, Access and Participation as Enablers of Digital Human Rights, which is the theme of today's panel discussion. And that is all from me now. Please leave any questions in the chat. And I will pass it over to you, Raashi. Thank you. 

>> RAASHI SAXENA: We also have a physical copy of the Nepali translation, so you're more than welcome to grab a copy if you need one. And now we will start with the discussion. Our discussion will focus on access and participation, content moderation, and the mechanisms available to address the human rights violations that we see online, particularly in the Pan‑African context. So, we could start with, perhaps, Victor? What do you see as the major barriers to access and participation online, based on your experiences? 

>> VICTOR NDEDE: Thank you very much, Raashi. When we talk about access and participation as enablers for digital rights, of course, we have to look at the barriers, because, as we have constantly said, the world has shifted to the digital space. A lot of our lives are now transacted in the digital space and, equally, a lot of our rights are enjoyed in the digital space. It means that for all of us to have equal standing in human rights, we all have to have access to the digital space. And so, people accessing and participating in the digital space enables them to enjoy their human rights.

And so, what are the barriers? The first barrier I would speak to is network infrastructure and policy. In the context of the African continent, we have seen an increased penetration of mobile Internet; we have seen movement from 2G to 3G to 4G, and now 5G is also being mooted. But while all these developments have been happening, we have seen that access in the rural and remote areas has not been on par with access in what we call the urban centers and urban settlements. A majority of our populations actually stay in the rural areas, not in the urban areas. And in these rural areas, we have challenges with the network infrastructure, such that in some countries, once you leave the main cities, getting even a 2G connection becomes a problem. 

This creates a challenge for people accessing the digital space and participating in it effectively. If rights are being exercised and enjoyed in the digital space, but there's a network problem and you cannot access the Internet, then that is a barrier. The network infrastructure and the policies really need to be looked at if we are to ensure that we don't leave anyone behind in this digital transformation.

And so, as much as we talk about expanding the infrastructure, we really need to look at the policies around this infrastructure. There are countries where policies are actually in favor of expanding infrastructure to rural areas and to underserved communities, and these are the policies that we really, really need to have in place.

The other thing to go over quickly, in the interest of time, is the aspect of taxation. Combined with our low incomes within the continent, the cost of devices and the user fees for data plans create an affordability challenge for the majority of our users. And this issue is further compounded by the government taxes and fees that are usually levied on digital services.

I am based in Nairobi, Kenya, and the digital space has been the easiest place for government to continuously increase tax, which, to be honest, is very problematic. If you look at the type of taxation that the digital sector attracts in Kenya, it's taxed under excise duty. For those who may not know about excise and this bit of the tax administration conversation, excise duty is more or less a sin tax. It's levied on things that the government wants to persuade people not to use, like tobacco and alcohol, things that basically attract that sin tax. So, when in Kenya mobile data plans and devices now attract excise duty, it is effectively an action by the government to dissuade people from getting onto the digital space. 

And so, with that challenge of affordability, it means that a lot of people are not able to access the digital space because, as I mentioned, with the low incomes and the cost of devices, access and participation become difficult, and as such, you are not able to enjoy your rights in the online space. What we would really want to see here is a reduction in mobile taxes, which would increase digital inclusion for all on the continent and lead to greater contributions to the economy. And this, of course, is to move governments towards expanding the tax base so that they get their taxes from elsewhere and not just the digital space.

The other thing to talk about quickly is that we have a challenge with consumer barriers around the digital space. Here we are looking at digital literacy and Internet awareness amongst the population. It's good to have the Internet, but the challenge facing a lot of people is how to use it. On average, a basic user would only use the Internet to access social media and maybe, at best, a Google search or something of that sort, but there is a lot more that people can get from the Internet that they don't know about, and this is largely due to the fact that there is very little digital literacy and Internet awareness. Because of this challenge, people are not able to harness the full potential of the Internet. And if people are not digitally literate and are not aware of how best to harness the Internet, then access and participation in this digital space become very difficult.

And so, a lot of work has to go into making sure that people are literate, so that we don't just say, we have provided the Internet, you have 4G or 5G Internet. If people don't know what to do with it, the best they can do is post a photo on social media platforms, and there is more to do than that.

The last thing, in the interest of time, and perhaps we will get a chance to talk about other things later, is content on the Internet. I'm sure some of my colleagues will be getting into content moderation, but content plays a very vital role in ensuring access and participation on the Internet. Because a lot of our Internet penetration has been through smartphone penetration, a lot of the content we have is in English, which, as many would know, on this continent is not our first or, for some, even second language. So the accessibility of the platforms and of the Internet is limited for certain groups of people on the basis of language alone, and that limits their participation in the online space and actually limits them from enjoying their digital rights. There needs to be a conversation around how we get localized content for people on the Internet so that they are able to appreciate and understand the Internet in their local context. Without that, we may believe we have ticked all the boxes, that we have removed the taxation, we have the network, we have digital literacy, but if people are not able to understand the content that is on the Internet, we still deny them an opportunity to participate in it. 

I think I'll stop there, Raashi, and hand over to you. 

>> RAASHI SAXENA: Thanks, Victor. And also pass it on to Roselyn, who lives in Nairobi, herself, and will talk a little bit about the digital rights landscape there and the work that they do with the Mozilla Foundation. 

>> ROSELYN ODOYO: Thank you very much. Building on what Victor mentioned in the context of challenges to access, it tends to be that the most marginalized communities are disproportionately affected by the challenges Victor mentioned, whether in the context of the rural/urban divide, in the context of language as a barrier to access, or in the context of who it is that these innovations and these spaces are created for.

And in the rare, well, maybe not rare, in the instances in which folks from marginalized communities do attempt to exercise their right to access, you tend to find that the legal and policy environment in a lot of countries, particularly on the continent, tends to be quite hostile, in the sense that the utility of that access is turned around. So, you're out here as an African sex worker, or an LGBTQI person, or a refugee, trying to access the digital space to improve your economic position, and then that access is utilized against you to surveil you and curtail your rights.

And so, when we think through these challenges, the interventions that respond to them tend to happen in a siloed way. You have the folks working on digital rights, and then you have folks working with the marginalized communities in the context of human rights. What ends up being created is a silo, so it's perceived as digital rights is over there and human rights is over here, whilst neglecting the reality that a lot of the challenges to access and a lot of the abuses that happen in real life also manifest online, and vice versa. And so, even when thinking through how these are addressed, this is why, in the context of tech and digital rights, it's important to include civil society, in the broad spectrum of what civil society looks like, in partnership and collaboration, so that all the investments work towards the overall objective of inclusivity. 

>> RAASHI SAXENA: Thanks, Roselyn. Now we will hear from Neema, and perhaps also a little bit about what the government is doing about this. 

>> NEEMA LUGANGIRA: Thank you. First of all, I will start on the last point. I can't speak on behalf of the government because I'm not part of the Executive. I'm a parliamentarian. And there is usually an assumption that Parliamentarians are part of Executive, but we're not. So, my comments are strictly as a Parliamentarian, not on behalf of government. 

I wanted to touch first on the issue of taxation. When we're talking about mobile taxation, we need to be very clear what type of taxation we are talking about. Because at the same time, our countries, especially developing countries, are losing out on a lot of money in taxation, especially because of international tax regulations. There is an issue with international tax and the concept of the country of origin of sales. With a lot of these mobile and online services that are being provided, it's difficult for our countries to tap into the tax that is based on the sales generated in our countries. So, we need to be very clear, when we're talking about tax, what type of taxes we are talking about.

I am pro making sure that African countries benefit and get the rightful percentage of the tax on the sales generated from our continent, so we need to be very clear when we're talking about issues of taxation in terms of human rights. I am of the opinion that if you're generating income in our country, you should pay tax. If local companies are paying tax, local civil servants are paying tax, individuals are paying tax, why shouldn't the tech companies pay tax? And that has nothing to do with human rights. So, I think we need to have that clarity and separation. That's number one. 

Number two, it's very important for us to also acknowledge the challenge around freedom of expression. Whenever we're talking about human rights, we insist that it's important to make sure that we have freedom of expression, but we talk about it from just one side of the coin. We forget the other side of the coin, where there's a group of people who purposefully and in a targeted way use their freedom of expression to stop another group from using their own freedom of expression, and this is rarely discussed, even when we're talking about human rights. And to me, that's against human rights. 

You shouldn't use your freedom of expression to stop me from using my freedom of expression. But over time, because the digital space has been left ungoverned, with a lack of regulatory frameworks, that has somehow become the norm and somehow an acceptable behavior. So, when governments or countries or stakeholders or parliamentarians try to curb that, you find human rights advocates complaining and saying that we are trying to limit freedom of expression. So, I think it's very important, when we're talking about online human rights, not to forget that what is not acceptable offline should not be acceptable online.

A lot of people, particularly women, are resorting to self‑censoring, meaning not participating in the online space, meaning we are increasing the digital gender divide, only because of this caveat that it's freedom of expression: hate speech, abuse. Because if you come up to me offline right now and you attack me verbally and abuse me, I'm within every right to call 911, and you'll be put in a cell. Why shouldn't that be the case if it happens online? If I act on it online, because I'm a parliamentarian, I'll have people at me saying that I am limiting freedom of expression. So, I think we need to separate this.

And speaking as a female parliamentarian, we endure a lot of online abuse, and at no point would you ever find a human rights activist calling it out, because it's somewhat seen that if you're in politics, you deserve it; if you're in politics, you should have thick skin. And when you raise it, one of the comments that I usually get is, "You need to learn how to accept criticism." What I always say is that online abuse has got nothing to do with criticism. So, I'm going to conclude with this statement: focus on the agenda, not gender. The minute you shift to gender, then it's no longer criticism. That is point‑blank abuse. That is not freedom of expression. So, I think we have to be sincere and truthful when we're talking about online human rights, and let us not stick to one side of the coin but look at both sides. Thank you. 

>> RAASHI SAXENA: Thanks, Neema. We will now go to Catherine. Can you hear us? 

>> CATHERINE MUYA: Yes, I can. 

>> RAASHI SAXENA: Okay, great. So, given the work that you actually do around online content moderation with ARTICLE 19, what do you think are the most pressing challenges in online content moderation in Africa? 

>> CATHERINE MUYA: Okay. Well, given the work we do, we are currently implementing a project called Social Media for Peace that looks at content moderation within the Kenyan context, basically. And we have identified a couple of challenges. I think one of those challenges is just the fact that content moderation is really not transparent, in the sense that we don't really know how it's done and who does it.

And so, over the last period, we've discovered a few elements that have actually challenged freedom of expression in this sense. I'll give you an example. In Kenya, we have human rights activists whose accounts are continuously disabled under the platform manipulation and spam policy. I think this has now been made worse by all the changes that have happened in the company, but the idea was that the human rights defenders are just generally people who go online to complain, yet their campaigns would not really be differentiated from manipulation and spam. And so, what eventually results is that the human rights defenders have their accounts suspended. And my concern was really that there were no effective remedies for this. The platforms didn't really provide any remedy, and the fact that the processes are completely automated meant there were just arbitrary suspensions, without warnings, without any other due process, which for me is against human rights and really even against business and human rights responsibilities. 

But aside from that, in the opaqueness of it all, we have discovered that even the people who do the work of content moderation really don't get fair and equitable remuneration, which again is a serious problem. And it's difficult because, again, it's people working in Kenya and on the African continent who are required to do this work, and it's a lot of intense work. I think now we need to look at what are the effective strategies that actually make it work. And for me, there's also the fact that you hear a lot about the lack of local context and local knowledge, which is something that we found in our research, and it affects content moderation in the sense that the people who are making these decisions are not really fully aware. 

And it might require, in my opinion, a lot of intense investment, which I don't think companies are prepared to make. If you look at the accounts coming out of big tech, I think it's the responsibility teams that are the ones suffering the most. So, I don't think there is any real commitment to actually change this effectively. And so, it eventually affects everyone.

And like Neema was talking about, online harassment. I'm sure the platforms have a lot of work on trust and safety, but aside from content moderation, we really need to think about the reporting mechanisms. I think they're not really well highlighted, and I think that's an awareness problem, not really a content moderation problem. 

But if we look at the past Kenyan election and how content moderation was done in the context of a just‑concluded election, I think there was some effort by platforms to try and engage with local civil society, but that engagement was really very one‑off, and it wasn't sustained. Now that the election period has ended, I have not really seen the same constant engagement, the same constant way of dealing with problematic content on the platforms, which makes it a very one‑off effort that will really only catch attention again the next time we have an election, so that it can be seen that work is being done around it. So, I think we really need to think about sustained efforts to make content moderation sustainable. Because as it is now, I don't really foresee systematic change.

So, for example, during our elections, you could see the labeling of content that was misleading, or the platform going out of its way to provide reliable facts, with content actually labeled as misinformation, but I haven't seen that going on now. It doesn't mean that just because we don't have elections, those problems stopped. It just means that the engagement wasn't really consistent, and so the same problems are set to reoccur. 

So, I think the way it's designed, really, is such that the same problems keep reoccurring without us really getting to long‑term solutions for them. 

>> RAASHI SAXENA: Can you hear me? Thanks. It also begs the question of content moderation being localized. There are so many nuances that perhaps the big tech teams don't understand. And there is also the heinous content, you know, looking at (?) material for 12 to 15 hours; we are not sure that is a viable career option for content moderators. And I have also seen that there is not a lot of budget from big tech for hiring Indigenous researchers who could bridge this gap.

So, as we move on to our next discussion, how do you think we can effectively localize content moderation on global platforms? Roselyn, perhaps you want to go first? 

>> ROSELYN ODOYO: I think the challenge in the conversation on content moderation, as Catherine pointed out, tends to be where it fails. From an idealistic place, the utility it's supposed to offer is in remedying things like hate speech or online violence, for example. However, oftentimes what tends to happen, and this is also because a lot of the big tech companies are responding to economic or financial gain, is shaped by the ways in which they interact with government, for example, and the permission to exist or operate within a country. This was detailed, for example, in the work of the two fellows under the Tech and Society Fellowship of Mozilla, (?), who wrote some really extensive work on the ways in which disinformation and misinformation were spreading in the lead‑up to the Kenyan election. That was happening through several false and fake profiles that existed to spread this content, and it wasn't until an intervention, the report that they did, that Twitter, for example, decided to engage and say, okay, we are going to disable these profiles.

Now, the way content moderation is supposed to offer utility is in the creation of an environment in which that is made possible, and that includes partnership from, say, government. But who is benefitting from the laws that are created and that are supposed to protect communities? In what ways are the gaps that exist within these laws and policies being exploited, and by whom? And what can be done to remedy this, beyond putting the onus on individual activists or communities to call for accountability? I think it tends to be about finding that balance.

And then, also, of course, again, I like to circle back to the role that civil society plays, which is bridging the gap between those who should be accountable and held responsible and those who experience great harm as a result of these gaps being exploited. 

>> RAASHI SAXENA: Thanks, Roselyn. And now that we are on the topic of hate speech, we also have Yohannes who has joined online. Yohannes, can you hear us? 

>> YOHANNES ENEYEW AYALEW: Yes, yes, I am hearing you. 

>> RAASHI SAXENA: Okay, perfect. Perhaps you can introduce yourself to the group and also talk a little bit about how online hate speech is one of the most pressing challenges in causing offline violence. What is the interplay between online hate speech and the violence that leads to mass atrocities in your region? 

>> YOHANNES ENEYEW AYALEW: Hello, everyone. My name is Yohannes. I am currently a candidate at a university in Australia, with research on Internet freedom in Africa. To answer the question, the impact of hate speech has been profound in the context of the wider Ethiopia, and even outside it. In the recent UN report findings, the role of hate speech was significant, especially where platforms were unable to moderate content that is, you know, propagated by the supporters of both (?). So, the hate speech in the context of the war in northern Ethiopia, which basically covers three regions, was profound.

In addition to that, when we report to platforms like Facebook, their response is very slow; even at times when calls for immediate violence were reported, they didn't even reply to our reports. Sometimes we're forced to write them emails and follow up afterwards. So, I think platforms should double down on their content moderation when it comes to hate speech as well as violent disinformation. I will add more later. Thank you. 

>> RAASHI SAXENA: And now we will move on to Santosh, who is one of our Steering Committee members and will also talk a little bit about the work they do from the Asian perspective. He is an important member of Digital Rights Nepal. Santosh, over to you. 

>> SANTOSH SIDGEL: Thank you. Thank you, Raashi. This is Santosh Sidgel from Digital Rights Nepal. I'm a member of the IRPC Steering Committee as well. First of all, I will briefly talk about the translation that was shared with you all, the Nepali version of the IRPC Charter. While working on the translation in Nepal, we felt that this is not only technical work; it is a process of knowledge‑building, capacity‑building, and working with different stakeholders. In the process of translation, we collaborated with different stakeholders, including offices of the Nepal Government, different civil society organizations, and the technical community, so it was a collaborative process, not just a technical translation. We discussed the concepts of the rights, the principles, and the values. And, on the technical side, many of the words were not readily available in Nepali. For example, for governance, there was not a single equivalent term, so we explored it.

So, I recommend further translations as well, because it is not only a translation; it is a process of bringing the discussion about Internet values and digital rights into the country. That is one important part.

Another part is about accessibility and exercising these rights in the country, from the perspective of Nepal and South Asia. I think we have to categorize the different aspects. One is infrastructure. Another is capacity. Another is the laws and policies that enable or control access. And in between, we also have to think about the nuances of which community we are talking about. Just like in Africa, in South Asia, in Nepal, we have more than 120 languages and ethnic groups, so we have to talk about that. The gender issues are there as well.

As I think Neema said earlier, those people who are already vulnerable offline will be vulnerable online. I believe that the online world is just a replication of our offline world. All of the violence that we see in the offline world is replicated in the online world. So, working only online or only on digital rights will not be enough; we have to embed it in the offline world as well. So, that is my reflection. Thank you, Raashi. 

>> RAASHI SAXENA: Thanks so much, Santosh. Now I want to move on to the right to access the Internet. Based on the panelists' experiences, what do you think can be done, or avoided, to promote digital inclusion? Shall we start, perhaps, with Victor? 

>> VICTOR NDEDE: Okay, thank you. Before I get to your question, Raashi, there's a very interesting point that Honorable Neema brought up, which was that we should, of course, get platforms and the tech sector to really pay their portion of taxes. The distinction I wanted to make was in relation to end‑user taxes, not the taxes on the profits that these tech companies make across the continent. It is more about looking at the tax practices that actually hinder uptake of these services.

And one of the things we see is that we have various taxes. When you purchase the device, there's a tax levied at customs that already pushes the cost of that device up. SIM cards attract a certain tax. The service itself, whether it's a data plan or whatever plan you have, attracts a certain tax. And then you come to the operators, who have to pay their taxes in terms of their corporate taxes and their network and licensing fees, which may not be a problem. But the key concern for us is the user taxes. Because if the mobile operators' taxes go up, a portion of that tax is passed on to the consumers, which means that adoption of the service really slows down. And normally, the people who are most penalized by this are the low‑income earners, who are already marginalized and are now pushed further into that marginalization.

And equally, if you look at taxation on the side of the operators, if it is deemed too unfavorable to operate, then operator margins would reduce. They would have limited funds for reinvestment or investment in better technology, which would mean either the quality of service is affected or network expansion is limited, which again affects those low‑income consumers who are already marginalized. 

So, it's more or less, how do we strike a balance? Everyone should pay their fair share of taxation, as the continent needs to expand the tax base to lift ourselves and increase our GDPs, but how do we do that in a way that affords everyone an opportunity to be included on digital platforms?

And so, to come back to your question, Raashi, when we talk about digital inclusion, I think of the four things I mentioned being done purposefully: we look at increasing our infrastructure access; we look at how we tax the sector; and third, we look at the consumer barriers, which are digital literacy and people just being aware of what the Internet is, because a lot of work has not really gone into helping people appreciate how much they can get from the Internet, so that the average smartphone user would probably be on social media platforms and betting sites, and that's it, and yet there is more to achieve on the Internet over and above those two things.

And lastly, of course, the content. We really need to localize our content to the level where a majority of the users within the continent are able to understand what is really happening on the Internet.

If we are talking about access and participation as enablers of digital rights, then look at the rights that people now exercise online. In the context of the continent, and maybe I speak more for Kenya, where I come from, a majority of the rights that people actually exercise online start with access to information and freedom of expression, which means getting information and using that information to express themselves. 

We now see more of our protests happening online. We now see a lot of people exercise their freedom of association online. And if the content that is there for them to understand, say, for example, conversations around a certain protest or a common cause that brings people together, is not in a language that they understand, not their first or second language, then basically you exclude a majority of those people from participating properly in that protest or in that association, because it becomes difficult for them to participate. 

And so, it has to be a wholesome approach, and it has to be very intentional in terms of how we want to ensure that everyone has access and everyone is able to participate on digital platforms, so that everyone then gets to enjoy their rights online. Without that, we will move forward with a certain group of people while a majority of the population, those who remain rural, those without access, and the marginalized communities Roselyn was talking about, are left behind, refugee populations, for example, who still have very limited access based on where the camps are placed and what level of infrastructure is afforded to those places. So, I think those would be my comments. Thank you. 

>> RAASHI SAXENA: Thanks. I was just reflecting on what you were saying. In India, we have a saying, (non‑English word), which basically means food, clothing, and shelter, and we have added a fourth dimension to it, data, needed to be able to enjoy those aspects. Neema, based on your experience, what would you propose to promote more digital inclusion in the Pan‑African context? 

>> NEEMA LUGANGIRA: Thank you. I think the first thing would be with the social media platforms themselves, in the sense of the people on their side who are part of the process of reviewing content whenever one reports any sort of abuse or anything like that. I'm still not sure how they decide that this is abuse and this is not, especially in the context of different countries' cultures. For example, I have had the personal experience where you have been downright abused, but it's written in Kiswahili, so you report it and try your best to translate as closely as possible what that statement means, and then the response you get back from whoever is on the other side is that this post did not violate any of our rules. If we want to manage the issue of content moderation better, I think that's one of the things we need to start from. 

Because something that is said in Tanzania, even if it's in Kiswahili, may have a different meaning or understanding if you translate it to Kenya. It may not be abusive to them, but in our context, it may be abusive to us. How do the social media platforms then take these differences into account? Because it seems as if they have just one standard, a kind of blanket across all reporting. So, I think that's where we need to start from. 

That brings me to the next point: perhaps there is a need for their own teams to have a wider representation of these different contexts. So, that's on two fronts. But then, on the other side, as a parliamentarian, I think sometimes legislation is needed. I know I'm in a room where people may not like to hear about legislation, but sometimes legislation is needed to put in place a baseline, because at the moment, online abuse, hate speech, and online harassment are not clearly defined. They are left in a gray area. So, maybe to remove that gray area, there needs to be some kind of legislation.

Now, the risk is that oftentimes, when civil society is having such discussions on human rights, online human rights, et cetera, they will have these discussions in a silo as civil society. Rarely would they engage parliamentarians in those discussions. And then we as parliamentarians will also be discussing, on our own, how to protect citizens, how we're going to protect girls, children, women and all other vulnerable groups, how we're going to protect people with disability from online abuse. 

So, then what happens? We'll have draft bills in place. And every parliament will announce that there's a bill coming and invite people to comment. What I've seen in a lot of countries is, not everybody takes that up. Very few take up that opportunity and give comments on those draft bills. Others will continue sitting on their own, discussing the draft bill without inviting parliamentarians to that discussion. And in the end, what civil society tends to do, once they have discussed the bill and have their comments, is come up to you as a parliamentarian: "Excuse me, Honorable, we as civil society have sat, and these are our comments. Can you please look at them and make sure you take them into account?" Is that participatory? 

Even if it were you, in your own office or your organization, and someone came to you and said, "This is what we've done and this is what we need you to do," how would you feel about it? Are you going to take it up? Most likely, you won't. So, I think we need to change the way we engage, because parliamentarians and civil society are supposed to be on the same side and complement each other. Civil society has access to research and funding; they can do research, they can do lots of studies, and that can be a way to capacitate legislators. But depending on how we approach each other, that's where the gap sometimes arises. So, what I would like to say is that civil society should strengthen engagement with parliamentarians, and at the same time, when there is a call for comments or feedback, everybody should take it up, because sometimes we tend not to take it up, and then later on we start complaining. 

But if there is engagement, and I can take myself as an example, I am probably one of the most abused women online in politics in Tanzania, so naturally I would probably want the most stringent rule to make sure that doesn't happen. If there is no engagement between us, I will continue working from my perception, which is based on my experience. But if there is engagement, perhaps I can learn of better ways to tackle and curb that, better ways to have legislation that will not hinder participation or so‑called freedom of expression, but at the same time protects against the online violations that hide under the blanket of freedom of expression. So, my call is, let's strengthen the collaboration. And that is specifically why we started the African Parliamentary Network on Internet Governance, to strengthen the role of African parliamentarians in digital development in Africa. And digital development cannot happen if the online space is not safe.

You know, I've seen women who have online businesses selling lingerie, and the abuse that they get is horrific. But you see males having a business selling female lingerie, and they get claps. So, there you go. If we are to achieve true digital development, we have to make sure that the online space is safe for everyone. Thank you. 

>> RAASHI SAXENA: Thanks, Neema. I do believe that now we are going to move on to the audience Q&A. So, just wondering if anyone has any questions. Santosh, if you could help us. We have three hands and we also have two mics. Oh, there's a third mic? Okay. So, I ‑‑

>> MINDA MOREIRA: Can I also interrupt? Raashi, we also have a few questions in the chat. So, after the questions from the floor, I can let the online participants pose the questions, too. 

>> RAASHI SAXENA: Go ahead. Please go ahead. 

>> AUDIENCE: Thank you. I am Abdu from Ethiopia. I agree with the parliamentarian from Tanzania when she said, in my words, that the Internet or cyberspace is not a no man's land; it needs regulations and policies, yes. But the problem, as I think the Kenyan presenter said, is that exaggerated taxation, as on luxury goods, may affect accessibility. So, how do you see that? Yeah, thank you. 

>> RAASHI SAXENA: We can have one more question? Maybe we can answer this question and then we move on to the next question? Yeah. 

>> NEEMA LUGANGIRA: Now, on the issue of taxation, I think what's needed is for us to package our argument well. Because if you say taxation on mobile devices or mobile‑related issues is limiting access, then I will ask you, and this is me playing devil's advocate as a legislator: what about petrol? What about food? What about other areas that are being taxed? So, I think we need to strengthen the argument on, in what way is it limiting participation, and come up with solutions: what then should be done? Because a blanket "remove taxes" is probably not going to be an easy sell. But if you package it better, then I think it can be more easily understood, because all other sectors are being taxed. Why shouldn't the digital sector be taxed?

I can give an example from Tanzania. I think earlier this year, or sometime last year, I started a discussion on Twitter asking, why should an online business pay taxes? Because if an offline business, which has a store, is due to pay tax when it reaches annual revenue of 4 million Tanzanian shillings, then why shouldn't an online business that reaches the same 4 million Tanzanian shillings also pay tax? So, we need facts that can strengthen our argument. I think at the moment, that's missing. Just stating the point is not concrete enough, so let's find ways to concretize this argument.

I agree with you. I come from a (?) region and I understand it, but I myself am not an expert. So, if we can get those facts, those concrete arguments, then as legislators we will be able to advocate for the same. So, it is upon you to help us get those concrete facts. Thank you. 

>> AUDIENCE: Hello. My name is Irene. I am from Kenya. I've really enjoyed this conversation. It's very diverse. So, I want to bring in ‑‑ we are talking about accessibility and inclusion. 

So, I want us to also look at it from a different perspective, because 15% of the African population lives with some form of disability. When we talk about accessibility, for us it means: how is all the content that's online designed and developed? If I'm blind, am I able to understand the image that you've posted online? If you're doing a human rights campaign, can someone who's blind or someone with a reading disability access all the content that's online? Because then we would be talking about real inclusion. Otherwise, we are leaving 15% of the population out. 

The best part about this is that the World Wide Web Consortium has a set of digital accessibility guidelines that content developers, designers, everyone can use. So, let's design and develop content with people with disabilities in mind. Thank you. 

>> RAASHI SAXENA: Thank you for your comment. 

>> MAUREEN MWADIME: Thank you so much to the panel for the informative session. My name is Maureen Mwadime and I work for the Kenya National Commission on Human Rights, a constitutional commission established in the Constitution. 

From my end, a very interesting discussion. Picking up from what has been said about taxes and affordability, I think we really need to think about regulating the commercialization of the Internet, especially as far as e‑learning is concerned, and, as the honorable member was saying, we need to be very specific about the agenda that needs to be addressed. So, what we need is to find that delicate balance between business and human rights, and also establish what government incentives can be given to incentivize businesses to lay networks all the way to the rural areas.

The other input that I have is on the barriers that were discussed earlier around digital literacy and access, in terms of Internet connectivity infrastructure. I want to mention the bit about electricity. We always have this last‑mile aspiration, and we tend to forget that not all of us are connected to the grid, so it's a conversation that we need to broaden so that we can make the aspirations really practical.

The other thing is on sustained public conversation, especially with those most impacted by digital technologies. Most conversations happen in urban areas, in closed spaces such as parliaments and boardrooms, and not necessarily in those far‑flung areas where the vulnerable and marginalized groups are really impacted the most by digital technologies once they are rolled out.

On online violence and harassment, I totally agree. We've had several cases that have led to (?), especially around the just‑concluded election.

And the other thing that I didn't hear mentioned is the lack of coordination among tech governance actors and the lack of capacity of oversight mechanisms to effectively discharge their functions in digital spaces. We are a national human rights institution. There are sisters and brothers here from other NHRIs, formed just like we are. Unfortunately, we are always left out of the conversation and come in at the very last stage, when the violations have occurred and need to be remedied.

Then there are overlaps and a lack of clarity on the functions of institutions set up, for instance, to monitor hate speech. In Kenya, we have the Communications Authority of Kenya and the National Cohesion and Integration Commission. There needs to be clarity on who is doing what and what the threshold is when it comes to hate speech, so that, again, self‑censorship does not apply.

And lastly, there is a lack of trust across the stakeholders. We really need the players in the ICT ecosystem to understand and appreciate the kind of leverage that each player has. Each player has very different strengths, and I think we can all leverage that to make this space enabling in terms of realizing the rights of each and every person within Kenya, and of course, beyond. Thank you. 

>> RAASHI SAXENA: Thank you so much for your comments. We have a hand up there. 

>> AUDIENCE: Okay, thank you very much. I'm (?) from the Democratic Republic of Congo. I would like just to add something about taxation. You know, many countries on our African continent are very ready to create imaginary taxes, because yes, they really want to increase their GDP, but they are not ready to agree that we need an alternative financing model based on models adopted in their regions. So, I'm trying to ask myself, why? And what are the intentions behind this?

Yes, we don't really need new legal frameworks, because in some of our countries there are already policies stating that public authorities have to provide the population with electronic communication services at an affordable price, regardless of (?) geographical location in the national territory. In my country, we call it the principle of universal service. But recently, civil society in my country has been fighting so that an imaginary tax that had been created could be removed. The tax was called RAM; it was a tax on mobile devices. When we tried to analyze the context behind the creation of that tax, the tax was not really illegal, it was just an imaginary tax, and finally, we were able to get the tax removed. Also, some studies that have been carried out have been able to prove that taxation is really a hindrance to digital inclusion in African countries.

And yes, in conclusion, I would like to say that I strongly agree with the idea from the UN Secretary‑General that when our countries prioritize taxation and really want to make profit at the expense of people, we are leaving ourselves with an incomplete picture of the true cost of economic growth. And let me tell you, the true cost is simply digital inclusion. Thank you very much. 

>> RAASHI SAXENA: Thank you so much. I'm going to move to an online question, and I also want to keep a count of how many hands we have. We have only five minutes, so we suggest you keep your comments and questions short. So, there's one question, and then I'm going to hand it over to Neema, who has to leave, so she will wind up, as she has some comments on the points that were just made.

This is a question coming in from an Advocate of the Supreme Court of Bangladesh. He asks: what impact can access and participation have as enablers of digital human rights in a developing state like Bangladesh? Santosh, perhaps you want to take that? 

>> SANTOSH SIDGEL: As was said earlier, there are multiple stakeholders in the ecosystem, and civil society is one part of it, and we have been working to promote and protect digital rights. We have been working with legislators and members of Parliament, parliamentary committees, the Nepal Telecommunications Authority, which is the regulator, and with end users as well. So, in Asia, as a civil society organization, we have a very important role in promoting the discussion and in influencing policy‑making processes, where we can put forward the best rights language and at the same time enable the kind of environment where other stakeholders can also find their voices. So, it's a delicate balance between the (?) and the process of negotiation in policy‑making to enable digital rights, I believe. But sometimes it's not easy. 

For example, the technology bill in Nepal has been in Parliament for the last three or four years, and it has not moved forward because civil society is saying that the bill will curb freedom of expression and really have an impact on privacy. There is no data protection law, and the government is trying to push that bill, so that is there. But I believe that, with digital rights principles, we have the basis to promote access and participation in the digital space, despite all the challenges we have. Yeah, thank you. 

>> RAASHI SAXENA: Neema, would you like to give your comments, and also, what role do you think stakeholders and groups should play to ensure that individuals can obtain appropriate redress for human rights violations online? 

>> NEEMA LUGANGIRA: Thank you. And thank you for this very vibrant discussion. Very quickly, to conclude. On the issue of taxes on mobile phones, I think all groups need to be held accountable here. I can give an example from Tanzania. In Parliament, actually, it was last year, we pushed for taxes on mobile devices to be removed, and they were removed. For a year, government followed up to see if the prices would drop. The prices didn't drop. So, the following year, the tax was put back. Now, what argument do I have as a legislator to raise the issue again, if civil society and legislators pushed for something, it happened, and the private sector did not follow through? So, when we are looking at these things, let's not just look at government and Parliament; let's also look at the private sector. What are they doing? All of us have to play a role on this issue.

And as a concluding remark, I think that for us to strengthen the status of online human rights, all stakeholders need to discuss it together. That's why it's very important in such a discussion to have legislators there. Excluding legislators creates a risk of legislation being drafted and passed which, perhaps, may not be aligned to the ecosystem. So, I leave that to all of you. If you want to engage African parliamentarians, the African Parliamentary Network on Internet Governance is there if you want to participate. But if you choose to do it on your own, later on, do not point fingers back at us, but rather point the finger back at yourselves. Thank you. 

>> RAASHI SAXENA: Thanks, Neema. We have hands up, and we'd request everyone to keep their comments and questions under one minute, because we are heading into the last ten minutes of our session and we still need concluding remarks from our panelists. 

>> AUDIENCE: Yes, thank you. It's so sad Honorable Neema is leaving us; I wanted her thoughts on this. For example, in Ghana, when the Internet boom came ‑‑ anyway, my name is Joshua from Ghana ‑‑ it came with a lot of promotions for social media. Even right now, while there is a lot of digital tax in Ghana, you still have an option like Facebook's data saver, where if you have less data, you can still use Facebook. I'm thinking, can we tax the social media sites more, because there are also statistics that a lot of Ghanaians spend an average of five hours on social media doing nothing productive. 

Just 2% of Ghanaians on the Internet use it productively. So, how can we then ensure that it becomes quite expensive to waste time, so to speak, on social media, whereby we reduce wasteful use or give incentives for using it productively, like online education sites and other things that would build Africa? That way, we're getting our taxes, but we're also not limiting access; in fact, we're improving it for productivity and for longevity. 

>> RAASHI SAXENA: Thank you so much. We move on to our next comment. I see someone here in the front. Is there anyone else who has a question? You have a comment. You have a comment, too? Okay, then. 

>> AUDIENCE: Okay, thank you. I am Hannah, and I have a question for all of you. As we see, cyberspace is becoming a prime instrument of human rights violations, but what about its role as a stage for exposing human rights violations in the world? That role seems to have been neglected, and we see massive human rights violations everywhere in this digital world. I think we are not able to see those violations clearly these days, because of economic and social devaluations, fake news, fake reports, and agendas covered by other agendas, even as the work of governments and other organizations is being exposed and we are seeing huge human rights violations everywhere. 

And as Africans, who will be part of this digital world in the billions in the future, I think we have to be more ready to see those human rights violations come out everywhere, in every place. So, the digital world is not only an instrument for violating human rights; it is also a stage that exposes those violations, with all those agendas being brought to light. So, this is a concern I would like all of you to think about. 

>> RAASHI SAXENA: Thank you. Minda has her hand up. Minda, go ahead. 

>> MINDA MOREIRA: Yes. I'm just aware that we are really reaching the end of the session. We still have one question here in the chat that you might want to read and pose to the panel. And yeah, that's all I wanted to say right now, because we will have to start wrapping up in the next two to three minutes or so. 

>> RAASHI SAXENA: Absolutely. So, the question, which I believe I missed, says: What do civil societies think about hosting a high‑level meeting like the IGF in countries that have weaponized the Internet and communication to violate human rights, as in Tigray, Ethiopia? Do any of the panelists want to take this question? Yohannes, perhaps you want to take it? 

>> YOHANNES ENEYEW AYALEW: Sure, of course. I think when we think of violations in Tigray, we have to look beyond Tigray; the conflict was not only in Tigray, and as Access Now and other civil society reports show, at least 10 million people were affected by the Internet shutdown. That means that whenever we think of the Internet blackout, there are accusations and counteraccusations between the government and the Tigray People's Liberation Front. Initially, when the conflict broke out in Tigray, the government, especially the telecom operator, brought, you know, evidence and also videos to show that the Tigrayan forces cut the Internet in Tigray. The other side, in turn, accused the federal government of depriving the people of Tigray of access to the Internet.

But for me, as the problem lingers in three regions, we need to problematize beyond Tigray, so the government in the future, as well as in policy and laws, especially in digital policy and in the (?) plan, needs to work hard, especially to end the practice of Internet shutdowns. And also, this is an opportune moment for the Ethiopian government to reflect upon its past practices, and in the future to take into account the digital rights of people not only in Tigray, but also in Afar as well as other parts of the country; especially in the Welega area, in the western part of Oromia, there is also an Internet blackout at the moment. So, we need to think beyond Tigray. So, in this climate, I would say that whenever advocacy is made on these kinds of issues, I know, I understand, you know, the pain that the Tigray people have suffered in the past two years, but the people in Amhara have suffered the same thing, so we need to contextualize the issue beyond Tigray. This is my response. Thank you. 

>> RAASHI SAXENA: Thanks, Yohannes. I believe we now need a one‑minute takeaway from all our speakers, as we are running out of time, and then I will hand it over to Minda and give my concluding remarks. So, perhaps we could go online first. We will go with Catherine. What would your key actions and recommendations be for all stakeholders to ensure that we have a free, open, diverse, and human rights‑based environment online? 

>> CATHERINE MUYA: Thank you. So, I think I'll start with the legal and regulatory framework that Neema was talking about, and I think there are a lot of lessons that we can learn from across Africa as a whole. I gave a talk at the Namibia IGF last year, and I was telling them about our universal access regulations and our infrastructure‑sharing policies that deepen connectivity and network infrastructure, which is really key to promoting accessibility. So, I think that learning from each other, really learning about our different regulatory frameworks, is key.

On the issues around taxation, I know for a fact that in Kenya, we haven't agreed to the ongoing taxation framework that's been promoted by ‑‑ or rather settled on by ‑‑ the OECD, mostly because Kenya has its own taxation framework. But I think it's something that requires constant conversation, and it also requires looking, as she was saying, at stakeholders and how they're implementing different things.

But when it comes to regulation, we have a lot of cyber regulations and cybercrime laws in eastern Africa around cyber harassment, but I think the biggest issue comes with the implementation of the legislation. So, the legislature will make the law, but it will not actually be implemented in a way that makes it effective in dealing with online violence or online harassment. Instead, it will probably be used to harass media practitioners, bloggers, and others, and yet the people who face harassment really don't have a clear way of accessing remedies, as I alluded to before. So we need to design regulation in a way that's effective, and that means we need to constantly dialogue with policymakers, as she was saying. But then, that work for civil society also has to be supported.

And so, my final call is to donors as well, because sometimes when we present projects that have to do with law and policy and things like this, they assume that it doesn't relate to social change, but it actually does, in terms of creating a very sustainable environment for free expression and sustained discussions with policymakers. These don't immediately reach an outcome in three months as other projects would, but they do produce sustained and long‑lasting solutions. So, I think that this work for civil society also has to be supported. 

>> RAASHI SAXENA: Thanks, Catherine. Now we move on to Roselyn. Keep it within a minute. 

>> ROSELYN ODOYO: I like what Honorable Neema said when she talked about a multi‑stakeholder approach, because the truth is that any sustainable change tends to happen where there's collaboration. But more importantly, because we are talking about human rights‑centered approaches, this collaboration needs to be human‑centered in the sense of ‑‑ I think somebody mentioned this earlier when talking about taxation ‑‑ not letting governments put profit over people. 

And so, how is it that civil society, for example, can engage in a way that's intersectional and keep government accountable? How is it that government can engage? Because every power‑wielding entity is susceptible to abuse of power, right? So, how is it that government can then also learn and engage in spaces and in collaboration with civil society to find out what is harmful? Because we've been talking about taxation, but there are also harmful taxes, like the social media tax imposed on, let's say, Ugandan citizens, for example. So, how is it that we can find the balance? How is it that we can work with big tech and telecommunications companies on infrastructure as a pathway to access for people in places where infrastructure is an obstacle? 

And so, I believe that if we are able to work from a place of intersectionality, from a human‑centered place and driven by people over profit, then access is achievable. 

>> RAASHI SAXENA: Thanks, Roselyn. Victor? 

>> VICTOR NDEDE: I think, in the interest of time, I'll just underscore the importance of a multi‑stakeholder approach that has all the actors in the space collaborating to ensure that there is access, from the private sector to the government, the executive and legislators, and even civil society, because the key barriers to access actually involve all the key stakeholders. Thank you. 

>> RAASHI SAXENA: Thanks, Victor. And I'm going to pass it on to Minda, unless Yohannes has a few words to say, but...

>> MINDA MOREIRA: Thank you. 

>> RAASHI SAXENA: Go ahead, Minda. 

>> MINDA MOREIRA: Yes, thank you. Yohannes, would you like to say anything else before I close the session with Raashi? 

>> YOHANNES ENEYEW AYALEW: Well, I have two points: one on the practice of Internet shutdowns, that is, on access; the other on content moderation. Regarding Internet shutdowns, I think in Ethiopia, over the past ten years or so, the practice of Internet shutdown has been severe, but especially in the context of the war in Ethiopia, the Internet has been blocked in three regions, not only in Tigray. So, whenever these kinds of forums are organized, they are an opportune moment for the government to reflect more, especially on keeping the Internet on, and also on enabling individuals who were deprived of access to get better access to the Internet and digital technologies.

Regarding content moderation, I think social media platforms should work more on the issue of linguistic diversity, and there is a language (?); especially when we were flagging content that causes, you know, violence offline, platforms like Meta were not responsive. So, we need to work more on that.

The last point is that they were also monetizing, you know, from violence. They should think about, you know, deploying artificial intelligence, and especially they need to train these artificial intelligence technologies on local contexts such as Ethiopia. In Ethiopia, we have 92 languages, so whenever discussions like the Internet Governance Forum are held, we need to talk about, you know, how these platforms are responsive to the issue of linguistic injustice. And in countries like Ethiopia, where, you know, things are understood along ethnic lines, we need to think beyond ethnicity and tribes online. Thank you so much. 

>> MINDA MOREIRA: Thank you, Yohannes. I would just like to say thank you to everyone on this great panel. It was a brilliant panel, and we have learned so much from this discussion, and it is something that we will take with us for the work that we are continuing to do at the IRPC. I just wanted to remind everyone that if you are part of our mailing list, we will have our elections for the steering committee soon. So, if you would like to be more involved, just feel free to apply for one of the positions available.

And finally, I would also like to thank Raashi for the great moderation, and our technical team and our captioner today, who I think is Ellen. I will see you all around at the IGF, and hopefully also next year in Japan. Thank you. 

>> RAASHI SAXENA: Thanks, everyone. The session is now closed.