IGF 2022 Day 3 WS #502 Platform regulation: perspectives from the Global South

The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

(Speaking in language other than English, no translation). 

>> LUCIA LEON:  We can start in English if the others are okay with that?  The other panelists?  Okay.  I can certainly repeat the first question in English so that we are all on the same page ‑‑ for the audience maybe.  Okay.  We are trying to address how the content moderation policies and practices of major digital platforms impact different regions of the globe in different ways, what regulatory principles for content moderation processes can better protect Freedom of Expression and other fundamental rights in diverse legal systems, what the challenges are for the regulation of large global digital platforms, and what the proposed mechanisms to do so are.

We'll start with Raul.

>> RAUL ECHEBERRIA:  Thank you to the organizers, hello to the colleagues on the panel, especially those connected remotely.  Hello, we have met in the past in Brazil.  He has done an excellent job with Brazil's general data protection law.  It is good to have him on the panel with us.

Since we are supposed to talk about the perspective from the Global South, I can't speak for all of it ‑‑ I can speak for Latin America, and I think the situation is similar in the majority of the world.  Fortunately, we now see a change of expression in international fora, from "Global South" to "the majority of the world", which is much more representative of what we're talking about.

There are plenty of issues ‑‑ I would like to focus on the principles for regulation and the things we can do in this field.  We see a lot of regulation proposals, law proposals, and sometimes they conflict with Human Rights in developing countries.  This is something very common, and the proof of that is that, while we often have disagreements between the private sector and Civil Society, most of the times that we discuss law proposals in Latin America we're on the same side, because of this point.

This is related, but besides the Human Rights issue, there is another one: the conflict with development objectives.  This is what differentiates, or should differentiate, our regulatory approach from other parts of the world.  We're usually presented with the existing models, and we know the U.S. model is a business‑driven model; I'm not saying that's a bad thing, it is the way they present it.

We have the Chinese model, which is also business driven but with a huge participation of the state in the implementation of policy.

We have a third model, the European model, a model that is based on the protection of rights.

None of those models really represents our interests.  The protection of the rights of people, especially in Latin America, the region I can speak about most properly, is very important for us; it is essential in our culture.  But at the same time, and there is no trade‑off possible here, and this is something we have to understand, we also need the policies to be compatible and in alignment with our development objectives.  This is something different, for example, from the European framework.

Having said that, and in order to be efficient with the time, I think the law and legislation proposals should pass a test: a test on Human Rights, among others, and a test on what are known as the Internet invariants, those properties that are key to keeping the Internet a valuable asset for humankind: global reachability, integrity and no fragmentation, the latter being one of the objectives the Secretary‑General of the UN has proposed in the Global Digital Compact.

The way we can run those tests and be sure that we pass them is, first of all, to have the participation and involvement of all stakeholders from the beginning of the discussions.  This is a way to anticipate problems.  Sometimes what happens is that somebody, a team or a lawmaker has been working intensively for a year on the design of a proposal, and it is only when the public discussion starts that we realize the proposal has problems; we could have been more efficient if all stakeholders had been involved in a proper manner from the beginning of the discussion.

Efficiency is not a minor thing.  In the Global South, in Latin America in my case, we don't have an abundance of resources, and many people think that developing policies is free, but it is not; it is very expensive in terms of the time and energy we put into it and into the implementation.  A multistakeholder model would not only permit better quality laws, but would also allow us to be much more efficient and to have more time to spend on important things.

One more minute.

Besides what I already said, regulation is not the only answer to every problem.  We need innovative public policy instruments.  I heard this morning something I always say myself, and I felt vindicated: it is not only about regulations and laws.  We have to be innovative in the ways we develop public policies, and we need new instruments like international sandboxes, which I think is an innovative concept; maybe we could work with two, three, four countries with the commitment to analyze and evaluate different kinds of policies.  We also need agreements and good practices that could be adopted voluntarily by governments, private companies and Civil Society.  Those are new instruments and new mechanisms, and I'll speak about that in my next intervention.

Thank you very much.

>> LUCIA LEON: Thank you.  Lillian Nalwoga.

>> LILLIAN NALWOGA: Yes.  We were discussing how content moderation policies can impact different regions in different ways, and also what measures and principles can be applied in order to better protect Freedom of Expression and other rights across the various legal systems.

>> LILLIAN NALWOGA: Thank you.  Thank you once again for inviting me to participate on this panel.  First of all, I was a bit lost in translation.  I think I should add Spanish to my list of languages.  Like Raul Echeberria has already shared, speaking from Africa, we're seeing quite huge investments from digital platforms, and indeed where there has been a huge presence, we have seen some push back from governments on how the platforms are operating.

Of course, there are the aspects of violations, Human Rights violations, which Raul Echeberria has already alluded to, the issues of data exploitation, and Freedom of Expression when it comes to how the platforms are enabling rights violations, like enabling misinformation on the platforms, and how that is regulated.

Several scenarios have come out.  In many cases we have seen the platforms, without mentioning names, contributing by enabling start‑up companies, digital hubs, that kind of thing.  But there is also the issue of labour exploitation.

The question we ask then is about regulation: how can we regulate the platforms?  We have seen some governments go to the extreme with blockages, you know, blocking, shutdowns, targeted social media blackouts.  And again, just as has happened in Europe, where the companies have been pushed to register, to have registrations and offices, this is something that governments in Africa need to look into.

One, we have said that most people experience the Internet through social media, and as more and more people connect, it will probably be through Meta related platforms, Facebook, WhatsApp, before they go on to search, to research on Google.

When you look at these aspects, at the violations and the moderation that takes place, there is at times a conflict over how much can be done with the platform.

I think that, talking about regulation, one question is whether we can push for registration, for having offices in the countries where the platforms operate, because that would address the jurisdiction issue: in case of violations by the platforms, where do we go?

In Uganda, where I come from, we see issues of violations and, at the other extreme, cases that are genuine where people hide behind the platforms.  You don't know where to report a case.  If you take a case to law enforcement, there is the jurisdiction issue; you have to go all the way to Ireland to pursue it.  Even when there are genuine crimes happening, there is that issue.

I think, in terms of how we enhance our experience on these platforms, the first thing we can look into is pushing for physical offices for the platforms in the countries where they operate.

Another example: there was a case in Kenya ‑‑ I will say the name, Facebook, because it has been researched and reported ‑‑ a case of exploitation of content moderators.  Facebook hired a third‑party organization in Kenya to hire content moderators, there was a lot of labour exploitation, and these people could not sue Facebook because Facebook is not legally registered in Kenya.  When you look at issues of labour rights, data extraction and exploitation, I think this is where we should be pushing for regulation and registration of offices; that gives us a jurisdictional handle when cases like this arise, making the platforms accountable in the countries where they operate.

Thank you.

>> LUCIA LEON: Thank you for that.

We also have the participation of Paula Martins who is online.

>> PAULO VICTOR MELO: (Speaking language other than English).

‑‑ when we talk about regulation of digital platforms, we have to understand the structure that supports all of these platforms, where they are based, what they are sitting on, basically what some authors call digital colonialism.  This is the basis, I guess, for democratic regulation of the platforms.

The first thing I would like to draw your attention to is that these platforms should not be a stage for governments committed to fascism, for example.  I don't know if this is the best place for us to talk about this, recalling that, although the IGF is receiving us remotely, it is happening in Ethiopia at the moment, one of the most historical examples of resistance against fascism, when its people went out and kept them from invading.  We have to look at this carefully, keeping hate discourse from spreading.  One of the discussions that happened, for example, was when Elon Musk said, where is my friend Trump, I miss him on Twitter.  The platforms are not a place for friends to find each other, to meet up, let's say; although companies own them and they have shareholders, they are a digital public space, and this public space is not separated from the non‑digital public space.  Everything that happens on the platforms has repercussions in the physical space.

So this should not be a space, a stage, for fascist projects; that does not match a democratic perspective of the platforms.  We are going through the World Cup: if every play were decided only by the ones on the field, the ones playing there, can you imagine the craziness?  What is happening today is that we are not looking at every country's particularities, and authoritarianism, disinformation, hate discourse, political disinformation are really able to permeate and circulate through these platforms.

For example, in Brazil we have some groups who have been acting freely and finding a stage on these platforms to defend coups against the democratic process.  Regulation is really not a fight against the platforms, but a fight for the lives of the peoples, the human beings who have race, gender and belonging to a territory, and who are also part of the platforms.

Thank you.

>> LUCIA LEON: Thank you so much.  We have also now Orlando Silva.

>> ORLANDO SILVA: Good morning.

First of all, I would like to greet everyone here, starting with Raul, also Lillian and Paulo, and to congratulate everyone who is organizing this super important forum.

First of all, I would like to say: to regulate the Internet, is this a yes or a no?  For sure it is a yes.  It is actually extremely urgent, because the Internet cannot continue as a lawless territory where the absence of rules is the rule for everybody.  It cannot be a territory where the rules only support the profit of big tech companies.  We need a commitment, a global commitment, to Internet regulation.  This is something that to me is very important and speaks deeply to me.  Although we need national laws and rules, specific rules for each country adapted to each country's reality, it is necessary, in my point of view, for us to have principles, core concepts, parameters, a plan, a global plan, really international.  I'm sure that the new President of Brazil, starting January 1, 2023, will have within his international agenda a dialogue, a much needed dialogue, on the rules for the Internet and its regulation.

I think this is our starting point for our reflections.

Apart from that, I also think it is much needed for us to build a model which incorporates both public agendas and government agendas.

>> This is the booth.  The audio is breaking up a little bit.

>> ORLANDO SILVA: Bringing the Internet platforms and Civil Society to the table in this global structure, so that we can help structure the rules, global rules and local rules.  This is essential so that we can guarantee Human Rights across the board.

I don't see any possibility for life to happen without a very strong digital presence.  It is our reality, it is a growing reality, and it is not going to stop.  It is more and more present in all of our lives as more connectivity reaches the global population, and the space of society will always have a virtual dimension to it; a risk to Human Rights there is a risk to life itself.  We just got out of a pandemic in which the Internet was supposed to be a place where we could find information, but often it was a place of disinformation, and often that disinformation did not enable life to go on, because the disinformation was causing people to die.  What appears to us as a fabulous model for everybody to be able to speak up brings with it risks to freedom of speech, and we have to have freedom of speech guaranteed.  This means that digital platforms cannot operate according only to their own criteria, but rather according to models that have been set up together, publicly, as a whole.

Here in Brazil, we have a very specific issue: moderation, yes or no?  There are those who defend that these platforms should not have moderation, that we shouldn't moderate any type of content.

In my understanding, not only should we, but we must have moderation.  Nonetheless, we have to have mechanisms so that the user can actually question this moderation, and this moderation has to be done on solid grounds; if, in this case, freedom of speech is really being blocked, people must somehow have a channel to communicate and contest it.

This is an example, when we talk about moderation, of a topic that really demands global parameters, created so that the rules can then be adapted to the specific aspects of each different nation.

Freedom of speech does have a set of parameters, and it can operate as freedom of speech in each different country, but it is unavoidable that we have representation in each country, because laws are never going to be globally equal.  Laws are local, they have local impact, and it is very, very important that we keep this in mind.

Other essential rights, privacy for example, are permanently at risk in a space that has no regulation, or only private regulation without any government or public regulation, which is what we observe currently.

For example, everything is growing at this crazy scale, and that poses risks to privacy.

Another central right I would like to mention: all across the world, these platforms, unregulated, can represent risks to democracy.  The free availability of information and knowledge impacts decision making.  We just finished holding democratic elections here in Brazil, a democracy that is actually being reclaimed, and for sure millions of Brazilians participated in this process and voted; what we saw in the end was omission on behalf of the platforms, affecting democracy.  When disinformation tries to influence decision making regarding voting, we have to have international and national cooperation, uniting big tech and Civil Society, so that we can really make the best of all the wonderful things the Internet has to offer for our personal lives and the economy.

>> LUCIA LEON: For most of us, the first contact that we have is really through these large platforms instead of through other types of spaces.  With this in mind, we have different propositions of how we can look at this topic.  I think Raul Echeberria here was saying that it is not necessarily regulation as we were thinking of it, but rather different types of policies; I can't recall exactly, maybe you can explain.

In the same sense, we may have other more specific suggestions, like elections, for example, which is an important topic I think, and the topic of creating third parties for moderation, and other aspects as well.

Now I would like to continue this talk by asking: how do you believe the regulations that are happening now in the Global North, in the United States and Europe, could impact Africa and Latin America?  What is the local focus that Civil Society in these different regions should promote to guarantee that regulation is aligned with international standards regarding Human Rights?  Raul Echeberria, could you add to this?

>> RAUL ECHEBERRIA: Speaking of public policy instruments doesn't mean that we're against regulation at all.  Regulation is needed in every human activity.

Also, in the digital world there is plenty of regulation.  This is something I would like to clarify: platforms are subject to regulations on consumer protection and on personal data protection, and in fact it was possible to lead that discussion and the resulting law is one of the most exemplary data protection laws in Latin America, and I think also outside the region; it includes a lot of protections for citizens and restrictions on the way data is collected and used.  Data is central to the business model of many platforms and is subject to different kinds of regulation.

There are also regulations on content moderation.  There are a lot of laws around the world, in the region too, to moderate content, to protect intellectual property, to control gambling and illegal activities and many other things.  There are a lot of regulations on that.

The discussion we should probably focus on is how we move forward in improving those regulatory frameworks.  I repeat, it is not only about developing laws but also good practices, and when we pass laws, they have to pass the test, that's my proposal, as I already said in my first intervention.  We should be careful that the laws do not contradict the protection of Human Rights.  Many times laws conflict with the very Human Rights they try to protect, so there are a lot of unintended consequences in the public policies that are promoted.

I agree, I think that we could agree on almost everything that has been said in the first round with regard to the things we want to protect, the challenges that we have and the things we care about.  The problem is that when we try to put that into a specific text and provisions, this is when we see the real challenges.  The way we should move forward is to start getting agreements on general principles and then go down, moving toward concrete aspects.

One thing that is important to keep in mind, sorry, is that the platforms are in between two schools of thought.  One group thinks that the platforms should do much more content moderation in order to avoid hate speech, digital violence and many other things, and another group thinks there should be no moderation at all.

In a democracy, we have to consider all of the points of view, even if we're not comfortable with them.

The platforms are in between the two groups, and this is something that does not depend only on the platforms but on the discussions that society has to have.

With regard to the specific aspects of moderation, and the attempts to solve these issues, one thing to avoid is criminalizing the platforms.  We have to discuss this at the same time that we discuss the responsibility of other stakeholders, like governments.  Governments have a huge responsibility in keeping a healthy public debate.

Recently, the Freedom of Expression Rapporteur for the Americas started an open consultation, an open dialogue, and in that dialogue it was said that the quality of the public debate in the region has degenerated a lot.  The Rapporteur has said many times, I have heard him say it, that we cannot expect a high quality debate in the digital environment if the main actors of the public debate don't perform that way in the real world.

The digital world is a representation of what we do.  We have to discuss regulation of platforms at the same time as we discuss the responsibility of all stakeholders on these things.

Just to finish, as I said in the beginning, we also need new mechanisms, and I think we have to think about what those mechanisms are and where we can discuss these things.  I understand what has been said about the territoriality of laws, but we also need agreements at the international level, and we need to think about how we can innovate and develop new kinds of mechanisms that could be more useful to achieve these objectives.

Thank you very much.

>> LUCIA LEON: Now we move to Lillian. 

The question is how what is under discussion in Europe and the United States will impact Africa in this case, and what regional approach Civil Society in these regions could push forward to ensure that regulations align with Human Rights standards.

>> LIZ OREMBO: Thank you.

Interesting question.  One, I think it has already been talked about by the previous speakers: platforms are becoming very powerful, wherever they are.  What more can we do to get around this?  Like what was said, there is already one side that is pushing for content moderation and another side that is not for content moderation, and what we're seeing in Africa is that the call for regulation is not only about content moderation, which concerns rights, Freedom of Expression, hate speech, misinformation and the like; there is also the issue of the economic value of these platforms.  Like what was alluded to, many people are going to the platforms as a way of marketing their products.

They're using them as business platforms; the platforms are enabling them to get out of the poverty basket.  Again, what we see is that at times we hear, especially from the very big ones, that they do not get that much revenue from Africa, but every day I go on any of the platforms I see advertising.  And what we're seeing now is governments pushing for regulation, but as a way of taxation.  Again, how do we balance this where a platform is not physically registered in a country?

From Civil Society, yes, there is the issue of trying to push for guidelines: if we're going to go into aspects of taxation, how it should be done.

I know that in Europe these platforms do register and pay taxes; in Africa, it is not the same.  However, there are some countries that are pushing for this.  From Civil Society, of course, there is the issue of dialogue, and whatever approach we're taking, whether we're just moderating content or we're looking at the digital economy and taxation, there has to be ultimate respect for rights.  When you're looking at taxation, there is again the issue of exploitation, and with exploitation there is the issue of at what point we get the tax, or get the value for the ads, and at what point the government is not overstepping in terms of trying to control how much the platforms can gain from the citizens within Africa.

I think there is a need to push for dialogue, where, if there are any guidelines or principles to be followed, they have to be within international Human Rights standards.  Again, it is difficult at times.  Even when we have this conversation, it is hard to see the platforms coming to sit with you at the same table.

What I have seen is that there are those that show interest.  Let's say the biggest platforms, like those run by Meta, Facebook, WhatsApp, are doing some bit of engagement, working with different Civil Society organizations, building capacity for understanding how the platforms work.

Again, there are those other platforms that have just decided not to engage, not to mention Twitter, where, you know, an office is opened and people are just fired.  There is that aspect of how we get these platforms to come to the same table, for people to understand how to use the platforms and at the same time not abuse them.

>> LUCIA LEON: Thank you very much, Lillian.

We're running out of time.

I'm going to pass very quickly to Paulo Victor Melo.

>> PAULO VICTOR MELO: Today, I won't take too much time in the second round.

I think, from the point of view of the Global South, it is important that digital platform regulation happens in a way that does not validate authoritarian government projects, and at the same time that these platforms follow certain rules.

I think that having a small number of tech companies means not much pluralism or diversification, so we would have to create competition somehow so that we don't have monopolies.  We have to say that the main digital platforms were able to really overcome the providers from specific countries, which is what has been called platform imperialism.

So we need to combat these anti‑competitive practices and, as mentioned before, to respect standards for Human Rights and freedom of speech, and to assert these rights before the business world as well.  We have to consider that this is a market in continuous development, and also consider the different means of access and levels of connectivity; in the Global South there are specific parts of our population that have very low rates of connectivity and even literacy.  When we consider this, we have to include these agents and participants.

Finally, no process that calls itself democratic can keep people from participating actively in the different places where they are, including the digital one.  This digital environment cannot be a topic only for specialists; it must also involve the population that is directly affected by what I am calling here digital colonialism.

For example, the children who die in the extraction of the minerals that sustain the digital world, the Indigenous Peoples affected by that mineral extraction: how are they going to participate in the regulation of these platforms?  I think these are essential bases for us to actually have a democratic digital environment.

>> LUCIA LEON: Thank you so much.

I now would like to pass the floor to Orlando Silva.

>> ORLANDO SILVA: Thank you.  Thank you so much once again for this opportunity.  I would like to add and to wrap up my participation here with two things.

Like Raul, I see the need for just a few rules, very few rules actually, focused on Human Rights protection and the public interest.

I believe that the rules should point to the obligations of transparency that the digital platforms need, or must have, in order to demonstrate that, yes, they are respecting Human Rights.  That is something I would like to have more time to really develop, but I don't have it.  On the toxic, unhealthy public debate, yes, I do agree with Raul that the quality of the debate is not only an issue of the platforms, but I would like to call attention to one aspect, which is what we call the accountability of public agents: the presence of these leaders, these public spokespeople, who need to be more responsible and actually be liable, suffer sanctions, according to the way they behave.

I mean, their words have repercussion, public repercussion, and they are the main actors here.

Also, to use the categories that were presented here, I think the Global North should be a reference for this regulatory debate that we're having.

Take the German experience in the fight against hate speech: it works on the logic of regulated self‑regulation, in which the digital platforms establish rules following the standards fixed by the law and based on principles, given the nature of the Internet, which is based on constant innovation, so that very detailed rules become old very quickly.  They are just outdated.  That is one of the Internet's characteristics.

I think the idea of regulated self‑regulation can work; it establishes the responsibility of each actor, and the policies are not defined considering only commercial interests but also the public interest that is fixed in the regulation of the Internet.  I think this is a whole different topic that we have to work on, or could, and it could bring together the different regulatory experiences we have seen throughout the world, considering each location's reality and the global one.

Thank you for the opportunity to learn from you this morning.  It is an enormous pleasure to see Paulo Victor Melo and Raul Echeberria and everybody here participating with us.

>> LUCIA LEON: Thank you so much.

Thank you, everyone, for being here, for participating.  Thank you for the regional context and positioning and for really making your own experiences available to all of us.  Also, keep safe.  I think we all have it clear here: the need to respect Human Rights in any type of public policy.  Even if it is difficult to speak about how, about the specific mechanisms, and this is a topic that is far from finished, I think it was very good to talk about it today.

Thank you so much for being here with us during this talk.