IGF 2022 Day 3 WS #458 Do Diverging Platform Regulations Risk an Open Internet?

The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> MODERATOR: Good morning, everyone.  So welcome to the session.  I was just told that one of our speakers, Gbenga, will be coming a bit later in the session so maybe we can start. 

     Alex, over to you for the introductions.

     >> ALEX KRASODOMSKI: Thank you so much. 

     Hi, everybody, good morning from, I was going to say, sunny London.  It's not sunny yet, but I have high hopes.  My name is Alex, and I'm part of Chatham House's Digital Society Initiative. 

     Thank you so, so much to everybody for joining us.  And thank you to the IGF for hosting us.  For those of you who don't know the Digital Society Initiative, we are at Chatham House which is sort of an international think tank.  We work with governments around the world to promote, to protect, and to provision digital technology that is open, equitable and respectful of fundamental democratic values.

     So we're joined, or I'm joined, by colleagues Yasmin and Jackie, who kindly introduced me.  They'll be in the room, and they'll lead today's workshop, a comparative look at global platform regulation, with a superb panel of experts and practitioners. 

     Today's discussion I think marks the first high level discussion that we've held around this critical subject and the similarities and the conflicts in international approaches to digital platforms. 

     These are critical drivers shaping the future of the internet.  And from our perspective, finding a path forward that negotiates these differences, that negotiates these conflicts, finding a solution that is multilateral, that is multi-stakeholder, will be the difference, I think, between preserving and building a global open internet or, unfortunately, its gradual breakup. 

     Now, Chatham House is a committed home for these conversations.  And today's event is as much an invitation as anything else.  As we move into 2023, we hope you will take the opportunity to reach out and to join us in this mission. 

     Now with that invitation, Yasmin, over to you.

     >> MODERATOR: Thank you, Alex.  And I'm sorry to hear that you are joining us before sunrise.  I hope you had a coffee at least. 

     So good morning to all those in Addis today with me and hello to everyone joining us in cyberspace.  Thank you for joining us for discussions today.  Thank you, Alex, for the introduction. 

     My name is Yasmin, from Chatham House.  So I will be giving you a little teaser first, before our stellar panel, regarding our work on global platform regulation and some of the key questions, basically as a warmup to our discussions today and to pick your brains later on.

     So if I could have the PowerPoints brought up, please.  Thank you. 

     So, you know, we have been on a journey over the past 20 to 30 years.  As the internet has grown and evolved into this giant beast that we have today, one that pretty much defines our everyday lives, governments went from the question of whether we should govern technology to how.

     So the shift has changed from if to how, and rightly so, because it's there and it's not going anywhere.  It will increasingly grow, and we'll grow dependent on it. 

     So while governments have been trying to answer the same question, they haven't done so in the same way.  So over the past six months, together with Global Partners Digital, with Jackie here representing them, Chatham House has been looking at how governments are approaching the question of governing digital platforms. 

     And so we have surveyed over 55 pieces of key legislation across 50 geographies and looked into each of them against a set of pre-defined metrics.

     For example, here we have the example of Venezuela.  And we came up with 30 metrics organized around seven categories, which eventually led us to building this huge table of some of the things we could find and could not find.  Sorry, you don't want to see that table at the moment.  It's still a work in progress.  We're basically building a paper on that, and we're happy to share it once it's up.

     And while we have this giant table, it led us to the question of what platform regulation looks like worldwide.  Because obviously they are all doing it differently, but it doesn't mean that there is no pattern to observe.

     In fact, against the set of metrics that we have, we have seen, for example, that the average regulatory regime tackles online harm by fining platforms (71% of them) or blocking them (75%) for failing to remove content classed as illegal in the country once the platform has been notified. 

     So you can see here that there is a sort of difference between being notified or not notified.  And then there are also provisions in terms of content that is not illegal per se but is in some way still seen as harmful, ranging from pornography, for example, to self-harm content.  In short, yeah, it's harmful, but it's also legal. 

     And so this led us to the question, in light of these patterns, of what types of regulatory regimes are out there.  And so we have seen that, again, countries are regulating digital platforms differently.  So you can see the little dots here; we played around a bit with the colors and started bringing countries together.  Again, we are writing a report on this, and we are happy to share the paper once it is up there.  Feel free to reach out to us; myself, Jackie, or anyone on the team will be happy to talk to you about that on an ad hoc basis. 

     So the first group that we've identified -- so first, before I move to the groups, I have to show you this really fancy thing that Alex and Jackie built.  But again, it's still a work in progress and it's still really dynamic.  So, of course, if you want to reach out to us to provide some feedback or insights on your region or your country, feel free to do so.

     So the first group that we've identified are those countries that we call ones with independent regulation.  What I mean here is that regulation of digital platforms in these countries is made by an independent authority, there is an explicit mention of freedom of expression, and there are limitations in place on the enforcement of the regulator's powers, in line with freedom of expression safeguards.

     And the second group that we have is what we call the firm regimes.  So in the case of these regulations, we've seen that all of them open the door to prison sentences for individual platform employees, and that the regulations have provisions surrounding content that may not be illegal per se but could be damaging to social order.

     And the third group that we have here is the group of countries focusing on proactive monitoring.  For the countries within this group, all five regimes included have requirements for proactive content moderation, and individual employees of platforms may be held liable for failing to moderate content. 

     All of them also contain provisions around content that may not be illegal per se but could be damaging to social order, like the previous group that we have seen.  And sanctions include blocking and restricting access to content or platforms, but not all of them extend to prison sentences.

     Again, I want to reiterate the fact that even though we have all of these groupings based on patterns and commonalities, it doesn't mean that they are uniform in implementation.  It depends on a lot of factors, which leads to the final part of my presentation: what is missing from this data?  Of course, it is fancy to see the graph and the data points, but they are not oracles.

     They provide you with insights, but that is not everything.  So what is missing?  One of the key things that I think we need to reflect on a bit more is the local democratic context in each country.  How are regulatory instruments developed, translated, and applied in light of the local democratic context, both at the national level but also at the regional level?

     So very quickly, we can point, for example, at two specific countries of interest, India and Bangladesh.  Both explicitly address freedom of expression in platform requirements, demand user empowerment through transparency and redress, and require human rights due diligence and reporting on content moderation systems. 

     But obviously the context in each of these countries is different, and so naturally the way these requirements are interpreted and implemented would differ.

     So before we move on to the brilliant panel that we have here, I would like you all to reflect a little bit on three quick key questions as the panelists present digital platform regulations from their respective regions and countries.

     So first of all, how would you think about local democratic context?  And second of all, what does a human rights based approach to regulation mean to you?  And third, are these measurable through reviewing legislation? 

     So on that note, I would like now to call on the first of our brilliant speakers who may help us provide some answers to the questions, Jackie, policy lead at Global Partners Digital, who will be sharing some insights from the European perspective.  Over to you.

     >> JACQUELINE ROWE: Great.  Thank you, Yasmin.  And thank you for having me on this panel today.  It's really exciting to see some of our kind of initial stages of research presented and to get some feedback from you all here today about how we might take this further.

     So I will focus my -- some brief remarks just on looking at some trends that we have seen in platform regulation in the region of Europe.  And by that I don't just mean the European Union but also individual nation states, both within the EU bloc and outside of it, in that region. 

     And I think by looking a little bit at what some of those laws around platform regulation entail, we can get a bit of a sense of the trends and the drivers in that particular region.  And then we'll hear from other panelists on other regions shortly. 

     So, to state the obvious, one of the major kind of wins for regulatory convergence when it comes to platform regulation this year has been the passing of the European Union Digital Services Act, which I'm sure many of us have been looking at in detail for a long time. 

     It has been in force now as of the 16th of November.  But just to provide a quick update or refresher: the Digital Services Act updates the previous e-commerce directive and provides much clearer rules around how platforms should be addressing illegal content that's shared by their users, and exactly when they can be held liable for such content if they haven't acted on notice and takedown mechanisms. 

     The DSA also requires platforms to have clear terms of service and redress systems in place for their users, requires them to publish transparency reports, and, crucially, it places extra obligations on the very largest platforms with regard to human rights due diligence and assessing the systemic risks that their platforms pose. 

     It is interesting, Yasmin was just picking up on the regulation point, or how to enforce the regulation.  And the DSA imposes this quite interesting framework of regulators whereby each Member State will appoint a national regulator, a digital services coordinator.  But then at a higher level, the European Commission itself will be the one enforcing the requirements relating to the largest platforms. 

     So we have this two-tiered approach of independent regulators enforcing the legislation which we'll maybe come back to a little bit throughout the discussion.

     And, you know, no regulation is perfect, and it remains to be seen how this will work in practice.  I'm sure there will be improvements to be made to the DSA, but I can say that, of all of these 55 pieces of legislation that we reviewed that place regulations around how platforms moderate content, it very carefully calibrates the risks that services pose with the need for -- yeah, it is very nuanced.  We could have a whole discussion on the DSA, but I will move on to what is happening also at the national level. 

     And in this dataset we looked at some proposals, or laws that are already in force, from Germany, France, Austria, Poland and Ireland, which are all EU Member States.  And we also looked at proposals from the UK, Albania, Belarus, Russia, and Turkiye, which are not EU Member States.

     Now, while I won't go into detail on each of those regulatory regimes within the five minutes I have here, through this database exercise we can draw some top-level similarities and differences with the DSA. 

     So, for example, most of those national level platform regulations also include clear requirements around notice and takedown procedures although the mechanisms differ a bit between countries.  Most of them require platforms to have some kind of user complaints mechanism in place.  And some of them also require appeals mechanisms as well where users can appeal platform content moderation decisions. 

     Most of those national level laws also require of platforms a certain degree of transparency with their users, and also with regulators, about how they are moderating prohibited content. 

     And importantly, 60% of those national level proposals also differentiate platforms by size.  So we are seeing this graded approach that the largest platforms should have the most responsibility when it comes to managing risks that their services pose.

     So there are some similarities, there are also some key differences.  Of those national level laws, nearly half of them impose strict time limits on platforms removing content once they have been notified either by a user or by a regulator or when ordered to do so by a Court.  And that is not something that the DSA has explicitly done. 

     Also, nearly a third of those national level proposals within the European regions require -- actually require platforms to proactively monitor for prohibited content in some way.  This might only be specific types of content or specific circumstances, but nearly a third include some kind of requirement to that end.

     Before I pass on to the next panelist, I will just pull out a couple of outliers within the European region.  It's no surprise to anyone that kind of Russia and Belarus and Turkiye's proposals don't really align with the DSA so strongly. 

     Those proposals include quite vaguely worded restrictions on quite politicized and subjective content types.  So they include things like prohibitions on the sharing of obscene content or content which offends the values of the State. 

     So much broader and more sweeping content categories.  They also impose -- as Yasmin kind of mentioned briefly with respect to Russia and Belarus, they impose criminal sanctions on individual platform employees for failing to comply with regulators' demands. 

     Yasmin's looking at me to hurry up.  So that's an outlier group where there's human rights concerns. 

     And then another outlier group would be the UK and Ireland regimes which align more with the approach taken by regulators in New Zealand and Australia and to an extent Canada. And we can maybe go into that a bit more in the discussion. 

     So I hope that provides a helpful overview of some of the things we can see from this data.  And I'm looking forward to hearing reflections from the other panelists as well.

     >> MODERATOR: Thank you, Jackie.  I have been told there are some audio issues for those joining us on Zoom.  So while perhaps the technicians look into it, may I please ask speakers to slow down a little bit. 

     I was also guilty of speaking too quickly.  But thank you, Jackie, for keeping up with the time.  And I was particularly struck, well, not really surprised, by the outliers that you mentioned in the European region, not the EU region, because when we talk about Europe and the European Union context, we largely talk about harmonization.  But the fact that we have the outliers of Russia, Belarus and Turkiye just shows how the larger context in each of the countries can affect regulation.  I thought that was a really good point, and perhaps we can reflect a bit more on that. 

     On that note, Juan Carlos, if I may, I'll give the floor to you for some insights on Latin America.

     >> JUAN CARLOS: Yes, thank you, Yasmin.  Can you hear me well?  Thank you. 

     First of all, thank you on behalf of Derechos Digitales for the opportunity to be a part of this session.  I will try to slow down as well. 

     As an organization that works in the intersection of human rights and digital technologies in Latin America, our experience is vastly different than that of the European region.  However, we can probably find some commonalities with other regions as well.

     But I will focus on several trends.  It is important to note that in Latin America there is no regional body that produces directives or regulation for the countries.  Therefore, we find very different approaches among the countries in the region, and both laws and attempts at legislation have been scattershot, without much harmonization. 

     That doesn't preclude some expectation of better and more aligned standards for the region in this area.  But so far we have not seen efforts in that direction from the governments themselves.

     So with regard to which narratives have guided this very diverse approach, one very common narrative has been that there are these big, powerful tech giants with outsized influence on social interactions and on public debates.  Regardless of which platforms they are speaking about, the diversity of platforms or of their services and user experiences is not really taken into account.

     On the flip side, it is also common to find perceptions and dystopian narratives around the internet, with the platforms as cesspools of filth and harm and misinformation and abuse by other people, which would require governmental action of some kind to induce platforms to do something. 

     These narratives are used and pushed, mostly by ambitious politicians, as the basis to introduce legislation that would somehow control platforms or make them accountable.  And it's common to see that in the past few years legislative initiatives have been strongly linked to very specific figures in Congress in several countries in the region.

     But speaking directly about the content of those regulatory initiatives, and also acknowledging what you were showing, Yasmin, on laws that have been passed in places like Brazil and Chile and Venezuela, probably extremes of what this region shows, at least in the previous decade there was a recognition by the courts and by some legal systems of the need for liability protection for third-party content in favor of platforms, with liability arising in cases of actual knowledge of the illegality of the content.

     This happened in a broader sense in the Brazilian law, in copyright law in Chile and Costa Rica, and also for services in Paraguay.  However, after the panics of the last several years, the focus has been not so much on an idea of the internet or on copyright law but rather on the harms of social media.  And it is common to see that new regulatory initiatives are focused on social networks, speaking mostly about digital social networks.

     There have been proposals for stronger controls over which kinds of materials present in networks could be harmful to children, as there have been initiatives in Colombia and Peru to regulate online behavior.  There is also one in Peru which mirrors the Venezuelan law against hatred, and also the Nicaraguan law on cybersecurity, which strongly penalized certain behaviors and certain expressions online. 

     But there have also been broader efforts or broader proposals to regulate online platforms, including through establishing new obligations of transparency, redress mechanisms, sanctions in the form of fines, and also proposals to block or restrict the functioning of platforms in certain countries.

     Those have not reached the status of law.  However, they have been proposed in the region in places like Chile itself, in Colombia, and in Peru in a copy of the Colombian bill.  One was also proposed very publicly by a Senator in Mexico a few years ago, which was not presented in Congress after a public outcry over how strong the measures in that proposal were. 

     But whether we will see those become law in the next few years is, of course, subject to speculation.  Yet there is a very strong push from the outside for these bills not to pass.

     And I think that links to a final point on rules at the regional level.  The Inter-American human rights system provides very strong protections for freedom of expression and access to information, although to reach that status it has been tested several times in the Inter-American Court of Human Rights.  So whenever these new initiatives have taken place, it has been mostly the efforts of civil society and academia that have pointed out that the efforts do not conform, or are not adequate, to international human rights law and standards.

     So, trying to address some of the final questions in your presentation, Yasmin, what we would expect is that any consensus at the regional level on how to regulate platforms, which are often located outside the region, keeps the shared human rights system in view, that it adequately considers how different our national regulatory environments are, and also that its processes are transparent and participatory, to achieve better outcomes.  Thank you.

     >> MODERATOR: Thank you.  And thank you for respecting the time limit again.  I'm really happy about the panelists today.

     So, you know, I found it really interesting that you mentioned the greater focus these past few years on the harms of social media over the bigger picture of the internet.  I thought that was really interesting, and perhaps in the wider discussion you can elaborate on why that is.

     And, you know, the point you made on the Inter-American human rights system is also really interesting, and obviously that is something that I hope our audience will ask questions about as well. 

     So on that note, after Latin America we are going to travel to Africa.  Gbenga, over to you.

     >> GBENGA SESAN: Thankfully it's a short flight, in quotes.

     Interestingly enough, 18 years ago, in this same venue at the ECA, the focus of many African countries was what we called the African Information Society Initiative and National Information and Communication Infrastructure plans. 

     I had the chance to be the Vice-Chair of the African Technical Advisory committee then.  And it was very promising, very promising that while there were other regions that were talking about, you know, regulation and things one thing that many African countries focused on was increasing access and what was probably known as ICT for D, ICT for development, how would you get computers into schools and all that.  Well, things changed.

     And if you ask yourself why did that change, it is very simple.  Because the politics was very different from the reality online.  There were not many people online.  You had the Prime Minister of Ethiopia mention it going from 11 million to 13 million.  In Nigeria that number jumped from 200,000 to 11 million, and from 11 million to 86 million. 

     When that happens, then you start paying attention to whatever happens in that particular space.  And unfortunately, what then happened was -- and you can't divorce offline policies and practices from what happens in the digital space -- because the environment offline wasn't one that was exactly human rights respecting, that was transported to the digital space.  And so when you begin to talk about conversations around regulation in many African countries -- I work at Paradigm Initiative, and we have been doing reports on African countries for the last 8-10 years -- the one thing that we have been talking about for the last two years is the fact that regulation in many African countries isn't focused on regulation in terms of standards. 

     The definition of regulation is actually control.  And there is a big difference between control and standards.  The problem with control is that it protects the state and agents of the state.  The good thing about standards is that they protect everyone.  That is why you have scenarios where people who work in intelligence in some of those countries, and I can give a few examples, propose these policies that protect themselves.  And then when they lose power, they become victims of their own proposals.  Which is great, by the way, because it then empowers us to be able to say in conversations that if you don't create standards that protect everyone, you will become a victim of your own failures. 

     And that is the reason why there is increased attention being paid; you'll notice that many African countries in the last 18 months have had proposals on online content moderation. 

     In fact, in many countries, platforms were literally forced to register in the country.  One of the major reasons for that is that the last stand of civic space is the digital space.  In many cases I can't go on TV or put in newspapers some of the things I would say.  I have had many colleagues who were talking to a radio station, and then the radio station says, okay, we can't continue, because we will lose our license. 

     There are countries that we reported on where telecom companies had their licenses withheld, not renewed, because they did not comply immediately with a phone call that said shut it down.  In fact, the last time that Nigeria shut down Twitter, it was just a memo.  The Courts were not open, so there was no legal order; it was just a memo. 

     And this is one of the dangerous trends that we see.  COVID is bad news in any way, in every way, but we were hoping that the fact that trust was going to be at the center of all the conversations then would allow many governments to pay attention to the need to create standards and not to focus on control. 

     But many emergency laws were created during COVID, and we warned a lot that they were going to become the new normal.  And unfortunately they are now the new normal, not just in Africa but in many other places and countries where emergency laws have literally now become the new normal.

     Interestingly enough, platforms have not helped too much.  And I'm glad that this is a conversation that also includes platforms.  If you call yourself a global company, then you must understand the global context.  Unfortunately, many platforms are just American companies trying to operate globally, and because of that, context is missing in many cases.  I mean, I can give many examples of scenarios where we -- I mean, we actually do work with platforms and really tell them, I mean, really, you screwed up.  And unfortunately what happens in that case is governments then use that as an excuse not to create standards but to try to impose control.

     You know, platforms played into the hands of India, played into the hands of Nigeria, and will continue to play into the hands of many other countries if they're not careful.  But I want to end with good news.  I need some good news myself.

     The biggest bit of news that I think this trend has led to is that citizens have since realized that the people that we pay taxes to to protect us are not really going to do it, so we have to protect ourselves.  From Togo to Nigeria to Zambia to many other countries, I'm glad to see citizens pushing back.

     Togo shut down the internet, citizens went to court, and they won -- at the ECOWAS Court.  You will see the trend that people don't usually win at local courts, and there is a reason for that, you know.

     Same thing in Nigeria.  The Twitter ban has been declared illegal, but by the ECOWAS Court, not a local court.  And I trust that incidents including the ongoing Ethiopian shutdown in Tigray and others will be challenged by citizens.  And I think this is good news, because it then means we are forcing governments, platforms and every other stakeholder into a conversation about creating standards and not imposing control.

     >> MODERATOR: Thank you.  I really liked the differentiation that you made between control and standards.  I thought it was an interesting perspective that I think we can all learn from and reflect on.  And also the power of the citizens to protect ourselves, as you said. 

     So on that note, we are going to do our last run to South Asia.  I'm going to hand the floor first to Usama, who is here with us, on the Pakistani perspective.  And then I'm going to hand over to Aman, who is online and will be joining us from cyberspace. 

     >> USAMA KHILJI: Thank you, Yasmin.  And thank you to Chatham House and Global Partners Digital for putting this panel together.  I think it is an excellent design, getting perspectives and putting them into context. 

     So I will speak a bit about, you know, the situation in Pakistan and largely in South Asia.  But when Gbenga was speaking, you could have just replaced the country's name and the context would literally be the same.  So that is quite interesting. 

     There are, of course, a few differences.  But on the perceptions and narratives around regulating online content in Pakistan, I will take a step back and talk about how, essentially, there is an effort to exert control. 

     And it goes back into history: if you look at it historically, Pakistan before independence was a British colony.  Before that, it was ruled by, you know, royal dynasties.  Our penal code is still the penal code from 1860 that was written by the British.  And it's the same in Bangladesh and in India.  So a lot of our laws were designed by colonial powers that were trying to control subjects in their colony. 

     And the state sort of inherited those laws and didn't change them much if you look at the penal code today.  So you still have concepts such as sedition.  You still have concepts such as treason.  And all of these are quite, you know, omnipresent in the region.  So in India you also have the issue with sedition where dissent is considered going against the state.  Even though there are constitutional protections for freedom of speech. 

     There is a similar trend in the region, and the entire, you know, narrative of control is pretty much there, where the state tries to control what citizens say online.  And, you know, like Gbenga also said, I wrote a column called The Last Fortress, and it was really about social media being the last fortress where people are able to express their views and criticize state policies, like they are entitled to as taxpaying and voting citizens. 

     So back to history.  Since 1947 we've had around three to four bouts of military dictatorship.  And in those we've also seen a lot of control.  It was really in 2008 -- since 2008 we've had democratically elected governments in Pakistan until now.  It is the longest stretch of civilian governments in place. 

     And I think that has been good news because what we've seen is that yes, the state has tried to exert a lot of control, and it continues to, but the institutions are strong such as the judiciary, the media, and civil society pressure. 

     And in that sense these actors are able to exert influence or have dialogue.  The parliamentary committees are open to listening to citizens who are able to call public hearings on laws and issues. So in that sense regulation has been -- there has been a very, very deep conversation around it.

     So I would -- so in Pakistan we had the Prevention of Electronic Crimes Act that was passed in 2016.  So notice the word crimes.  And it is under this law that platform regulation is done.  So there is Section 37 of the law.  And Section 37 deals with unlawful online content.

     The way it is worded is that it takes the provisos from Article 19, which guarantees freedom of speech and then says "except for," with a few conditions.  So that language is picked up and put in Section 37.

     So essentially what it does is it gives the regulator the right to interpret constitutional language which obviously gives, you know, issues of due process and issues of accountability where the regulator tends to play the role of judge, jury, and executioner when it comes to online content moderation.

     Again, the language is based on the restrictions in freedom of speech rather than the safeguards and guarantees.  And under Section 37 is where the rules are made for regulating social media.

     So these were formulated in 2020, during COVID.  And it was really -- so when the Prevention of Electronic Crimes Act was being debated, it was done in parliament, and even that was not a very inclusive process.  We were able to speak in the parliamentary committees and try to propose amendments and point out what the issues would be.  And like Gbenga also mentioned with Nigeria, the government at that time was like, no, this is necessary. 

     And when, two years later, they were not in government, their leaders were being charged under the law that they had fought for, by the former opposition that had opposed the law.  Parties oppose it when they are in opposition, but when they are in government, they keep using that law, right? 

     So there is that, you know, karma that really comes around.  So when you look at the rules: in 2020, all of a sudden, we found that social media rules had been notified. 

     And there was no consultation.  And they were called protection against online harms but there was only protection of the state in that, you know, large thing that you saw.

     There was lip service to a lot of protections for, you know, citizens.  But the way we see it play out -- and I will show you the evidence that we gathered on how that is done -- the main focus is on controlling dissent online.

     So the groups that are most vocal on decisions around digital governance in the region, I would say, are civil society, and I think the judiciary has also become active.  The regulator is very active and political parties are very active. 

     So what we really saw was the online harms rules first.  Then there was a lot of public outcry against them, so the government said, okay, we're going to have multi-stakeholder consultations and we're going to revise them. 

     There were some consultations, but they weren't meaningful, because the feedback was not really taken into account.  And then a new draft was passed.  So then we had another version -- so then they stopped pretending it was against online harms, and they called it rules for removal and blocking of online content, right? 

     Which was like, okay, at least there is honesty around what this really is.  And then when those were passed, they were again giving too much power to the regulator.  They included fines on intermediaries, and they also included stipulations for blocking an entire platform that does not comply with government requests. 

     So in the end what we have seen, and that is the best news, Yasmin, sorry, is that the Courts stepped in again this year and said this is prima facie unconstitutional and asked the government and the parliament to revise the rules. 

     So we are in that process and hopefully something good will come out of it.  But I think the biggest takeaway from this is the multi-stakeholder advocacy where lawyer groups from the Supreme Court and the Pakistan Bar Council, where journalist bodies such as the Pakistan Federal Union of Journalists, civil society actors and digital rights and human rights space all got together and did advocacy together.  Did petitions in courts, did hearings in parliament.  And through that we have had receptiveness from the state.  Thank you.

     >> MODERATOR:  Thank you.  And I'm sorry to rush you.  But the presentation was really interesting and --

     >> USAMA KHILJI: There is so much more.

     >> MODERATOR: We can talk about it later.  But I liked how you started your presentation with, you know, the heritage of colonial powers and institutions, but then ended on a good note about the power of civil society.  I thought it was a really good transition. 

     So now we have the last presenter with a regional perspective joining us online from India.  Aman, over to you.

     >> AMAN NAIR: Thanks, Yasmin.  Sorry.  Can you hear me?  Okay.  Cool. 

     Yeah, so if I had done this presentation even a month and a half ago it would have been totally different.  If you aren't aware, it is an incredibly interesting time to be a tech policy researcher in India.

     Again, historically I would have echoed a lot of the same sentiments that panelists before me have about some of the narratives around platform governance in India. 

     But in the last month and a half the State has passed or has attempted to pass draft legislation that will fundamentally alter the nature of the internet in India and sort of reverse many of the strategies that it has adopted.

    But before I get to that maybe I could start with a brief overview of some of the features and narratives that have historically existed when discussing platform governance in India. 

     So the first one and the one that is most vocally sort of proclaimed by researchers in the space and civil society is that platform governance by the State has been incredibly reactionary, incremental, and noncohesive. 

     If you look at the ambit of legislation that has been passed, laws are consistently passed by varying ministries, with varying definitions that tend to try and explain the same phenomenon.  And no one really has a clear understanding of what rules apply where, or at best a murky understanding. 

     And when examining the intentions of the State in passing these regulations, you see two clear trends emerge.  One, the State has a subtly protectionist, or maybe not so subtly protectionist, attitude towards platforms, with a constant distrust of platforms that aren't Indian, viewing them as forces of opposing states, or as forces of a sort of modern digital colonialism.

     And as a result of that, it sees regulation as a means of exerting sovereignty over these platforms.  And the State tends to focus on three main types of platforms: social media platforms, e-commerce platforms, and also government and civic tech platforms.  So historically, platform governance in India has been under the control of one major act, which is the Information Technology Act, passed in the year 2000. 

     It is the overarching act which, while initially conceived as an e-commerce law, evolved to cover platforms on the internet as a whole.  Under this act, there are multitudes of rules that have been passed, and various secondary legislations that seek to regulate various facets of platforms. 

     It is where you find your rules on intermediary guidelines, where you find your rules on the processing of data by platforms and whatnot.  More recently you have seen consumer protection rules attempt to enter into platform regulation with the Consumer Protection Authority now looking to enforce rules that e-commerce platforms have to follow. 

     And all of this has happened over sort of the last decade or so.  But in the last month the State has introduced two new draft legislations that set the country on a worrying path forward.  The first of these is the draft telecom bill and the second is the draft digital personal data protection bill.

     So the draft telecom bill was released about a month and a half ago.  And as it stands right now, it could fundamentally alter the nature of platforms and the way individuals in India interact with the internet. 

     So for a little bit of context, it replaces a prior telecom law which required telecom operators to obtain a license to be able to operate infrastructure within India.

     In the interim, with the rise of what we call over-the-top platforms in India, your Netflixes and so on, telecom operators have pushed back against this licensing regime by saying that platforms such as Netflix have an undue advantage. 

     And so the State has responded to that by now imposing licensing requirements on over-the-top platforms and on any platform that has a messaging service on the internet.  And the State hasn't allowed for any sort of limitations or restrictions in terms of size.  So you could theoretically start any business which has a customer help line, any business that allows two individuals to communicate any information between each other, and you would need a license from the State.  That flies in the face of the open and free internet that, you know, the State has historically espoused as a principle. 

     On the draft personal data protection bill: the bill does away with a lot of protections that individuals have as pertains to data rights and, more worryingly, expands the surveillance regime in India by giving the State wide-sweeping exemptions when it comes to civil liberties and civil rights.  Most notably, the Supreme Court of India has laid down a few tests that must be applied when dealing with a curtailing of the right to privacy.  Those are done away with in this law. 

     And so these two bills alone, while not necessarily dealing with platform regulation directly in the same way that historical bills like the IT Act have, have indirect effects that change the way that individuals interact with platforms.  They change the way that platforms can operate.  And all of this, again, is happening in the background of another bill that the State is looking to pass, which is the Digital India Act, which is still in the works. 

     So from the Indian context, it is an incredibly uncertain time to be discussing platform regulation, simply because no one is clearly aware of the strategy that the State wants to adopt.  And every time we get a bit of information, it seems bleak at best.  So yeah, with that sort of depressing message I will end.

     >> MODERATOR: Thank you, Aman.  I found it really interesting that you said that if we had this presentation one month ago it would have been completely different. 

     And I would be curious to hear, in our open discussion later perhaps, how civil society is reacting to these two particular bills that you have mentioned, because I feel like there are a lot of implications for the power balance between government, civil society, and the population.  So I would be curious to hear your insights on that. 

     On that note, after we have done this travel around the world, I will be handing over the floor to Meg, who is joining us from Meta and will be providing some insights from the industry perspective.

     >> MEG CHANG: Hi, everyone, thanks for having me on this.  Always a pleasure to be part of these discussions because it is a huge learning experience for us at Meta whenever we get to hear what other stakeholders are thinking about when it comes to content regulation. 

     So I lead Meta's Content Regulation Policy team for the Asia-Pacific region.  And so a big part of the work that I do is participating in conferences like this, as well as consultations with different stakeholders, to help inform our thinking about content regulation. 

     And what I think is very interesting with all of the speakers so far is that it highlights how challenging it is for a company like Meta to navigate the global regulatory landscape, especially when it comes to the area of content regulation. 

     And I think it's pretty obvious like some of the reasons why, but I'll highlight some of that.  But then also like highlight how we're trying to navigate some of these challenges and tensions across all of the different countries around the world.

     So, you know, it's already been highlighted that legal environments and speech norms vary.  All of us, especially the global platforms, have a global user base.  So we have a broad spectrum of views about what expression should be permitted online and what shouldn't.  And then we have a lot of different stakeholders as well that have their own vested interests in terms of what they believe should be expressed online or not. 

     And so having to navigate that requires us to really think through the different tradeoffs and what kind of standards to set.  Because, like Gbenga mentioned about standards, that is a big part of our thinking: how do we establish some set of global standards that we can operate under that could at least address the most common and concerning issues for society around the globe? 

     Another aspect that I want to highlight in terms of the challenges is that technology and speech are dynamic.  There is a broad range of services and products; it's not just social media but other types of platforms on which free expression is also exercised.  So how do you regulate all of the different ranges of products and services and technologies out there, and also the emerging technologies? 

     But to add to that complexity, there are the different types of communication and expression that take place, whether in the form of text or images or video or whatever other types of medium that will come to light in the future.  How do you navigate all of that with regulation? 

     And then finally, which points to one of the challenges that we had, and which was one of the reasons why I was asked to speak: enforcement.  How do you then apply and implement these laws?  How do you enforce them?  The one thing I want to say is that, as a result of all of these evolving as well as competing interests and norms and dynamics, enforcement will always be imperfect depending on where you are sitting. 

     So, you know, misinformation is a very good example: one person's misinformation could be another person's opinion.  So in terms of enforcement, it is always going to be imperfect in some way to someone, even with there being global standards, given the dynamic and value-based nature of speech.

     So given all of this, like how are we then trying to navigate all of these different complexities, all of these different tensions and interests across the internet, across different countries and different cultures and mediums? 

     So back in 2020 we did publish a white paper about how to chart a way forward for online content regulation, in which we outlined a set of principles. 

     And these principles are derived from our own experiences of content moderation as a global platform.  They are also derived from the regular consultations that we have with different stakeholders, trusted partners, and legal and safety experts from around the globe, which inform our policies and the rules and the terms of service that we have on our platform, which then also inform how we enforce. 

     So even with all of the different regulations around the globe, the one thing that we are always trying to do is look at the intention of all of these different regulations.  Is it trying to address safety?  Is it trying to address harmful content?  We look at it from a global lens, find some commonality across the different regulations and concerns and regulatory intents, and then try to come up with some global principles and standards to guide how we operate our policies and our enforcement.

     So, the corporate human rights policy -- because I think one of the questions that Yasmin originally asked at the start was: what does a human rights based approach to regulation mean? 

     At the heart of how we operate, we have a corporate human rights policy that shapes how we look at product design and product governance.  And that is kind of the baseline for how we approach everything, from our own product design and development and launches, as well as our rules and policies, all the way to how we even look at complying with regulation in the first place -- how we implement and comply with regulation.  Which I think then goes to some of the points that have been addressed in terms of State overreach and how you address those issues as a company.

     And so I think that is a really key part.  And I think this is why a forum like this is so important for a company like Meta.  Because, going back to what was being said about standards, what we are hoping, by having more of these conversations with stakeholders like yourselves or with intergovernmental organizations, is to come up with some sort of global framework and standards. 

     And one of the things that industry as a whole, and not just Meta, is trying to contribute to this conversation is the Digital Trust and Safety Partnership, which was launched, I think, a year and a half ago, to try to develop, at least from an industry standpoint, some best practice standards for trust and safety that can be implemented globally.  And hopefully that can then inform a lot of the discussions that we're having here today as well. 

     On that, I will leave it there and hopefully we will be able to have further conversations on this.  Thank you so much.

     >> MODERATOR: Thank you, Meg.  I thought that was a really interesting perspective that you shared.  And obviously, I mean, I appreciate that it must be really challenging to navigate the entire landscape of regulations.  Because even within a region I think no one can agree with each other, so I can imagine what it is like as a global undertaking. 

     So on that note, thank you so much to all of the panelists.  And now, in addition to the Q&A, I would actually invite participants both online and in person to share their insights on, first of all, what forms of online platform regulation are emerging in your part of the world?  And in what ways do they diverge from those that have been shared today in our discussions?

     Second, what risks do these policies pose to an open and interoperable internet, as well as to human rights? 

     And third, how can these risks be mitigated and what opportunities are there for encouraging harmonization and consensus? 

     I would also invite you to think about it from a multi-stakeholder perspective, you know, thinking about it from the government perspective, the civil society perspective, and at all levels.  I think that will lead us to a really interesting discussion.  So I see that we have one hand raised online.  Izaan, I will hand over to you.

     >> IZAAN KHAN: Can you hear me?  Perfect.  Thank you for a very interesting discussion.

     One of the kinds of regulation, or at least one of the things being imposed in a number of different countries all across the world -- we have instances of this in India, Indonesia, Viet Nam, places like Brazil, some countries in Africa, and increasingly in the western democratic world, for example proposals like this in the UK -- is what we call hostage-taking laws.  These are essentially laws under which social media platforms like Meta, for example, or Twitter need to have a government liaison officer or grievance officer who will respond to requests to take down content, at the threat of criminal sanctions and essentially prison for noncompliance.

     And this is a very worrying trend that has major potential to restrict the open and free internet, if you consider that some of these laws have the potential for extraterritoriality. 

     So it's not just that the content should be taken down for that jurisdiction alone, but globally.  So we risk having a kind of internet that is not fragmented, as one of the main themes of this forum is, but one where the standard of human rights across the board has fallen. 

     We have seen this increasingly being applied to content for individuals and organizations that are outside of that particular country as well.

     And so one of the questions that I really wanted to ask the panel is, in such instances where we have these sorts of unilateral measures constantly being taken, what can we as, you know, civil society organizations, and within this multi-stakeholder process, actually do about this trend? 

     So the combination of extraterritoriality and hostage-taking laws is one of the very, very big threats when it comes to platform and content regulation. 

     And I look forward to hearing, you know, potential responses about what we can do to challenge these sort of, you know, developments that are taking place.  Thank you.

     >> MODERATOR: Thank you.  That was a really interesting perspective.  And I must admit that, personally, extraterritoriality is something that I sometimes overlook.  I'm guilty, even though I'm a lawyer, so it's not really a good thing.  But thank you for raising it.  And I would be keen on hearing more of your perspectives here in this room, if you have anything to share about it. 

     I see that there is a hand raised at the back of the room.  Perhaps you can go towards the microphone and just press the button.

     >> AUDIENCE: Thank you very much.  Can you hear me?  My name is John Amal (phonetic).  I work for the African Telecommunications Union. 

     Sometimes, rather unfortunately, I tend to think that we speak to ourselves in this discourse.  And that for me tends to be the unfortunate bit of it.

     Governments are in the business of governance and sometimes we don't like it so much, but that's the reason for which they exist.  We may not agree with some of the policies or, indeed, most of their policies, but we elect them. 

     What I seem to see is that civil society -- and this is where I tend to want to see a little bit more -- civil society is speaking to itself.  And I would like to see in this forum government representatives also sharing their perspectives in terms of why they do the things they do.

     Because otherwise it becomes a government bashing forum rather than, you know, a sharing of perspectives.  Take, for example, what has come out through the gentleman from India; I see that a lot in the sort of work that I do.  The EU is trying to push back quite a bit on that in terms of platforms and the major European players that invest in networks, and most of us know this quite a bit.

     The networks are saying they have put quite some money in the networks and the platforms are riding on these networks and they want to see some sort of fairness in terms of the water that passes through the pipe and the volume of the pipe, the size of the pipe. 

     And I think that is a healthy debate that we shouldn't, you know, close our eyes to: what the networks in Europe are saying in terms of, you know, fairly sharing the proceeds of what goes through the pipes. 

     So I think there needs to be a little bit more openness.  I see governments as bad, but not as bad as we portray them.  And I don't see most of our governments now, especially in the developing world, saying that multi-stakeholder frameworks are evil or not right.  I see that quite a bit in the sort of work that I do.  Most of our governments realize that it is not the business -- it is not the monopoly -- of governments to govern, especially in the digital space.

     And so a lot more stakeholder involvement, it seems, is being brought on board in our policy-making processes.  So what I would love to see is how we can speak less to ourselves and speak more to all of the constituencies that are involved in the process, especially governments.  And I see that as the missing link sometimes.  Thank you.

     >> MODERATOR: Thank you for the fresh reminder.  I think --

     >> AUDIENCE: And just one more thing. 

     Africa seems to be the focus.  Africa is on the table being, you know, sliced.  There are so many initiatives about Africa that I have lost count.  Why is this the case?  The Europeans have theirs, the Japanese have theirs, the Chinese, the Indians, the Americans -- why is there so much focus on Africa?  There seems to me to be an attempt at standardizing Western values as homogenous and relevant to everyone.  Thank you.

     >> MODERATOR: Thank you.  And I guess it's a really good reminder that this forum is the perfect opportunity for us all to speak to each other and learn from one medium to another, but also from one stakeholder to another. 

     And I think it is a good way to spark the discussion.  And I see a lot of hands raised here.  So perhaps you can go first, and then the gentleman there.  And then I thought there was a hand there.  And then you will come next.  Thank you.

     >> AUDIENCE: Thank you so much for this panel.  My name is (?).  I work with the International Center for Not-for-Profit Law, where we analyze a lot of the laws that you have spoken about today against international human rights standards. 

     So it was quite interesting to hear; we usually get really in depth into these issues, so it is nice to hear the big picture and the trends that are happening.

     So we are definitely one of the civil society organizations living in the bubble that the last commenter spoke about.

     But I wanted to say that because of the topic of this session, it has perhaps been more government bashing, to quote the last speaker.  But many of the same of us are often in spaces where there is tech sector bashing instead.  And I think we would all agree that there are a lot of problems with the tech sector and social media platforms in general, especially those with an outsized influence on public debate.

     Meta being one of them.  And so with all the criticism that's coming from government actors against these platforms, we can agree that the approaches are not in line with human rights standards but that the narratives have truth to them.

     And that is why they resonate with members of government, including those who might be more aligned with civil society viewpoints, and with sectors of the population as well. 

     And I guess I just want to say that I'm from the United States, and we have a lot of problems there.  The recent whistleblower, Frances -- I can't remember her last name -- talked about how, whatever the problems are with Meta inside the U.S., they are much, much worse in other parts of the world.  And we have seen a much different approach taken by Meta.  So it was interesting to hear about the global principles, because however they are defined as a whole, they are not applied equally globally in practice. 

     And so we have seen money interests and political interests dominating these platforms quite a bit in terms of how their policies are applied at the local level.

     So my question is: in terms of supporting human rights standards within these laws, I would be curious to hear from the speakers, maybe those outside of the EU, on what recommendations they are offering, if any, to governments for how to deal with this issue in a rights-respecting way. 

     And what recommendations have been proposed by you all?  And is there any feedback -- do you see that resonating with some of the government officials that you are communicating with?  Thank you.

     >> MODERATOR: Thank you.  That's a really interesting question.  Perhaps we go to the gentleman first, and then afterwards we can have one or two speakers to answer the question that was asked.  Thank you.

     >> AUDIENCE: Thank you.  My name is Owen Bennett.  I work for Ofcom which is the independent communications regulator in the UK. 

     I will first say that this panel is precisely the reason why I love coming to the IGF, because it is so rare that you get to see so many different perspectives and comparative issues raised in one place.  So it has been tremendously informative for me. 

     If I may just give one comment and then ask a question building off it. 

     The comment I would have is that we have spoken a lot about trying to harmonize, or avoid divergence in, the way in which different jurisdictions or regions treat these issues. 

     To be honest, at least from our point of view at Ofcom, we don't really think that's going to be possible, or even in some respects desirable, because, as one of the previous speakers mentioned, there is always local context. 

     When I look at the duties that Ofcom has in the UK under the online safety bill, many of those are informed by some of the specific issues that have emerged in the UK around social media platforms over the last 10 years.  And they are going to be necessarily different from the ways in which other jurisdictions have experienced those issues. 

     That said, there are certainly a lot of ways in which we can collaborate, at least around regulatory toolboxes.  So one of the things that we are working on in the UK is trying to work with other regulators in other jurisdictions to say: okay, we might have substantially different laws, or we might care about different things, but to what extent can we align the tools that we are using? 

     So, for instance, increasingly many jurisdictions are using things like risk assessments, transparency reporting, or information notices.  And even though the legislation may be different, if we can somehow converge around shared understandings or shared ideas of what best practice looks like, then there is a good chance we can avoid some of the divergences. 

     But I do think, for instance, the speaker from Meta mentioned the need for global standards; I think that we are a long way from that.  And I think for the tech sector, global compliance is never going to be as easy as it has been. 

     The question I would have for the panel is this.  In the UK we have an online safety bill, and Ofcom is starting to implement that bill in some respects.  And we are engaged with many regulators around the world who are interested in understanding the approach and how it could be applied. 

     And obviously our approach is informed by the legal standards we have to adhere to; we are bound by the Human Rights Act.  The question I would have for the panelists is: when we are engaging with other regulators who are seeking to implement these regimes, how should we do that to make sure that the things that make these laws work and make them rights-protective also get transferred over, and aren't dropped in the process?  Thank you.

     >> MODERATOR:  Thank you.  That's a really good perspective that you are sharing.

     Perhaps one thing that I would throw onto the table for discussion is that, of course, standards on the global level might not be possible if negotiated by governments, because obviously there is a lot that people don't agree on, and we interpret things differently. 

     But how about standards that are led and initiated by industry, and especially big tech companies like Meta, where there is no such constraint in terms of government differences and the power politics behind them?  That would be something I'm keen on hearing your perspectives about. 

     I know that Juan Carlos said that he wanted to reply to some of the questions.  And then I will hand over to Usama and Gbenga. And then we'll go back through a round of questions and interventions.

     >> JUAN CARLOS: Thank you for the interventions; especially the last one gives perspective on some of the things we need in terms of mutual learning. 

     Even in countries where we do not necessarily have specific regulators for things like online platforms, I think the general principle is a good thing to learn. 

     I wish to respond to some of the things that were commented on and asked from the non-EU, non-U.S. perspective as well, understanding at the same time that the Latin American experience has very divergent starting points, even in terms of the political landscape that feeds into the regulatory discussions.

     But there were a few questions on: so what do we do?  How do we also connect with the narratives that do resonate with the population and with some of these decision makers?  I wish I had all of the answers for that, because then I would probably be sitting there actually providing the solutions instead of discussing them.

     But I think we need to acknowledge that the rest of the stakeholders, the nongovernmental stakeholders, also have very diverging views; there are diverging views even within civil society, and even within each region, on very specific things like: how does this operate?  What are the mechanisms to engage with the accountability of companies and platforms? 

     So those are the typical questions.  And because we do not necessarily have one single vision, one of the things that we have proposed many times, when we have seen these kinds of initiatives to regulate online platforms in a haphazard or broad way, is that what governments or legislatures need to do is open up processes of discussion.

     Basically, to bring this discussion to as many experts, invested stakeholders, and members of the general public as possible. 

     One of the things that was very hurtful was when we saw one platform bill presented in Chile a year or a couple of years ago -- it has been a long time -- and the Special Rapporteur for Freedom of Expression of the Inter-American Human Rights System was asked to come to a hearing in Congress.  Two things that he said were, one, that he could not vouch for the bill's adequacy to human rights standards in the Latin American region. 

     But second, and most importantly, he decried the fact that over 10 years of work developing human rights standards for online platforms in the Latin American region and the Inter-American system was not taken into account; the previous work by the Inter-American experts and their office was simply ignored. 

     So one thing to mention there is that there was convergence among the non-governmental, non-legislative stakeholders against this bill. 

     And even if, beyond our regular role as advocates and intermediaries of tech policy, explaining to the public what all of these rules and proposals mean, we cannot deliver the solutions ourselves, at least what we can propose is pathways that are as open, transparent, participatory, and evidence-based as possible, including by assessing what other countries and regions have done before, to see whether those solutions have been, on one hand, respectful of human rights standards and, on the other hand, effective against the things they want to control, or effective in making platforms accountable. 

     This is probably a very exploratory, experimental period in history for this very fast-moving subject.  However, going back to the standards that have been developed, I think principles of democracy and participation are key elements toward reaching a consensus that also has legitimacy with the public, which is expected as well.  Thank you.

     >> MODERATOR: Thank you.  Can I just say that with the pandemic we've all lost our notion of time. 

     We have just over 15 minutes left, and I would like to spare the last five minutes for the speakers to provide some quick-fire, last-minute conclusions at the end.  So if you can, please keep your interventions short. 

     And I will be ruthless from now on: if you speak for more than one minute, I will interrupt you.  Sorry about that.  So, Usama, back over to you.

     >> USAMA KHILJI: I feel like I'm back at Model United Nations.  Okay, so I will be quick. 

     Just one thing on government and platform relations.  The reason governments do clamp down more in the majority world is because platforms are failing.  Platforms are failing in moderating in local languages.  Platforms are failing in allocating resources to understand local contexts.  And platforms are failing in prioritizing the issues and conflicts that are taking place in the majority world. 

     And that creates a vacuum for the state to fill by bringing in these regulations.  And with budgets bigger than most governments', I'm sure the platforms would be able to manage this quite effectively if they prioritized it.  So I think that is one necessary thing. 

     The second thing I want to speak to is exchange between regulators, and I think it is great for regulators to learn from each other.  But I would implore regulators to learn from the citizens and the rights holders whose rights are consistently affected.  And I think that multi-stakeholder engagement for regulators is essential. 

     We have seen how laws brought in in Europe, such as the NetzDG in Germany and the Avia law in France, were cited by governments saying, oh, look, Germany's doing it, France is doing it, why can't we do it? 

     And in Germany you have a process whereby you can get rid of that law, but in other countries, authoritarian regimes or regimes that are leaning towards authoritarianism will bring those in and say, okay, the Europeans are doing it, so why are you disagreeing? 

     But you need to understand that, you know, the local contexts are different but also the rule of law environment is different.  So --

     >> MODERATOR:  Sorry --

     >> USAMA KHILJI: So copy-pasting stuff cannot work.  And that's it, I'm done.

     >> MODERATOR: Sorry, I was a Model UN kid so...Gbenga, over to you.

     >> GBENGA SESAN:  I tend to gloat when I hear governments complain that civil society is talking to itself, because for many years governments talked to themselves before they listened to civil society.  But we are better than that. 

     So the doors are open, and I'm glad that this is multi-stakeholder and we are talking to all stakeholders.  And to be honest, there are so many platforms and conversations on the continent that governments are invited to, and they ignore them. 

     But I'll talk to my brother afterwards and give him a list of the places where governments refused to go, so we can have the conversation about governments also being in the room. 

     Why is Africa on the table?  For one, we are in Africa right now.  And it is an African country that, at an Internet Governance Forum, prevented people from taking in their phones.  This is an IGF, an Internet Governance Forum.  Why shouldn't I take my phone into an opening ceremony, because one person is scared that they may want to attack him?  But it's all right.

     Ofcom, I just need ten seconds on what you said.  The biggest problem in translating expertise is hypocrisy.  When you have struggled but you come into the room and pontificate, saying, oh, this is what you do, dictators use what you have done as an excuse.  And like Usama said, when you correct it, the struggles are not communicated; you come into the room and give examples of what works.  And I think we need to be more open about the struggles.  What were the struggles you had with your process?  How did civil society react to the online safety bill, and what were the conversations, the pushback, and all of that?  Let that pushback be part of the conversation. 

     It doesn't help when you come into the room and say, this is how we did it in the UK, because what you do is hide the entire path where you struggled to get to where you are.  I think the need is to move away from pontification and east versus west; let's come to the table and have honest and difficult conversations.  We are all struggling.

     >> MODERATOR: Thank you.  Sorry.  I have this really bad tendency of unmuting myself; I think it comes from three years of online meetings. 

     So we're going to go back to the interventions.  I promised the gentleman.  One minute, please.

     >> AUDIENCE: Just to reiterate: the reason why government isn't in the room -- and there are government people in the room -- is because they choose not to be.  It's that simple, so they can't complain about it. 

     The thing that I picked up in the initial presentation and most of the discussion as we continued is that we are looking at regulation of platforms. 

     Historically, part of how the internet grew the way it did was that there was a notion of safe harbor: treating network operators and service providers as not being publishers unless they were given notification. 

     So the entire premise was there is a legal safe harbor.  If the content is bad and you know that it is bad, you have to act or else you lose your safe harbor. 

     The hostage taking -- and I think that's the perfect term for what is coming up in some of these laws -- actually puts the responsibility onto the platform.  It is the diametric opposite of what we are supposed to be doing. 

     What I'm concerned about, though, is that there is an incentive, particularly for malevolent states, to go after the platforms because they think of them as an easy target. 

     And that undermines individual accountability.  If somebody commits an act of hate speech in a country, that person is who is, and should be, accountable.  The platform should only be accountable insofar as it fails to remove the content upon knowing of its existence. 

     I'm quite concerned that in our engagements, our analyses, our reports and so on, we don't look at the history of safe harbor laws in the United States, particularly Section 230 of the Communications Decency Act and so on.  And speaking of obscenity, I don't know how Russia can have an obscenity law and still have content that includes Vladimir Putin. 

     But we need to differentiate that from more active moderation.  And I think we do need to go back to why safe harbor came about and what takedown actually entails: this idea that a platform ought to enjoy legal protection, and that we ought not to view somebody as a publisher merely because they are disseminating content, unless they know that it's wrongful content.  And that's just my critique to Chatham House on their report and so on.  Thanks.
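
     To make the distinction the speaker is drawing concrete, the following is a minimal illustrative sketch in Python; it is a simplification for this transcript, not drawn from any statute or from the panel, and the Platform model, the notice mechanism, and the two regime functions are hypothetical names chosen for illustration.

        from dataclasses import dataclass, field

        @dataclass
        class Platform:
            notices: set = field(default_factory=set)  # content IDs the platform was notified about
            removed: set = field(default_factory=set)  # content IDs it has taken down

        def liable_under_safe_harbor(p: Platform, content_id: str) -> bool:
            # Notice-and-takedown: liability requires knowledge (a notice)
            # plus a failure to act on that knowledge.
            return content_id in p.notices and content_id not in p.removed

        def liable_under_strict_regime(p: Platform, content_id: str, unlawful: bool) -> bool:
            # "Hostage-taking" style regime: the platform answers for unlawful
            # content directly, whether or not it ever knew about it.
            return unlawful and content_id not in p.removed

        p = Platform()
        p.notices.add("post-42")
        print(liable_under_safe_harbor(p, "post-42"))           # True: knew and did not act
        print(liable_under_safe_harbor(p, "post-99"))           # False: no knowledge, no liability
        print(liable_under_strict_regime(p, "post-99", True))   # True: knowledge is irrelevant

     Under the first function, removing content after notice restores the safe harbor; under the second, only proactive policing of everything avoids liability, which is the shift in incentives the speaker is warning about.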

     >> MODERATOR: It's always useful to have feedback.  Thank you very much for that. 

     I thought that there was a gentleman back here who wanted to speak.  I don't know if he is still here. 

     If not, I know you wanted to ask a question.  Oh, you are -- after -- so we have a question from here and then I will perhaps take one last question.  Okay, one last person here, and then we will go to the quick fire round from the speakers.  Do you want to come here to --

     >> AUDIENCE: I could talk here.

     >> MODERATOR: I don't know.

     >> AUDIENCE: Okay.  Thank you very much for this opportunity. 

     Actually, we have heard a few pieces of good news during these few days, apart from the multi-stakeholder approach that we are seeing here. 

     One of them was the UNESCO approach of providing a kind of universal, global framework for regulating social networks.  Whether or not it is possible, I think it is very ambitious, and it is good to have such an idea. 

     But I would like to add to your very interesting comments that the gap that governments are moving to fill was created essentially by the tech companies themselves.  And regardless of the different sociopolitical contexts, we have some shared, common concerns; there is a very common gap that the tech companies could do something to fill.  One of them is being depoliticized and not acting in a government's sandbox. 

     For example, we have seen complaints from a few countries that very clear-cut and very symbolic cases of hate speech are happening over Meta platforms, Instagram and Facebook, but the companies are not doing anything in those countries because of their political approach. 

     And they are also implementing unilateral sanctions, for example those imposed by the U.S., because they comply with U.S.-based regulations and laws.  And that has created a lot of problems.

     >> MODERATOR: Sorry --

     >> AUDIENCE: That's all.  Thank you for giving me your seat.

     >> MODERATOR: Sorry for kicking you out.  So okay.  So we had one last intervention from here.  Did you have an intervention?  If you can borrow someone's microphone.

     >> AUDIENCE: Sorry, everyone.  My name is (?) I'm from (?) of South Africa. 

     I think one word -- well, two terms -- I have been hearing: we are speaking on a global framework, obviously looking for a global standard, et cetera, which works. 

     But I think the ideology of one size fits all is quite problematic.  And I will tell you why. 

     In South Africa we are now looking at community standards and guidelines on social media platforms.  Hate speech might not be the same terminology in the U.S. or in Ghana or in any other country.  And the problem arises when there is a serious digital offense that needs to be taken down, but under the community guidelines the platform does not act accordingly.

     But on a side note, I wanted to add and ask, since we are speaking on government: sometimes government itself doesn't have the literacy and knowledge to actually engage when you are talking regulations. 

     They themselves don't understand the acts and legislation that they are speaking on and implementing, even as civil society and the other stakeholders come into play to try to help them understand why we need to implement such measures.  But at the same time, we still find them having that kind of push-over effect and then taking the lead in it.  Thank you.

     >> MODERATOR: Sorry about being the grinch here. 

     So now, as we conclude our session -- I swear we could go on forever; I think it is my favorite session of the IGF -- I ask each of you to share a 30-second quick-fire conclusion.  I hope you have warmed up.  Meg, if you can share your 30-second quick-fire conclusion, please.  Thank you.

     >> MEG CHANG: I think I will focus in on standards themselves, because I do think that there can be a global standard.  Industry is working on it.  And when we say standards, I think a lot of us have different ideas of what standards are and of the level of granularity. 

     So what we are trying to do from an industry standpoint is create at least global baseline standards.  One example could be ensuring all companies have a human rights policy to address the issues that you are all talking about here.  Guiding principles around human rights that all companies should abide by are one example of a global standard that could be set out. 

     Another one is transparency standards; a starting point for accountability is providing transparency into how different platforms operate.  And user empowerment could be another one.  So in terms of standards, I think there is a way to go about it that can work as a global framework.

     >> MODERATOR: Thank you.  Sorry about that.  Aman, you are next.

     >> AMAN NAIR: Yeah, thank you.  So I guess my last point would just be on not thinking of platforms as opposing entities. 

     When we think of regulation in the Indian context we are extremely skeptical because we often don't think of regulation as being well meaning or furthering the rights of individuals. 

     I think we spoke a lot about what governments could do in regulating platforms.  I think platforms have an equal responsibility in ensuring that their services function properly within majority world states and are properly funded, and that they push back against provisions and regulations that violate human rights, specifically surveillance provisions. 

     In India, Twitter and WhatsApp are trying to push back against surveillance, and I would urge them to be more active participants in the discussion.

     >> USAMA KHILJI: For me the interoperability of the global internet is very important, and we are moving towards a world where there are too many borders on the internet. 

     We have seen how Russia has been cut off since the conflict; China cut itself off.  And we are seeing how governments are censoring in particular ways, and that's problematic.  Global standards are possible, and I think the community standards and rules on social media platforms are pretty good.  The issue is implementation in local contexts, and I think that is where the solution can possibly lie.

     >> MODERATOR: Thank you.

     >> GBENGA SESAN: We definitely need to break the silos and talk to each other more: government and civil society and private sector and everyone.  That is one. 

     Secondly, when we call these Western values, I get worried, because human rights are universal.  And the truth is that in the African context we can talk about (?); these are locally understandable concepts that define human rights.  It's not Western values, it is human values.

     >> MODERATOR: Thank you.  Juan Carlos.

     >> JUAN CARLOS: Thank you.  So I agree that standards are good and possible, at least to some degree.  But they are kind of empty without implementation and accountability. 

     So for regulatory initiatives to be as good as this panel in terms of depth and knowledge, the regulatory discussions that are taking place everywhere need to be as open as this one, and even more open.  Echoing others: we need to include even more voices.  And they all need to be very sensitive to local frameworks and local understandings as well.  Thank you.

     >> MODERATOR: Jackie.

     >> JACQUELINE ROWE: Great, thanks, Yasmin.

     Just to echo the other panelists, really: I think we can acknowledge that different jurisdictions are trying to solve different problems.  Maybe in global minority countries child safety is the big priority, whereas in other contexts hate speech and marginalized languages are the most pressing issues.  I think we can acknowledge those differences but also advocate for a human rights-centered approach and human rights principles that work, and that's where I think the global forums will be really important.

     >> MODERATOR: On that note, thank you so much, everyone, for bearing with me. 

     Thank you to the panelists for joining us and for the really stellar interventions from everybody.  And I hope that we can all stay in touch.  Thank you very much.

     >> Thank you to you for moderating it so well.