IGF 2021 WS #280
Regulating (Dis)content

Organizer 1: Jason Pielemeier, Global Network Initiative

Speaker 1: MARTIN RAUCHBAUER, Government, Western European and Others Group (WEOG)
Speaker 2: Emma Llanso, Civil Society, Western European and Others Group (WEOG)
Speaker 3: Agustina Del Campo, Civil Society, Latin American and Caribbean Group (GRULAC)
Speaker 4: Alex Walden, Private Sector, Western European and Others Group (WEOG)
Speaker 5: KS Park, Civil Society, Asia-Pacific Group


Onsite Moderator

Jason Pielemeier, Private Sector, Western European and Others Group (WEOG)

Online Moderator

David Kaye, Civil Society, Western European and Others Group (WEOG)


Rapporteur

Jason Pielemeier, Private Sector, Western European and Others Group (WEOG)


Round Table - Circle - 60 Min

Policy Question(s)

Content moderation and human rights compliance: How to ensure that government regulation, self-regulation and co-regulation approaches to content moderation are compliant with human rights frameworks, are transparent and accountable, and enable a safe, united and inclusive Internet?
Protecting consumer rights: What regulatory approaches are or could be effective in upholding consumer rights, offering adequate remedies for rights violations, and eliminating unfair and deceptive practices on the part of Internet companies?

Governments have been scrambling to develop different ways to regulate "online harms." These efforts exhibit a variety of approaches, but one common trend is the outsourcing of content adjudication to online service providers. Unfortunately, some of these efforts show a lack of consideration for international human rights law or insufficient understanding of the underlying protocols, standards, and services that make up the global internet. As a result, these efforts risk creating unintended consequences, including pushing content moderation responsibilities onto service providers ill-equipped to carry them out effectively or proportionately. This in turn creates significant risks for privacy, freedom of expression, and interoperability. This session will reflect on some of these efforts and foster discussion among stakeholders from government, the private sector, and civil society on how they can work collaboratively toward content regulations that address harms while protecting freedom of expression, privacy, and other fundamental rights.


16. Peace, Justice and Strong Institutions
17. Partnerships for the Goals

Targets: Our session proposal is intimately linked to the goals of SDG 16 (Peace, Justice and Strong Institutions), in its emphasis on promoting a rule-of-law-based (16.3) and participatory (16.7) approach to the development of content regulations, with a particular emphasis on accountability and transparency in content-related adjudication (16.6) and protecting fundamental freedoms (16.10). In addition, as the world's leading multistakeholder initiative focused on digital rights, GNI takes a partnership- and collaboration-based approach to all of its work that is consistent with and representative of SDG 17 (Partnerships for the Goals). In particular, this session is designed to foster broader policy coherence across developed and developing economies (17.14) and encourage further multistakeholder collaboration (17.17) in the area of technology regulation.


Governments are developing different approaches to address challenges related to "online harms" (which we refer to as "content regulation," as compared to "content moderation" conducted by private service providers). Meaningful differences in definitions and approaches taken by governments allow for comparative analysis and learning. The multistakeholder membership of the Global Network Initiative recently studied two dozen different "content regulation" initiatives from around the world, analyzing them through the lens of international human rights law. The resulting “Content Regulation and Human Rights” policy brief, published by GNI, provides practical, proactive guidance to governments as they continue to contemplate how best to achieve legitimate public policy goals through content regulation. This session will use lessons and recommendations from that brief to foster a rich conversation between speakers representing governments, academia, civil society, and service providers.

In 2022, Section 230 in the U.S., the Digital Services Act in the EU, and several other regulatory proposals will be at the fore of public debate. These proposals risk creating unintended impacts on users' rights, as well as on interoperability. GNI's recently published brief offers recommendations on how to implement content regulations that are effective, fit for purpose, and protect and enhance fundamental rights. Drawing on the brief and on the expertise of the panelists and the audience, this session will reflect on the current regulatory landscape and discuss proactive next steps for advocating for an international human rights lens in content regulation.

Expected Outcomes

We expect that the session will provide a space for GNI, speakers, and the audience to reflect on content regulation globally and its impact on human rights. The session will also spend time discussing proactive steps that stakeholders can take in their respective organizations and countries to better advocate for the inclusion of a human rights framework in the formulation of content regulation. GNI will catalogue suggestions and learnings in order to inform our ongoing advocacy and future related events.

The session will be designed as an interactive conversation between the moderator and speakers, drawing on their collective expertise to enrich the discussion. We will make use of the upvote Q&A function to allow the audience to participate as we go, with the moderator weaving in relevant audience questions as different speakers make their points.

Online Participation

Usage of IGF Official Tool.