IGF 2020 WS #285 UGC platforms: Towards a platformization of the regulation?

Thematic Track

Organizer 1: Henri Isaac, Renaissance Numérique
Organizer 2: Jessica Galissaire, Renaissance Numérique
Organizer 3: Lucien M. CASTEX, ISOC

Speaker 1: Max Senges, Private Sector, Western European and Others Group (WEOG)
Speaker 2: Bilal Abbasi, Government, Asia-Pacific Group
Speaker 3: Karolina Iwanska, Civil Society, Eastern European Group

Moderator

Lucien M. CASTEX, Technical Community, Western European and Others Group (WEOG)

Online Moderator

Jessica Galissaire, Civil Society, Western European and Others Group (WEOG)

Rapporteur

Format

Debate - Auditorium - 90 Min

Policy Question(s)

1) Toxic content and online safety
Topics: disinformation, terrorist, violent and extremist content (TVEC), deepfakes, hate speech, freedom of expression, platforms, inclusive governance, human rights
How can the diversity of platforms be integrated into the challenges of moderation, in accordance with our fundamental rights and freedoms?

2) Online platform regulation and democracy
Topics: freedom of expression, democracy, platforms, inclusive governance
Through what mechanisms can we give the billions of users of online platforms a voice in regulation?

3) Platformization and regulators' tools
Topics: disinformation, terrorist, violent and extremist content (TVEC), deepfakes, hate speech, freedom of expression, democracy, platforms, inclusive governance, human rights
How can regulators organize regulation according to the same principles as platforms, by constructing principles of collaboration with the ecosystem and adequate regulatory tools (indicators, algorithms, etc.), and by imposing new obligations on platforms (such as interoperability)?

The digital platforms that host user-generated content (UGC) are diverse. They differ in many respects: the type of content they host (text, video, live streaming); how that content is mediated and curated; the services and functions they offer (private chat, marketplace, etc.); their organizational model; their economic and monetization model; their size and geographic presence; their relationship with users; their moderation practices; and so on. In the fight against the spread of toxic content, regulators focus their attention on only a handful of these platforms: those which, because of the prevalence of their use among users across the world, contribute greatly to restructuring the public and democratic space and the contours of public debate. While it is necessary to consider these actors, this narrow focus does not allow us to tackle the entire problem. All platforms hosting user-generated content bear responsibility and must be taken into account when platform regulation is formulated.

Because these questions relate to the widespread use of digital platforms by citizens, consumers, and businesses, rethinking the regulation of such issues without integrating users in one way or another amounts to denying part of the transformation brought about by this platformization of our online space. Bringing the users of digital platforms into regulation is simply the counterpart of the fact that they are co-contributors to the creation of value on these platforms, including through the sharing and exploitation of their data. Through what mechanisms can we give the billions of users of these platforms a voice in this regulation? How can regulators organize regulation according to the same principles as platforms, by constructing principles of collaboration with the ecosystem and adequate regulatory tools (indicators, algorithms, etc.), and by imposing new obligations on platforms (such as interoperability)?
There are multiple options for involving more stakeholders, be they users of the platforms, citizen associations (consumer advocacy, rights advocacy, etc.), researchers, or even competitors. How can we ensure that democratic institutions foster this collaboration while retaining control of the regulatory processes? How can regulation escape the current bilateral relationship between governments and major platforms?

SDGs

GOAL 9: Industry, Innovation and Infrastructure
GOAL 16: Peace, Justice and Strong Institutions

Description:

The common thread of the discussion: in the fight against toxic content, in the era of the platformization of our online space, regulators must adapt their approach, both in terms of the means at their disposal (skills, tools, etc.) and of their principles of collaboration with stakeholders. The discussion will focus on concrete governance mechanisms for the regulation of digital platforms that host user-generated content (UGC). To this end, it will draw on two recent publications of the think tank Renaissance Numérique (to be published in May 2020): the first re-examines moderation practices and the regulation of those practices; the second concerns the regulation of so-called "structuring" digital platforms (a status owed to their prevalence among citizens, consumers, and businesses globally). The aim will be to test these concrete recommendations and to consider how they can be implemented at multiple scales, from the international to the local level, and according to regional contexts. The discussion will proceed in two stages: the first will consider the diversity of digital platforms hosting user-generated content and the limits of current regulatory approaches; the second will question these methods and debate the merits of integrating users more fully into regulatory processes.

Expected Outcomes

The panellists represent government, civil society, the private sector, and an intergovernmental organization. This diversity will allow the think tank's recommendations to be compared with the realities these actors face in their different contexts, in order to assess the recommendations' potential impact and capacity for adoption. Depending on the outcome of this discussion, they could be promoted more widely through a publication by our think tank.

The recommendations under discussion will be shared ahead of the session with the speakers and, if possible through the IGF organizers, with the participants, so that everyone can prepare for the discussion and contribute to the debate in a relevant way. The principal moderator will frame the discussion around the different issues we want to address. The second moderator will serve as timekeeper, helping the main moderator distribute time equally between interventions: speakers' introductory remarks should not exceed 5 minutes, and interventions by participants or speakers in the debate, 3 minutes.

Relevance to Internet Governance: The subject of this workshop is directly related to Internet governance, since it examines how the regulation of digital platforms that host user-generated content is constructed (in a sense, the 'governance of regulation') and questions the capacity of this governance to effectively integrate a broad spectrum of stakeholders.

Relevance to Theme: Trust in the online world requires more democratic, participatory modes of regulation, which are not founded solely on representative bodies distant from users (including those that are supposed to represent them). On digital platforms that host user-generated content (UGC), users are essential pillars of value creation and also bear a strong share of responsibility. This active role must be recognized in the regulation of these platforms, at several levels. By debating concrete recommendations, this session will contribute to the track's objective of promoting best practices.

Online Participation

Usage of IGF Official Tool.