IGF 2020 WS #133 Content moderation and Competition: The Missing Link?

Subtheme

Organizer 1: Maria Luisa Stasi, ARTICLE 19
Organizer 2: Gabrielle Guillemin, ARTICLE 19

Speaker 1: Oli Bird, Government, Western European and Others Group (WEOG)
Speaker 2: Melanie Rivest, Government, Western European and Others Group (WEOG)
Speaker 3: Sebastian Schwemer, Civil Society, Western European and Others Group (WEOG)
Speaker 4: Nicolo Zingales, Civil Society, Latin American and Caribbean Group (GRULAC)
Speaker 5: Dorota Glowacka, Civil Society, Eastern European Group

Moderator

Maria Luisa Stasi, Civil Society, Western European and Others Group (WEOG)

Online Moderator

Gabrielle Guillemin, Civil Society, Western European and Others Group (WEOG)

Rapporteur

Gabrielle Guillemin, Civil Society, Western European and Others Group (WEOG)

Format

Round Table - Circle - 90 Min

Policy Question(s)

• Is a centralised or a decentralised system the best model for content moderation online?
• What role can economic regulation and competition play in addressing the challenges of content moderation online?
• How can bottlenecks on social media markets be addressed? Would unbundling hosting and content moderation activities on social media platforms help?

We intend to address the challenges of content moderation on social media markets and to explore the role that economic regulation and competition could play in solving them. We would like to do so with the stakeholders directly concerned (social media platforms, regulators and civil society) and with the support of academics. In particular, this session will explore the pros and cons of instruments such as the unbundling of hosting and content moderation activities, as well as other contractual and technical ways to deal with content moderation issues (for instance, market investigations and codes of conduct), as an alternative and/or a supplement to existing platform responsibility regimes.

SDGs

GOAL 9: Industry, Innovation and Infrastructure

Description:

Social media have revolutionised the way we communicate, access and share content. Social media platforms deliver content that others have created, and do so by relying on automated content selection systems. The way content is selected and moderated by these platforms plays a key role in phenomena such as the dissemination of disinformation and hate speech, the creation of filter bubbles, and the reduction in the diversity and plurality of voices that each user is exposed to. As a result, content selection and moderation raise growing concerns worldwide, with governments (and the EU) considering regulatory instruments to address these challenges.

Faced with increasing pressure to assuage those concerns, some platforms are developing their own solutions. Facebook, for example, is setting up an oversight board, which will provide an avenue for appeal and further consideration of some of the company’s most controversial decisions to remove content. Twitter, by contrast, is working towards an open and decentralised protocol for social media networks, which implies opening up content decisions to third-party services. It is crucial that we understand the different degrees of openness that these two initiatives involve, their effectiveness in addressing the above-mentioned concerns, and their likely impact on market structure.

Drawing on the experience of competition law and regulation in “opening up” markets affected by key bottlenecks, this session will explore the important role that economic regulation can play in protecting freedom of expression and pluralism online, while also pointing out the challenges of applying traditional bottleneck concepts to social media. The moderator will set the scene, framing the key points for discussion and asking participants to explain their positions on them and to put forward proposals.
The diversity of participants in terms of stakeholder groups, experience and skills will ensure that various perspectives are analysed and debated. The moderator will work with participants to try to reach a consensus on some of these key points and to strategize about possible ways forward. Additional Reference Document Links: https://www.article19.org/resources/why-decentralisation-of-content-mod…

Expected Outcomes

The session aims to facilitate a multistakeholder dialogue on the relevant topics. It is expected to build consensus on a number of key issues, to shed light on areas where approaches conflict and to identify the reasons why, and to strategize about possible ways forward. In addition, the organisers will produce a summary report of the discussion and a brief list of pros and cons with regard to the main topics on which the debate focuses.

The moderator will set the scene, framing the topics and posing a list of key questions to participants. The moderator will first ask the invited speakers to make their points on the questions; she will then turn to the rest of the table, providing a slot to each participant. At the end, the moderator will briefly sum up the arguments and proposals developed during the discussion and invite the speakers to make brief final comments. Overall, the moderator will organise interaction and participation with the aim of building consensus on key points.

Relevance to Internet Governance: Content moderation online raises a number of fundamental challenges for society. A proper answer to those challenges can only come from a multistakeholder dialogue about which norms, rules and procedures should be adopted to guarantee the free flow of information online and to correct market failures.

Relevance to Theme: Trust is an essential prerequisite for the development of online content moderation systems capable of guaranteeing the free flow of information, respect for users’ freedom of expression and information, and users’ safety online. Trust requires the security, stability and resilience of content moderation systems, but also their transparency and the inclusion of mechanisms for user empowerment. The proposed session will consider all these aspects while trying to reach a multistakeholder consensus on some key aspects of possible content moderation models. Among other things, the session will try to identify best practices for protecting both systems and users and to define the appropriate roles and responsibilities of platforms, governments and other relevant stakeholders.

Online Participation


Usage of IGF Official Tool. Additional Tools proposed: Zoom