IGF 2020 WS #351 Information Disorder and the Virus


Organizer 1: Civil Society, Latin American and Caribbean Group (GRULAC)

Speaker 1: Laura Tresca, Civil Society, Latin American and Caribbean Group (GRULAC)
Speaker 2: Jan Gerlach, Civil Society, Western European and Others Group (WEOG)
Speaker 3: Carlos Affonso de Souza, Civil Society, Latin American and Caribbean Group (GRULAC)


Debate - Auditorium - 60 Min

Policy Question(s)

How might the fight against Covid-19 change the debate over content removal and the role of intermediaries?

It is commonplace to state that the Covid-19 crisis is also an information crisis. But what lessons have activists, governmental authorities, academics and the private sector learned when reflecting on the role of online intermediaries in dealing with the information disorder related to the pandemic? The Internet ecosystem can provide the perfect conditions for disinformation to spread during a global health crisis. When a large amount of information circulates across multiple communication channels in a fast, continuous and repetitive manner, the challenges of content moderation are far greater. However, this setting also provides an opportunity to understand the relevant roles that different intermediaries play in how we create and receive information.


GOAL 9: Industry, Innovation and Infrastructure
GOAL 17: Partnerships for the Goals


Major Internet platforms have updated their content removal policies in the context of the fight against disinformation related to the Covid-19 pandemic. Several users, including authorities from countries such as Brazil and Venezuela, have had their posts removed or flagged as misleading or false. What will be the legacy of the fight against Covid-19 for how we understand Internet intermediaries’ roles and responsibilities? From the foundations of Section 230 of the Communications Decency Act in the United States to the discussions about platform liability in a European Digital Single Market, the debate on the role of intermediaries seems to be entering a new chapter in the fight against the new coronavirus. As the world turns to the Internet as a major resource for keeping families, companies and Governments connected throughout the crisis, online platforms can be either part of the solution or part of the problem, depending on whether they enforce their own content rules in an accountable, transparent and coherent way. The challenge falls not only on the big commercial platforms, as many different providers are trying to find new ways to moderate content that misinforms and to promote reliable information to and from their users.

Expected Outcomes

Map out a problem or issue area / Strategize with key stakeholders on paths

Time will be allowed for audience intervention, enabling a concrete exchange of experiences and reflections between speakers and participants. The workshop will start with a 5-minute explanation of the topic's relevance, conducted by the moderator. Afterwards, each guest will have 10 minutes to present their opinions and arguments and to share their professional trajectories. Following this first segment, 20 minutes will be reserved for audience intervention, with questions directed to the speakers, followed by concluding remarks from each guest.

Relevance to Internet Governance: The liability of online intermediaries and their content moderation regimes are an important aspect of the debate around freedom of expression, a quintessential topic for Internet governance and regulation. The IGF meetings have served as a catalyst for a number of debates concerning free speech online, from Internet shutdowns to the rise of fake news. Therefore, it is only natural that a Forum dedicated to discussing state-of-the-art themes in Internet Governance should host debates on how the recent fight against Covid-19 might change the way in which content removal and intermediaries’ liability are addressed by different stakeholders, such as governments, civil society, the private sector and the technical community.

Relevance to Theme: In 2019, the European Court of Justice decided that Facebook could be ordered to track and remove content globally if it was found to be illegal in one EU country. This decision represents a major step toward forcing hosting platforms to take greater responsibility for what is posted on their networks. From notice-and-takedown rules to removal by judicial order, the extent of platforms' liability for content published by third parties is a key element of the debate on freedom of speech online. The fight against Covid-19 has forced platforms to adopt new content moderation rules and triggered efforts by governments to reflect on the role of such intermediaries in public discourse. This workshop proposal aims to provide different stakeholders' views on the lessons learned and on how the current health crisis might impact future initiatives on content moderation and liability regimes.

Online Participation


Usage of IGF Official Tool.