IGF 2020 WS #205 From Content Moderation to Censorship? How to safeguard free expression

Organizer 1: Victoria de Posson, CCIA Europe
Organizer 2: Jens-Henrik Jeppesen, Center for Democracy and Technology

Speaker 1: Emma Llanso, Civil Society, Western European and Others Group (WEOG)
Speaker 2: Helani Galpaya, Civil Society, Asia-Pacific Group
Speaker 3: Victoria de Posson, Technical Community, Western European and Others Group (WEOG)

Moderator

Victoria de Posson, Technical Community, Western European and Others Group (WEOG)

Online Moderator

Victoria de Posson, Technical Community, Western European and Others Group (WEOG)

Rapporteur

Victoria de Posson, Technical Community, Western European and Others Group (WEOG)

Format

Birds of a Feather - Auditorium - 90 Min

Policy Question(s)

More and more countries are unilaterally adopting new intermediary liability laws, even though the Internet is global. What are the policy and legal implications of such measures for freedom of expression and our democracies? What is the role of governments, civil society, industry, and users/consumers? What is the right balance between government regulation of speech and private companies’ moderation of online content? How can we work together to enable an open Internet that empowers individuals? How can we prevent or minimise the emergence of conflicting laws with which companies must comply?

* Difference and definition of illegal content vs. harmful content (disinformation, misinformation, hate speech, fake news): stress the impact of local contexts.
* Platforms’ characteristics: no one-size-fits-all. Stress the complexity of having different intermediaries, different services, and different business models.
* Under-blocking vs. over-blocking: the challenge for intermediaries of finding the right balance between over-blocking and under-blocking.
* Responsibility of platforms towards:
  ° Content providers: the opportunity to have their content amplified, or the challenge of being seen.
  ° Users: the opportunity to get information easily, or the challenge of differentiating ‘real’ information from fake news.
  ° Human rights: the challenge of limiting abuses online (terrorist content, child pornography, cyberbullying, hate speech, defamation, etc.).

SDGs

GOAL 4: Quality Education
GOAL 9: Industry, Innovation and Infrastructure
GOAL 11: Sustainable Cities and Communities

Description:

Today, it is unthinkable for a politician, an association, or a company to be absent from the Internet. Most use online platforms to connect with citizens, voters, and customers. Online platforms create spaces for people to communicate with others, both locally and globally. The policies that platforms use to moderate content shape online spaces and potentially determine what content can be shared or amplified. These policies are not developed in a vacuum: they are informed by underlying legal frameworks, business priorities, the values of the service providers, and the communities using the platforms.

This workshop will focus on the importance of freedom of expression in the framing of online content moderation policies across jurisdictions. We will examine the ways that laws governing intermediaries’ liability for user-generated content affect individuals’ human rights and enable or interfere with different approaches to content moderation. This discussion is pertinent, as countries across the world are considering changes to their legislative frameworks for hosting user-generated content. We will discuss approaches and trends in intermediary liability frameworks across multiple countries and regions, including the EU, the US, Southeast Asia, and Japan. This will include an exploration of the human rights risks in (i) proposals to fight illegal content; (ii) legislative initiatives restricting lawful but “harmful” content; and (iii) renewed regulatory interest in content filters and other “proactive measures”. We will also examine emerging best practices around different forms of transparency reporting and how these can support oversight and accountability in content moderation. The panel will discuss best practices for shaping company and government policy that strikes a balance between addressing illegal content and mitigating the harmful effects of lawful speech, while respecting users’ human rights and preserving an open and free Internet.

INDICATIVE AGENDA

A. Introduction by the moderator and introductory remarks by the speakers (30 min)
B. Discussion among panellists (30 min)
* Is freedom of expression at risk due to disinformation and fake news? --> Question to Emma Llansó, Free Expression Director, Center for Democracy and Technology
* Southeast Asia’s perspective on the interplay between freedom of expression and content moderation policies [(i) illegal content and (ii) lawful but harmful content] --> Question to Helani Galpaya, CEO, LIRNEasia
* Japan’s perspective on renewed regulatory interest in content filters and other “proactive measures” --> Question to Masayo Ogawa, Japanese Ministry of Foreign Affairs
* Google’s perspective on the challenges and the approach taken by the industry to find the right balance in content moderation --> Google is very interested in participating but needs more time to provide a speaker’s name
C. Q&A session with the audience (25 min)
D. Concluding remarks by the moderator (5 min)

Expected Outcomes

(a) Recognise that rules need to be modernised to meet the needs of today’s digital realities while respecting freedom of expression.
(b) Contribute to ongoing and future multilateral and bilateral dialogues that aim to establish common approaches to intermediary liability frameworks and to explore developing norms around content moderation.
(c) Develop a joint CDT-CCIA white paper on the balance between content moderation policies and freedom of expression.

Short three-to-five-minute presentations by the speakers will open the discussion and encourage contributions. 80% of the workshop’s time will be allocated to open discussion. On-site and online participants will be encouraged to present their views and possible solutions.

Relevance to Internet Governance: Whether and how to regulate user-generated online content is a key issue at the heart of Internet governance. This session will explore the various modalities of Internet governance, including laws, company policy, and user behavior and norms, and the roles of different stakeholder groups, including government, industry, and civil society. We will focus on how the decisions made by these different actors, through these various mechanisms, directly and indirectly affect people’s enjoyment of their fundamental rights.

Relevance to Theme: People’s ability to enjoy their rights to freedom of expression, access to information, freedom of association, and privacy online all depend on trust. We must be able to understand the laws and policies that will be applied to our online speech and to trust that they will be enforced fairly and transparently. We must be able to trust that we will know when governments have been involved in restricting our speech and access to information, or obtaining information about us from private companies, so that we can hold our governments accountable. Companies must provide clear and honest explanations for how they determine what information we do and don’t see online, so that we can choose whether to entrust them with our speech and personal information. Ultimately, a lack of trust can exert a strong chilling effect on people’s willingness to participate online and can disproportionately affect already marginalised groups and individuals. A clear understanding of how governments and companies should act to promote people’s rights to free expression and privacy is essential to building and restoring trust in the Internet as our predominant communications medium.

Online Participation


Usage of IGF Official Tool. Additional Tools proposed: We would like to have a Twitter hashtag through which the audience can interact and ask questions.