IGF 2020 WS #143 Don’t just delete, discuss – moderating for online trust

Thematic Track

Organizer 1: Michelle van Raalte, RNW Media
Organizer 2: Anna Hengeveld, RNW Media
Organizer 3: Khalid Walied, RNW Media

Speaker 1: Maxence Melo, Civil Society, African Group
Speaker 2: Mazhar Tasleem, Civil Society, Western European and Others Group (WEOG)
Speaker 3: Alaa Abou Ghrara, Civil Society, African Group

Moderator

Khalid Walied, Civil Society, Intergovernmental Organization

Online Moderator

Michelle van Raalte, Civil Society, Western European and Others Group (WEOG)

Rapporteur

Anna Hengeveld, Civil Society, Western European and Others Group (WEOG)

Format

Break-out Group Discussions - Round Tables - 90 Min

Policy Question(s)

Is online community moderation sufficient to build trust with a local community? What other approaches could be implemented?
How can we scale up community moderation approaches to fit a wider audience of stakeholders (e.g. private sector platforms, governments)?
How can the community moderation approach be made sustainable? Can we create self-moderating communities, or is outside moderation always needed?
How can a community moderation approach contribute to the debate around the challenges of content governance? And how do you minimise censorship?
Can community moderation serve as a model to encourage responsible behaviour online?
How can we ensure our community moderation approach promotes respectful dialogue while adhering to the fundamental right to freedom of expression?

The failure to put the brakes on online polarisation comes at a high price. The sexism, racism, ageism, homophobia and xenophobia already present in the offline space are on the rise on the internet. An unbridled Internet worsens socio-political and cultural divides, and we can see how those divides have at times resulted in dangerous citizen engagement and political leadership. However, excessive censorship is not the solution to this problem. In recent years there has been contentious discussion of strategies for moderating harmful content, and Big Tech's attempts at moderation have exposed the limits of current approaches. It is difficult to curb harmful speech through algorithms: educational content, for example, may be wrongly censored while racist, hateful content slips through. At the same time, manual content vetting cannot contribute meaningfully to reducing polarisation. That Big Tech faces difficulties, even setting aside critiques of its business model, is no surprise. It is challenging to create respectful conversation, especially in restrictive and polarised societies, and the keyboard can be a ready vehicle for inflaming inter-community conflicts. Our proposed workshop seeks to work through the challenges described above. We want to tap into the expert knowledge of participants at the IGF and, at the same time, support IGF community members in learning from each other so we can take our collective expertise to the next level. Big Tech's well-known struggle with moderation presents an opportunity: collectively, we can work on a better solution.

SDGs

GOAL 5: Gender Equality
GOAL 10: Reduced Inequalities
GOAL 16: Peace, Justice and Strong Institutions

Description:

Trust is an essential element of interpersonal relations within communities. It’s no different for digital communities. Positive dialogue and constructive conversations are fundamental to creating a trusted environment for community members. This is especially crucial when some members belong to vulnerable or marginalised groups. The internet facilitates anonymity. That can mean people engage more readily in negative, stigmatising or discriminatory discourse, and it makes it easier to spread disinformation. When discourse takes such a turn, it can severely hamper trust building within an online community, and even put the community or its members at risk.

RNW Media is the lead organisation proposing this workshop on trust. With a legacy of more than 75 years of experience in media, our core business is building digital communities. We work with young people living in restrictive settings to promote their digital rights and support them to use the Internet to bring about positive social change. Trusted online communities are particularly important in restrictive environments. They can offer safe, civic spaces for young people across political, ethnic, racial, regional or religious divides to come together in ways that are often difficult offline.

For most digital platforms, moderation refers to the practice of identifying and deleting content by applying a pre-determined set of rules and guidelines. RNW Media implements ‘community moderation’, which aims to encourage young people to engage in respectful dialogue, allowing everyone to have a voice. Careful strategic moderation of online conversations helps build trust among community members, who then feel safe to express themselves freely. This in turn nurtures diverse and resilient communities with strong relationships among members. The proposed workshop is first and foremost intended to be an interactive opportunity for linking and learning.
We pose the central question: how can you best build trust? We want to discuss the state of knowledge, practical lessons learned and best practices for keeping negative, harmful discourse at bay and for encouraging tolerance, acceptance of diversity and, eventually, common ground. We want to share what we know and learn from other practitioners, with the overall goal of advancing meaningful online communication and good Internet governance.

The proposed workshop is a 90-minute breakout session. The agenda is as follows:
1. Scene-setting with a multi-media introduction of approaches to addressing negative discourse. This presentation creates an immersive experience of how community moderation works to promote trust (10 min)
2. Plenary Q&A (5 min)
3. Small group work: break out into groups of 5-6 members (25 min). Each group receives a case: a story describing an online community and two of its members, each with their own particular life circumstances, who engage in discussion on a topic (the story relates their conversation). The discussion turns into negative discourse. Participants are asked to respond to issues such as: the moderation techniques they would employ in such circumstances; the existence of their own or others’ guidelines for such situations and how these could be applied; the pros and cons of different moderation approaches; where the responsibility lies in dealing with the situation; and the risk of provoking further negative responses. Participants will also be given a set of our moderation guidelines to consider using in such a situation and invited to suggest improvements or alterations.
4. Presentations from small groups (30 min)
5. Open floor discussion (10 min)
6. Synthesis of knowledge shared (10 min)

Expected Outcomes

Outputs
Knowledge and ideas exchanged on addressing negative, polarising discourse, including through the role of community moderation.
Report on the workshop made available to all workshop participants and the IGF.

Outcomes
Policy thinking and practice related to internet governance enhanced through knowledge transfer.
Network of experts and practitioners on community moderation strengthened and expanded.
The IGF strengthened as a forum for linking and learning on building trust and internet governance.
Improved RNW Media strategy for effective community moderation and trusted environments for young people to engage in respectful dialogue online, through insights harnessed from the session.

After setting the scene by introducing RNW Media’s and JamiiForums’ approaches to moderating online communities to promote trust, the audience will have the opportunity to ask questions and make comments. Afterwards, the audience will split into groups to make the workshop actionable and applicable; these break-out sessions can also be organised online. In the breakout sessions, attendees will be given a case study and will then discuss different approaches to moderating the conversation. This will feed into a short presentation by each group and further discussion with the audience.

Relevance to Internet Governance: The approach to community moderation presented will contribute to new insights on building trust within online communities. By moderating discussions, instead of deleting comments, young people engage in respectful conversations leading to acceptance of pluralistic views and a trusted environment. This approach can shape a more inclusive use of the Internet and can be adapted and implemented by all stakeholders, such as governments, the private sector and other civil society organisations.

Relevance to Theme: This session addresses the trust theme directly (please see central question above). It does so by unpacking ideas, knowledge and best practices for building online communities in which members can safely contribute to the conversation. The springboard for the session discussion is RNW Media’s community moderation approach, which research has shown provides the opportunity for people across political, ethnic, racial, regional or religious divides to engage in respectful dialogue, allowing participants equal opportunities to express themselves. The session is designed to then grow into a broader discussion on trust building, through small group work. Participants are invited to bring their ideas, approaches, strategies and practices for addressing negative, polarising dialogue and nurturing safe online communities where trust is a central value and characteristic.

Online Participation

 

Usage of IGF Official Tool. Additional Tools proposed: With the use of Zoom, the online moderator will ensure questions and perspectives from remote participants are included in the session’s discussion and that remote participants are given the floor equally to share their ideas. Online participation will be further increased through the use of digital tools such as Mentimeter, which allows for polling among physical as well as remote participants. This will feed into the discussion and ensure all views and opinions are represented.