IGF 2022 WS #70 Fighting the creators and spreaders of untruths online

Time
Tuesday, 29th November, 2022 (10:50 UTC) - Tuesday, 29th November, 2022 (12:20 UTC)
Room
CR3

Organizer 1: Molly Lesher, OECD
Organizer 2: Hanna Pawelec, OECD

Speaker 1: Mark Uhrbach, Government, Western European and Others Group (WEOG)
Speaker 2: Julie Inman Grant, Government, Asia-Pacific Group
Speaker 3: Rehobot Ayalew, Civil Society, African Group
Speaker 4: Laura Zommer, Civil Society, Latin American and Caribbean Group (GRULAC)
Speaker 5: Sander van der Linden, Civil Society, Western European and Others Group (WEOG)

Additional Speakers

Pablo Fernández, Executive Director of Chequeado

Moderator

Molly Lesher, Intergovernmental Organization, Intergovernmental Organization

Online Moderator

Hanna Pawelec, Intergovernmental Organization, Intergovernmental Organization

Rapporteur

Molly Lesher, Intergovernmental Organization, Intergovernmental Organization

Format

Other - 90 Min
Format description: The session would take a workshop format (not available in the drop-down menu), with a panel discussion and a keynote speech by a leading academic in this area.

Policy Question(s)

How can policy help protect fundamental rights -- freedom of speech, thought and expression; the right to choose leaders in free, fair, and regular elections; and the right to privacy -- while effectively combatting the negative effects of untruths online for people and society? What other concrete approaches can complement policy tools to counter the circulation of the worst untruths online (e.g. disinformation)?

Connection with previous Messages: This event would explicitly address the following IGF 2022 theme: Connecting All People and Safeguarding Human Rights. The right to freedom of speech, thought and expression, and a free and independent press, are indispensable for the healthy functioning of democratic societies, as is the fundamental right to privacy. The right to health, especially in pandemic times, as well as the right to accurate information on grand challenges such as climate change, including its origins and impacts, are also negatively affected by untruths online.

SDGs

13.3

Targets:

3.d - Strengthen the capacity of all countries, in particular developing countries, for early warning, risk reduction and management of national and global health risks. Trust in the health information disseminated by entities such as the media, governmental bodies, and health professionals is essential, especially in pandemic times. However, some users of online platforms – including elected representatives – have taken to the Internet to spread misinformation and disinformation related to the global pandemic, thereby jeopardising our collective right to health.

13.3 - Improve education, awareness-raising and human and institutional capacity on climate change mitigation, adaptation, impact reduction and early warning. Access to accurate information on key issues such as climate change, including its causes and impacts, is essential to raising public awareness about this critical problem and to changing individuals' behaviour. The spread of untruths about climate change hinders our collective ability to take the steps needed to address this grand challenge.

16.10 - Ensure public access to information and protect fundamental freedoms, in accordance with national legislation and international agreements. Article 21 of the UDHR grants citizens the right to choose their leaders in free, fair, and regular elections as well as the right to access accurate information about parties, candidates and other factors that may influence voting. Political untruths negatively impact a country's politics, causing polarisation among communities, and they also sow distrust in democratic institutions such as governments, parliaments, and courts as well as distrust of public figures, journalists and the media.

Description:

Lies, misleading information, conspiracy theories, and propaganda have existed for as long as there have been people to create and spread them. What has changed the dynamic is the Internet, which makes producing and disseminating the collection of untruths that exist today much easier and faster. Stopping the creators and spreaders of untruths online will play an important role in reducing political polarisation, rebuilding trust in democratic institutions, promoting public health, and protecting other fundamental human rights. To do so, the multi-stakeholder Internet community must leverage the power of technology and people in smart and new ways. This event will explore the different types of untruths online – disinformation, misinformation, propaganda, contextual deception and satire – and innovative ways to reduce the negative effects they have on people and society. The panel’s discussion will focus on concrete and practical ways to fight untruths online, including through digital literacy initiatives, the innovative use of AI and other technologies, content moderation approaches, and innovative measurement approaches, among others. A background paper on “Disentangling untruths online” would support the conversation.

Keynote: Dr. Sander van der Linden, Cambridge University Lecturer in Psychology, Director of the Cambridge Social Decision-Making Lab and a Fellow of Churchill College, confirmed

Panellists:
Ms. Julie Inman Grant, Australia’s eSafety Commissioner, confirmed
Mr. Mark Uhrbach, Chief of Digital Economy Metrics, Statistics Canada, confirmed
Mr. Pablo Fernández, Executive Director, Chequeado, confirmed
Ms. Rehobot Ayalew, lead fact-checker at HaqCheck/Inform Africa and a media literacy and fact-checking trainer, confirmed

Expected Outcomes

The session will feed into the OECD's work on understanding and combatting untruths online (measurement and policy).

Hybrid Format: The event aims to be interactive, and full participation will be accessible to both onsite and online audiences. Online and onsite participants will be able to ask questions of the speakers and interact through a live “social wall”, technology permitting.

Online Participation

 

Usage of IGF Official Tool.

 

Key Takeaways

Pre-emptive actions (e.g. pre-bunking, digital literacy initiatives) are needed to protect people from the risks and harms of false and misleading content online.

There is no silver bullet to stop untruths online. A cocktail of approaches is needed (education, media literacy, resources including technologies like machine learning, collaboration among fact-checkers, and international collaboration).

Call to Action

Governments and the multistakeholder community need to pool resources (monetary, knowledge) to fight the creators and spreaders of untruths online.

Session Report

This workshop explored the different types of untruths online – disinformation, misinformation, propaganda, contextual deception, and satire – and innovative ways to reduce the negative effects they have on people and society. Molly Lesher opened the workshop and set the scene, focusing on the OECD’s work in this area, including the “Disentangling untruths online” Going Digital Toolkit note. She highlighted that while the dissemination of falsehoods is not new, the Internet has reshaped and amplified the ability to create and perpetuate content in ways that we are only just beginning to understand.

She recalled why access to accurate information is important, including in the context of the fundamental rights to freedom of speech, the right to choose leaders in free, fair, and regular elections, and the right to privacy, which are essential for healthy democratic societies. She highlighted that false, inaccurate, and misleading information often assumes different forms based on the context, source, intent, and purpose, and that it is important to distinguish between the various types of untrue information to help policymakers design well-targeted policies and facilitate measurement efforts to improve the evidence base in this important area.

Sander van der Linden then discussed his work studying false and misleading content online: how, from a psychological perspective, misinformation infects our minds, how it spreads across social networks, and the ways in which people can protect themselves and others from its negative effects. He discussed how people can build up “immunity” through “prebunking” – that is, by exposing them to a weakened dose of misinformation to enable them to identify and fend off its manipulative tactics.

Julie Inman Grant discussed the Australian eSafety Commissioner’s work on addressing the online risks and harms facing adults and children from the circulation of untruths online. She highlighted some practical approaches, programmes, and initiatives to address online harms, as well as differences in the impacts on and interventions for children versus adults and men versus women.

Rehobot Ayalew then gave remarks from the perspective of a seasoned fact checker who fights against untruths online daily. She underscored the importance of fact-checking, as well as its modalities and limits (e.g., scalability). She also highlighted the complications that non-anglophone countries face in combatting untruths online, and the mental health burden of having to research some of the more graphic and disturbing false and misleading content.

Pablo Fernández discussed Chequeado’s experience with the Chequeabot AI tool that facilitates fact-checking in Spanish. He discussed how to find a balance between human intervention and digital technologies in the fight against untruths online, as well as how Chequeado usefully co-operates with the global fact-checking community.

Mark Uhrbach then spoke about Statistics Canada’s efforts to measure misinformation to date, what surveys can help us measure, and why surveys alone cannot capture everything, making less traditional methods necessary to fill in the gaps.

The panellists took several rounds of questions from the audience onsite and online. A key point from the discussion is that governments and the multistakeholder community need to pool resources (monetary, knowledge) to fight the creators and spreaders of untruths online. Another important point is that pre-emptive actions (e.g., pre-bunking, digital literacy initiatives) are needed to protect people from the risks and harms of false and misleading content online.