IGF 2019 WS #85 Misinformation, Trust & Platform Responsibility

Organizer 1: Yik Chan Chin, Xi'an Jiaotong-Liverpool University
Organizer 2: Arthur Gwagwa, Centre for Intellectual Property and Information Technology Law, Strathmore University Kenya
Organizer 3: Jinhe Liu, Tsinghua University

Speaker 1: Minna Horowitz, Civil Society, Western European and Others Group (WEOG)
Speaker 2: Arthur Gwagwa, Technical Community, African Group
Speaker 3: Ansgar Koene, Civil Society, Western European and Others Group (WEOG)
Speaker 4: Shu Wang, Private Sector, Asia-Pacific Group
Speaker 5: Jingjing Xia, Private Sector, Asia-Pacific Group

Policy Question(s): 

What are the reasons for the proliferation of disinformation and fake news in different countries and regions?
What mechanisms are used to mitigate disinformation and fake news, and how effective are they?
What role should technology (e.g. algorithms) play in disinformation and fake news refutation?
What roles should Internet platforms play in disinformation and fake news refutation? What kinds of collaboration could be created between Internet platforms and media outlets to fight disinformation and fake news?
What are the best practices in terms of disinformation and fake news refutation in light of freedom of speech and the necessary neutrality and legal certainty for platforms?
How can trust in, and the accountability of, Internet platforms and government interventions be restored?

Relevance to Theme: The IGF community is considering the potential risks to the security and stability of the Internet, and how to achieve the safety and resilience of a healthy digital environment. The session will contribute to the discussions of fake news, trust, accountability, and freedom of expression under the theme “security, safety, stability and resilience”. It will address these issues by looking at online disinformation and fake news refutation from different stakeholders’ perspectives. Specifically, the workshop will discuss: 1) the responsibilities of Internet platforms and government regulators in fighting online fake news and disinformation; 2) the role of technology (such as AI and algorithms) in refuting fake news and disinformation; 3) how to hold Internet platforms and governments accountable; 4) how to restore public trust in Internet platforms, government, and the news media; and 5) how globally accepted standards and best practices can be developed. These topics make the panel directly relevant to the theme “security, safety, stability and resilience.”

Relevance to Internet Governance: The proposed session will discuss the timely issues of fake news and disinformation, information security and online safety, the responsibility and accountability of digital platforms, and the role of government regulation and of trust in platforms and government in Internet governance.
It will involve stakeholders from the private sector, civil society, and the technical community, drawn from both developed regions (the EU and US) and developing regions (China and Africa), to share their professional knowledge, experiences, best practices, and policy frameworks for refuting disinformation and fake news. The proposed session will facilitate the global debate and help shape the evolution of norms, principles, and best practices for mitigating online disinformation and fake news, as well as the model of Internet governance.


Panel - Auditorium - 90 Min

Description: The creation, dissemination and accumulation of information is one dimension of structural power. The vast majority of conflicts today are not fought by nation states and their armies; increasingly, they are fought not with conventional weapons but with words. A specific sort of weaponry—“fake news” and viral disinformation online—has been at the center of policy discussions, public debates, and academic analyses in recent years (Horowitz, 2019). Technologies, including digital platforms, that enable connection and participation can also be used to spread misinformation and fake news. In addition, what has been called the “emerging information arms race” (Posetti & Matthews, 2018, July 23) is plaguing mature and emerging democracies alike (Horowitz, 2019). A variety of approaches have been adopted in different regions and localities to fight disinformation and fake news, ranging from content intervention (fact-checking and filtering) and technical intervention (dedicated anti-rumor platforms, algorithms) to economic intervention (undermining advertising sources) and legal intervention (civil and criminal penalties). Different stakeholders, including state actors, NGOs, platforms, and news media, are involved. However:
How effective are these approaches, and what are the shared policy principles, norms and mechanisms?
What are the responsibilities of actors such as Internet platforms and government regulators?
What role does technology (e.g. algorithms and bots) play in the process?
How can we hold the actors accountable for their interventions?
How can we encourage cross-region and cross-sector collaboration?
What are the best practices in light of freedom of speech and the necessary neutrality and legal certainty for platforms?
How can we restore public trust in Internet platforms, the news media and politics?

In this session, speakers and moderators from China, the UK, Finland, and Africa will discuss the above questions from diverse geographic and stakeholder perspectives.

Dr. Minna Horowitz, Docent Professor, University of Helsinki; Expert in Digital Rights and Advocacy, Center for Media, Data, and Society, Central European University

Dr. Ansgar Koene, Chair of IEEE P7003 Standard for Algorithmic Bias Considerations working group; Senior Research Fellow, University of Nottingham, HORIZON Digital Economy

Ms. Jingjing Xia, The ByteDance Techkind Center, ByteDance Technology Co., China

Mr. Shu Wang, Deputy Editor, Sina Weibo, China

Mr. Arthur Gwagwa, Centre for Intellectual Property and ICT Law: Strathmore Law School, Kenya

Onsite Moderator: Dr. Yik Chan Chin, Xi'an Jiaotong-Liverpool University
Online Moderator: Mr. Jinhe Liu, Tsinghua University

Intended Panel Agenda:

Setting the scene: onsite moderator, Dr. Chin, 5 minutes

Five presentations; each speaker talks for 9 minutes, followed by 1 minute of immediate audience response
1) Minna Horowitz
2) Shu Wang
3) Ansgar Koene
4) Jingjing Xia
5) Arthur Gwagwa

Discussion amongst speakers, 10 minutes, moderated by Dr. Chin

Interactive question-and-answer session, 30 minutes, moderated by Dr. Chin and Mr. Liu

Wrap-up by the moderator, 5 minutes

Expected Outcomes: 1) Facilitate the debate and help shape the evolution of norms, principles, and best practices for online disinformation and fake news refutation, and the model of Internet governance.
2) Identify differing viewpoints on Internet governance approaches to AI, to help create an environment in which all stakeholders are able to prosper and thrive.
3) Produce policy recommendations and a key-messages report for the IGF community.
4) Establish a collaboration on fake news and disinformation refutation and research amongst speakers from different stakeholder sectors.

Onsite Moderator: 

Yik Chan Chin, Civil Society, Asia-Pacific Group

Online Moderator: 

Jinhe Liu, Civil Society, Asia-Pacific Group


Discussion Facilitation: 

The session will be opened by the onsite moderator, who will give participants an overview of the topics to be discussed, the professional backgrounds of the speakers, and the format of interaction. Each speaker will give a short presentation providing the audience with basic knowledge of their topic. The moderator will ensure that both onsite and online audiences are able to put questions to the speakers immediately after each presentation, to encourage active participation. In the third part, the session will move to discussion and debate: the moderator will invite each speaker to express their views on a set of questions and will guide the debate amongst speakers and the audience to foreground their common ground and differences. The workshop organizers and moderators will discuss the questions with the speakers in advance to ensure the quality and flow of the discussion. In the final part, the moderators will invite questions from the onsite audience and online participants; this question time will last about 30 minutes to allow sufficient interaction amongst speakers, audience, and online participants. Online participants will be given priority to speak, and their participation will be actively encouraged by the moderators. The onsite moderator will close by summarising the panel's findings, recommendations, and future actions.

Online Participation: 

The online moderator will take part in the training course for the Official Online Participation Platform provided by the IGF Secretariat's technical team, to ensure that the online participation tool is used properly and smoothly during the proposed session.


GOAL 3: Good Health and Well-Being
GOAL 9: Industry, Innovation and Infrastructure
GOAL 10: Reduced Inequalities
GOAL 16: Peace, Justice and Strong Institutions
GOAL 17: Partnerships for the Goals