Session description / objective
The reach of content published online is amplified through social media platforms at a speed never seen before. This has promoted democratic values by empowering individuals and giving a growing voice to those who have not been heard before. These platforms are a prime example of how Internet-powered innovation has enhanced the way people participate in society from an economic, social, political, and cultural perspective.
Alongside these positives, however, there are increasing risks associated with the proliferation of hate across social media platforms, including the amplification of violent extremism, which has affected how safe and secure people feel both online and offline. In light of events like the March 2019 Christchurch mosque attacks, there is a growing expectation for responses that quickly identify threats and take effective action to prevent and minimize the damage that the viral amplification of terrorist and violent extremist content online can cause. At the same time, such responses carry risks of their own, as different rights and freedoms come into play (such as freedom of expression and freedom of thought, conscience and religion).
Social media platforms have embarked on processes to develop their own community standards, incorporating feedback from their communities of users to deliver upgrades, new services and tools, and to decide what content and behavior is acceptable online and what is not. Industry forums have been formed to coordinate efforts and share best practices. Beyond these self-regulatory efforts, other approaches to these challenges include co-regulation, working with regulators, and more. However, it is not entirely clear whether these processes are strengthening the rule of law, following due process, and being sufficiently consultative, inclusive and open.
This main session will focus on different responsibilities, responses, rights and risks involved in policy approaches to dealing with terrorist and violent extremist content online. It will consider regulatory and non-regulatory approaches by social media platforms, as well as how such platforms address violent extremist content uploaded to their services by end users.
Policy Questions
The session will focus on four main areas of discussion: responsibilities, responses, rights and risks.
Responsibilities: A holistic approach to addressing terrorist and violent extremist content online requires engagement and action by different types of stakeholders.
- What are the different responsibilities of the different stakeholders to each other in developing solutions to the spread of terrorist and violent extremist content online?
Responses: Different governments have responded in different ways to terrorist and violent extremist content online, from creating criminal liability for executives through legislation to joining voluntary declarations, such as the Christchurch Call. Industry has developed collective responses, as well as platform-specific approaches to dealing with violent extremist content.
- How have governments responded to the spread of terrorist and violent extremist content online? How has the private sector responded?
Rights: Laws and policies that regulate or moderate Internet content raise questions about freedom of expression and other human rights. While international human rights law is only directly binding on states, under the UN Guiding Principles on Business and Human Rights, platforms have a corporate responsibility to respect human rights, including freedom of expression.
- What are the different human rights that are relevant to the discussion of terrorist and violent extremist content online, and why?
Risks: As mentioned in the Rights section, Internet content regulation raises freedom of expression and other human rights concerns. Regulation may also have technical impacts.
- What are the potential risks to different human rights posed by terrorist and violent extremist content regulation, and how are these risks being addressed?
Speakers
- Gerd Billen. State Secretary, Federal Ministry of Justice and Consumer Protection, Government of the Federal Republic of Germany. Germany.
Government. Male. WEOG.
- Dr. Sharri Clark. Senior Advisor for Cyber and Countering Violent Extremism, U.S. Department of State. United States.
Government. Female. WEOG.
- Paul Ash. Acting Director, National Security Policy Directorate, Department of Prime Minister and Cabinet. New Zealand.
Government. Male. WEOG.
- Courtney Gregoire. Chief Digital Safety Officer, Microsoft Corporation. United States.
Private Sector. Female. WEOG.
- Brian Fishman. Policy Director, Counterterrorism, Facebook. United States.
Private Sector. Male. WEOG.
- Eunpil Choi. Chief Research Fellow, Government Relations & Policy Affairs Team, Kakao Corp. Korea.
Private Sector. Female. Asia Pacific.
- Professor Kyung Sin Park. Korea University Law School. Korea.
Technical Community (Academia). Male. Asia Pacific.
- Yudhanjaya Wijeratne. Team Lead, Algorithms for Policy, LIRNEasia. Sri Lanka.
Civil Society. Male. Asia Pacific.
- Edison Lanza. Special Rapporteur on Freedom of Expression, Inter-American Commission on Human Rights. Uruguay.
Intergovernmental Organization. Male. GRULAC.
Moderators
- Moderator: Jordan Carter, InternetNZ, WEOG (New Zealand). Technical Community. Male.
- Remote participation moderator: MAG member Lucien Castex, WEOG (France). Civil Society. Male.
Agenda
11:15 – 11:35 Introduction and opening statements (20m)
- The Moderator explains the policy problem, provides an overview of the session agenda, and introduces the panellists.
- New Zealand PM Jacinda Ardern video address.
11:35 – 12:15 Responsibilities & Responses (40m)
Professor Park (Academia) will introduce this section of the agenda, focusing on responsibilities and responses in addressing Terrorist and Violent Extremist Content online. The panellists will discuss the following:
- What are the different responsibilities of the different stakeholders to each other in developing solutions to the problem of Terrorist and Violent Extremist Content online?
- How have governments responded to the spread of Terrorist and Violent Extremist Content online?
- What is the private sector doing to address Terrorist and Violent Extremist Content online?
- How has civil society responded?
12:15 – 12:40 Rights & Risks (25m)
Edison Lanza will introduce this part of the discussion, focusing on rights and risks associated with addressing Terrorist and Violent Extremist Content online.
12:40 – 13:00 Audience Interaction (20m)
The Moderator receives interventions from the floor and panellists respond.
13:00 – 13:15 Looking Forward (15m)
- What are your policy recommendations going forward for addressing Terrorist and Violent Extremist Content online?
- What role can the IGF ecosystem play in making progress on this Internet policy issue?
- Concluding remarks by the Moderator.
Plan for in-room participant engagement/interaction?
The moderator will explain how the audience can use the Speaking Queue for the Q&A, both on site and for remote participation, and will request that the audience use the #contentgov and #IGF2019 hashtags.
The moderator will encourage discussion among the speakers and take questions from the audience between the different sections of the agenda. Questions from the Policy Questions list may be used to support the discussion. The floor will be opened for questions from the audience as well as from remote participants. The moderator will engage with the audience and encourage them to ask questions, managing the flow of the discussion.
Remote moderator/Plan for online interaction?
There will be a remote moderator who will encourage remote participants to ask questions and make comments, and who will assist in integrating the online conversation with the discussion in the room. The plan for the session will be posted prior to the event, and questions and comments will be gathered to enrich the on-site participation. The organizers will liaise with the IGF Secretariat to engage remote hubs to gather input prior to the event, in case the real-time options prove too difficult to handle.
Connections with other sessions?
From the 2019 program, the following sessions will be tackling aspects of terrorist and violent extremist content online:
The session will also be linked to the Main Session on Technical and Operational Issues organized at IGF 2018, which focused on content blocking and filtering. That session covered the importance of definitions, due process, and technical implications (three or four of the policy questions listed here were covered in that session to a certain extent). Proposals & Report: https://www.intgovforum.org/multilingual/content/igf-2018-technical-operational-topics
From the 2017 IGF, the session "A Net of Rights: Human Rights Impact Assessments for the Future of the Internet" also covered similar concerns.
Desired results/output? Possible next steps for the work?
This main session aims to approach the issue of Terrorist and Violent Extremist Content online by addressing the different responsibilities, responses, rights and risks that governments, the private sector and civil society have to consider when designing and implementing policy approaches. The session will help identify the approaches and challenges stakeholders face when developing responses that not only solve the problem but also strengthen the rule of law, follow due process, and are sufficiently consultative, inclusive and open. The main session will produce a session report documenting the contributions from the panellists, as well as the input from the audience, that could serve both as inspiration for the development of more holistic and integrated policy frameworks at national and regional levels, and as a contribution to strengthening the multi-disciplinary approach within the IGF. It is anticipated that a series of blog articles will be published following the session, from a variety of perspectives, to raise awareness among stakeholders about the challenges of addressing Terrorist and Violent Extremist Content online.
Co-organizers (MAG members)
- Sylvia Cadena. APNIC Foundation.
GRULAC & WEOG (Colombia, Australia), Technical Community, Female
- Susan Chalmers. NTIA.
WEOG (United States), Government, Female
- Jutta Croll. Stiftung Digitale Chancen.
WEOG (Germany), Civil Society, Female