IGF 2018 WS #341
To regulate or not to regulate, that is the question

Subtheme
Issue(s)

Other
Sub-theme description: Online content regulation

Organizer 1: Richard Wingfield, Global Partners Digital
Organizer 2: Charles Bradley, Global Partners Digital

Speaker 1: Jason Pielemeier, Civil Society, Western European and Others Group (WEOG)
Speaker 2: Emma Llanso, Civil Society, Western European and Others Group (WEOG)
Speaker 3: Grace Mutung'u, Civil Society, African Group

Moderator

Charles Bradley (Chair, Global Partners Digital)

Online Moderator

Sheetal Kumar (Global Partners Digital)

Rapporteur

Hector Selby (Global Partners Digital)

Format

Debate - 60 Min

Interventions

Each of the speakers will make a five-minute speech for or against the motion. The primary proposer and opponent of the motion will also make a three-minute summation speech at the end of the session, summarising their key points and points raised by the floor. The four speakers are all experts on the issue of online platforms and freedom of expression.

In addition, around 30 minutes of the session will be focused on taking questions and comments from the floor relating to the motion. Speakers will be asked to make short interventions in order to maximise the number of points and perspectives heard.

Diversity

The four speakers represent diversity in terms of gender, geography and stakeholder group. By having two opposing perspectives presented, there will also be a clear diversity in policy position.

This session will be a debate on the motion, “This house believes that online platforms should be independently regulated”. Two sides, each comprising two speakers, will present two five-minute statements either in favour of or against the motion. In doing so, they will make arguments as to why independent regulation of online platforms of some kind would, or would not, be beneficial from a freedom of expression perspective. By remarking on the existing situation, as well as proposals for self-regulation, co-regulation or some kind of independent regulation that have been developed by governments, platforms and civil society, the session will explore the risks and opportunities to freedom of expression, as well as bring together different perspectives on the issue.

After all four debaters have spoken, questions and comments will be taken from the floor before a final, three-minute summation from each side. A pre-debate and post-debate vote will be conducted, with audience members casting a vote on the motion that is either for, against or undecided.

Agenda

The debate will be facilitated by the Chair of the debate who will introduce the format and manage the allocated time among the two sides (proponents and opponents, composed of two speakers each).

3 minutes: Chair introduction (the Chair introduces the speakers and the format, and provides brief context for the debate). The Chair will then pose the motion and take a pre-debate vote from the floor.

5 minutes: Proposer

5 minutes: Opposer

5 minutes: Second proposer

5 minutes: Second opposer

25 to 30 minutes: The debate will be opened to the floor, in which members of the audience will be able to put questions or comments to the sides. Questions must be addressed to all speakers, and speakers who wish to respond will have a maximum of 45 seconds to do so.

3 minutes: Proposer summation

3 minutes: Opposer summation

1 minute: Chair will take a post-debate vote from the floor and present a closing statement

As noted above, the debate will be facilitated by the Chair, who will introduce the format and manage the allocated time between the two teams (proposition and opposition). After all four debaters have spoken, the debate will be opened to the floor, in which members of the audience will put questions to the teams and make comments. After the floor debate, one speaker from each team will present a three-minute summation. A pre-debate and post-debate floor poll will be conducted, with audience members casting a vote on the motion that is either for, against or undecided. This format has been deliberately chosen to facilitate lively interaction among speakers and the audience (both remote and in-person).

The session will focus on the specific question of whether there should be independent regulation or oversight of online platforms for their content moderation policies. The issue of the role of online platforms, and their liability for online content, has been addressed in previous IGF sessions particularly session OF37 (Council of Europe - Internet intermediaries: shared commitments and corporate responsibility) and the meetings and work of the Dynamic Coalition on Platform Responsibility.

While freedom of expression online is a longstanding theme, the issue of whether there should be specific independent regulation or oversight of online platforms for how they moderate online content has come to the forefront of debates. In the last twelve months, new liability regimes have been established (such as the NetzDG in Germany), other proposals for regulation have been put forward (such as codes of conduct and mandatory transparency reporting in the United Kingdom) and suggestions for independent oversight of platforms have been mooted by, among others, Article 19, Global Partners Digital and the UN Special Rapporteur on Freedom of Expression.

There has been criticism in some quarters that the current terms of service adopted and implemented by platforms are inconsistent with human rights standards, with vague categories of content being prohibited, and seemingly arbitrary and discriminatory application of content moderation rules in practice, amounting to adverse impacts upon individuals’ freedom of expression online. Such arguments would suggest that independent regulation could ensure rights-respecting standards and decision-making. Others have taken the opposite viewpoint, raising concerns that greater regulation of online content – particularly if undertaken by state actors in countries with high levels of censorship – could lead to greater restrictions on freedom of expression. Even increasing the liability of platforms for unlawful and harmful content, as via the NetzDG in Germany, can incentivise the removal of questionable content which may be perfectly lawful.

By facilitating an open, full and frank debate about the pros and cons of different models of self-regulation and regulation, this session aims to help policymakers and decision-makers consider the impact on freedom of expression when such models are considered at the national level.

Online Participation

The online moderator will ensure that remote participants are able to communicate questions and comments to the Chair (who will be equipped with a laptop) during the floor debate. There will also be live-tweeting from the venue of the session and remote participants will be encouraged to join the discussion and to make comments via Twitter. Promotion of the event will happen in advance and will include information about ways in which individuals can participate remotely.