IGF 2018 OF #31 Assessing hate speech and self-regulation, who and how?

Salle VII

Freedom of expression extends to ideas that offend, shock or disturb. The aim of anti-discrimination and hate speech policies is to prevent the incitement or justification of racially based hatred and violence, not to suffocate controversial political opinions or debate. General Policy Recommendation No. 15 on combatting hate speech of the European Commission against Racism and Intolerance (ECRI) recommends a coherent and comprehensive approach to combatting hate speech, covering legal and administrative measures; self-regulatory mechanisms; effective monitoring; victim support; and awareness-raising and educational measures.

The “Network Enforcement Act” (19/13013), adopted in Germany on 30 June 2017, requires Internet providers to assess and remove hate speech content within 24 hours of its being reported. Review of complex cases may take one week and can be referred to an independent self-regulation body. It is the first example of a national authority enforcing legislation on combating illegal hate speech online, and many of its modalities are still being shaped. Against the backdrop of the EU ‘Code of Conduct’ and the German Network Enforcement Act, the question of how to set up self-regulatory bodies that can assess complex reports of hate speech and decide on appropriate actions is becoming essential. Their assessments must take into account European and national regulations and give due regard to human rights principles and Council of Europe standards. Recommendations 6 and 7 of ECRI’s GPR No. 15 on Combatting Hate Speech provide general principles for a self-regulatory body, which should adopt a comprehensive code of conduct that can be enforced; be transparent and known; include monitoring and complaints mechanisms with the possibility of appeal; and ensure sufficient training of staff. A multi-stakeholder approach often strengthens such regulatory bodies.
This open session invites relevant stakeholders to an exploratory dialogue reviewing blueprints for self- or co-regulatory bodies to assess reports of hate speech online, covering:

  • Expected deliverables of a self- or co-regulatory body for Internet businesses
  • Mode of operation
  • Roles and responsibilities of the stakeholders involved
  • Context needed for it to function, e.g. legal status and safeguards, its independence, and legal oversight.

The session welcomes representatives of legislators, law enforcement agencies, self-regulatory bodies, Internet and social media companies, civil society organisations, ECRI, the CoE secretariat and other international organisations.


Council of Europe - No Hate Speech Movement


Jeremy McBride – Consultant for European Commission against Racism and Intolerance on ‘General Policy Recommendation No 15 Combatting Hate Speech’

Miriam Estrin – Policy Manager for Europe, Middle East, and Africa at Google

Anton Battesti - Head of Policy at Facebook France

Tamas Dombos – Board member Háttér Society, LGBT rights organisation Hungary, member of EC group for monitoring of Code of Conduct implementation

Online Moderator

Coordinator No Hate Speech Movement Romania

Session Report (* deadline 26 October)

- Session Type (Workshop, Open Forum, etc.): Open Forum

- Title: #31 Assessing hate speech and self-regulation, who and how?

- Date & Time: Wednesday 14 November 2018, 9.00-10.00

- Organizer(s): Council of Europe – No Hate Speech Movement

- Chair/Moderator: Menno Ettema – Council of Europe - Anti-Discrimination Department

- Rapporteur/Notetaker: Elisabeth Schauermann (Notetaker)  / Veronica Stefan (online moderator)

- List of speakers and their institutional affiliations (Indicate male/female/ transgender male/ transgender female/gender variant/prefer not to answer):

  • Jeremy McBride – Consultant for European Commission against Racism and Intolerance on ‘General Policy Recommendation No 15 Combatting Hate Speech’
  • Miriam Estrin – Policy Manager for Europe, Middle East, and Africa at Google
  • Anton Battesti - Head of Policy at Facebook France
  • Tamas Dombos – Board member Háttér Society, LGBT rights organisation Hungary, member of EC group for monitoring of Code of Conduct implementation

- Theme (as listed here): Human Rights, Gender and Youth

- Subtheme (as listed here): Freedom of Expression online

- Please state no more than three (3) key messages of the discussion.

  • Self-regulation as a tool to assess complaints of hate speech, as an expression of discrimination, is new for the Internet industry; it has been pushed for in recent policy instruments such as the German NetzDG and the EU Code of Conduct.
  • Self-regulation should address the concerns and needs of users, Internet businesses and democratic society. It should therefore provide effective protection of those targeted; quick procedures; a clear understanding for users of the reasons for take-down or non-take-down; clarity on the liability of the company vis-à-vis self-regulatory decisions, on the relation between the self-regulating body and law enforcement/regulators, and on the balance between the liability of those producing and those hosting content. A democratic society calls for independent (judicial) oversight, transparency and an appeal procedure.
  • More reflection and a study can help identify promising practices that uphold a fair balance between rights and needs of users and a democratic society as a whole.

- Please elaborate on the discussion held, specifically on areas of agreement and divergence.

Asked what self-regulation on hate speech should deliver, participants mentioned: balancing freedom of expression with protection of dignity and privacy; clarity of definitions and sensitivity to national/regional contexts; and a concerted approach rather than a self-regulatory body per platform. There was fear that companies would concede to governmental pressures, questions about company-, government- or user-based regulation models, and a realization that users need to (learn to) report hate speech for it to be effective.

The needs regarding self-regulation of users affected by hate speech can be summarized as: quick; transparent (both in procedure and in the argumentation for decisions); and accessible (meaning a clear reporting system and no financial barriers).

Internet businesses strive to have open platforms, clear user guidelines and notice systems. Facebook France’s invitation to French regulators to review its assessment system underlines this and aids transparency. Self-regulation can complement internal assessment processes to address complex cases; this would require access to topic experts, time to make the assessment, and clarity on national legislation and on the company’s liabilities when implementing decisions of a self-regulatory body.

It was underlined that self-regulation is only one of the tools to address hate speech, as outlined in ECRI GPR No. 15 on Combatting Hate Speech. It can be quicker and less costly, but should not substitute for or block the possibility of starting court proceedings.

- Please describe any policy recommendations or suggestions regarding the way forward/potential next steps.

Taking the EU Code of Conduct and the German Network Enforcement Act as examples, it was pointed out that self-regulatory bodies and platforms alike need to be transparent about their procedures, their decision-making processes and their outcomes.

From the platforms’ side, it was suggested that state authorities, as representatives of their populations, should work with platforms and self-regulatory bodies on achieving legitimacy and protecting users’ rights.

The panel and participants voiced commitment to reconvene at the next IGF in order to follow up on progress and challenges with self-regulation in combating hate speech online.

- What ideas surfaced in the discussion with respect to how the IGF ecosystem might make progress on this issue?

If self-regulation and self-regulatory bodies are suggested as a new way of dealing with online hate speech, the IGF community offers crucial resources for gathering expertise from all stakeholder groups, so that platforms, intermediaries, regulators and new bodies can take informed decisions that protect individual user rights and democratic society.

The IGF can work as a forum for monitoring progress and challenges that arise with applying models of self-regulation and co-regulation.

- Please estimate the total number of participants.

Circa 30 participants + 4 panelists, 1 moderator, 1 remote participation moderator, 1 rapporteur

- Please estimate the total number of women and gender-variant individuals present.

Circa 15

- To what extent did the session discuss gender issues, and if to any extent, what was the discussion?

There was no direct discussion of gender issues regarding women*, but one of the panelists offered a civil society perspective on the specific problems that members of the LGBTQ+ community face with regard to hate speech and harassment online, and participants were able and encouraged to speak from their personal experiences.