The following are the outputs of the real-time captioning taken during the virtual Fifteenth Annual Meeting of the Internet Governance Forum (IGF), from 2 to 17 November 2020. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record.
***
>> MENNO ETTEMA: Good morning, good afternoon, or good evening, everyone, depending on where you are in the world. It's great to see such a big group already, 48 and counting, I see, joining this session of IGF 2020 online with the challenging title "Everything You Wanted to Ask About Hate Speech But Didn't."
This is a one‑hour session where we are trying to address different aspects of a comprehensive approach to combatting hate speech, drawing on experiences gained primarily across Europe, as the speakers are from a European context. The IGF is a global forum, though, and I hope we will bring in promising practices from other parts of the world and have an interesting exchange on challenges, but also, I think, very much on opportunities to improve our approaches to combatting hate speech within the human rights framework. That's the setting for this discussion.
To recall, this discussion is livestreamed in plenary on the IGF YouTube channel and is recorded so that afterwards people can review the session. It's a one‑hour session. We will have a short introduction in the coming minutes to frame the session debate, and then we will move into breakout groups.
Participants are free to choose their breakout groups. I will come back in a minute to how that works. We will be discussing three main areas: the area of preventing hate speech and what can be done there; the area of self‑ and co‑regulatory approaches around content regulation; and the use of criminal and administrative legislation that covers hate speech and the challenges there might be in implementing it in the online space.
I'm joined by an esteemed panel of speakers who will be speaking in their own breakout groups. The breakout groups are not captioned, not broadcast live and not recorded, so I hope they offer a safe space for everyone to actively engage in the discussion, and I count on our note takers to take notes so that we have a very strong report afterwards.
With that, I want to, again, welcome the 57 participants that we have right now, and I want to pass the floor to Bastiaan Winkel, who is the vice chair of the committee of experts on combatting hate speech of the Council of Europe. He is also working in the Netherlands as coordinating policy advisor on law enforcement and combatting crime within the Ministry of Justice and Security.
Bastiaan, can you unmute yourself and give the frame of this session?
>> BASTIAAN WINKEL: Yes, Menno, my pleasure. Thank you very much, Menno, for this introduction and for organizing this session on this important issue, and to all participants a very warm welcome. I'm very glad that you have an interest in following this session and in participating in the discussion on this matter. A wide range of policy and practice initiatives have been launched in past years at international and national levels to address the risks online hate speech poses to human rights in democratic societies.
The term "hate speech" has been understood in different ways at the national and international level and within different contexts. With a bit of freedom of interpretation, we could roughly identify three ways the term is used. First, hate speech that is illegal, in line with international standards. Second, hate speech that is not illegal, but harmful to specific groups and individuals based on protected characteristics. And third, hate speech that is not harmful to a specific group, but undesirable in a democratic society.
This can be seen as a three‑layered approach, as outlined in the UN plan of action.
Freedom of opinion and expression is a fundamental human right, essential to the functioning of democratic societies and the human rights system. It is anchored in Article 19 of the Universal Declaration of Human Rights and, for Europe, in the European Convention on Human Rights. The European Court of Human Rights has developed a rich case law on hate speech and incitement to violence. In doing so, it has to balance competing considerations.
On the one hand, the need to preserve free speech, a cornerstone of democratic societies, and on the other hand, the need to protect individuals' personality rights, the prohibition of discrimination, mutual respect and understanding within society or public order.
In this respect, the court has noted that the failure to provide redress for insulting expression could entail a violation of the positive obligation under Article 8 to secure effective respect for the right to private life.
Taking into consideration the human rights framework and the negative and positive obligations of states, it becomes evident that different categories of hate speech require different, yet complementary, approaches to address hate speech and its impact on those targeted and on wider society.
The Council of Europe promotes a comprehensive approach to combatting hate speech. In the session today, we cover three different types of approaches where, in recent years, a lot of experience has been gained, while equally challenges and questions remain.
First, preventative measures, for example through education and the use of counter speech, to raise awareness of the risks posed to democracy and human rights.
Second, self and co‑regulatory approaches to content moderation and online governance.
And third, implementation of national criminal and administrative legislation covering hate speech in the online environment.
Now, the Committee of Ministers of the Council of Europe has tasked a committee of experts to draft a new recommendation on a comprehensive approach to combatting hate speech, including in the context of the online environment, within the human rights framework of the European Convention on Human Rights and drawing upon the legacy of the No Hate Speech Movement youth campaign, as well as to develop possible practical tools to give guidance to Member States and other stakeholders in this area.
The work of this committee has just started, will continue throughout 2021 and will include a public consultation in the summer of 2021. I believe this session is an opportunity to discuss, among the many stakeholders present, the challenges, opportunities, and promising practices that can develop your and our approaches to addressing hate speech and promoting human rights online.
That is my introduction; I give the floor back to Menno.
>> MENNO ETTEMA: Thank you, Bastiaan, for giving the frame, introducing us to the Council of Europe's comprehensive approach and reminding us of the human rights framework; I think that is important. As mentioned by Bastiaan, in this session we want to zoom into three areas. A comprehensive approach is, of course, very wide, and I would also recommend looking at some of the Council of Europe's documents on this, in particular the committee's background notes, which give a nice overview of the comprehensive approach; I will share the link in a minute.
But for this session, we wanted to look into three areas specifically that have developed lately in Europe, and to discuss them further in breakout groups, where the various speakers will introduce a little bit more about the developments, issues and challenges in those areas and open the floor for further discussion.
I am posting the various breakout groups right now. To recall, the breakout rooms are not recorded or broadcast live, and there will be feedback by the facilitators and the note takers. I want to introduce the speakers and thank them already now for all the work they have done in preparing this.
This is also a good moment for everyone to follow along and decide which group they want to go to. The first breakout group is on prevention measures to address hate speech, and it will look into promising practices on using preventative tools such as awareness raising. The speakers are Martin Mlynar, youth member of the No Hate Speech Movement, and Albin Dearing, who works at the EU Fundamental Rights Agency; he will be joined by Irina Drexler, who works at the Council of Europe Unit on Cybercrime in Bucharest. That's Working Group one.
The second one will look into self‑ and co‑regulatory approaches, and the speakers here are Sejal Parmar, lecturer at the School of Law, University of Sheffield, and Alexander Schaefer, head of division for telecommunications and media law at the Federal Ministry of Justice and Consumer Protection in Germany.
The third Working Group is on prosecution, and particularly on using national criminal and administrative legislation covering hate speech in the online environment. It will be joined by Bastiaan Winkel, who is working as coordinating policy advisor on law enforcement and combatting crime at the Ministry of Justice and Security, and Alexandra Laffitte, vice chair of EuroISPA. I will be note taking in that session, and in the second session the note taker will be Giulia, who is the co‑Secretariat of the expert committee on combatting hate speech.
If I'm not mistaken, you can choose your own breakout session. The IGF session hosts will open the breakout sessions in one minute. You should see a button at the bottom of the Zoom window where you can choose your group: Working Group one on prevention, Working Group two on self‑regulatory and co‑regulatory measures, and group three on prosecution, which is on national law and how to implement it online.
The session will last half an hour and then we will come back for plenary feedback and further discussion in plenary.
I see that we should be looking for a More button or a breakout rooms button at the bottom. And to be honest, I can't see it myself.
>> I don't have the button either. I can't see it.
>> I'm sorry, Menno, to jump in. This is Luis. At the bottom of your screen, you should see a button that says breakout rooms, or sometimes there's no space and in that case there are three dots for More, and then there's an option to join a breakout room.
>> I have none. I don't have breakout sessions button.
>> Yeah, I'm seeing several people have already joined the breakout rooms. So in one of the rooms, there's Alex and Daphne Stevens. Okay?
Yes, it says ‑‑
>> Just to confirm, if you are using your phone, yes, okay. Continue, please.
>> MENNO ETTEMA: Luis, could you please ‑‑
>> I was muted. I'm sorry. I'm saying, for the people that cannot find the button, you the host should be able to assign people. I mean, there are many people that have already joined the rooms.
Especially in protection and prosecution; in prosecution there's Alexander, Amal, and Daphne. So for the rest of the people, you, the host, can assign them to the groups, maybe following the instructions given by Menno. And you, the cohost, Menno: if you click on join breakout rooms, you can see all the people that have joined.
>> MENNO ETTEMA: I can't see it myself either. Maybe we are suffering from the fact that I'm working on a Macintosh, an Apple computer.
>> If you just touch the screen. Okay. If you use the screen option, just put your cursor on the screen and you will be able to see it. I think on your top left: breakout rooms. Breakout rooms.
Yes?
>> Hi, sorry. I think for many people, it says please wait to be assigned.
There are no options to actually select, and it could be actually as Menno said in relation to a Mac being used. In that case, are we able to provide our preference in the group chat and be assigned?
>> Yes.
>> Yes.
>> Yes, please write in the chat and we will assign the people that are not able to join. I mean, many people have already joined the rooms.
>> It's probably the issue with Macs. I have the same problem.
>> MENNO ETTEMA: Okay. So for everyone, this is going to be a little bit chaotic. If you could please send your preferred Working Group in the chat, then the IGF host will have to assign us individually. This will take a few minutes.
>> So can you see them, Menno, getting group three.
>> Okay. Let me just do that.
Just a second.
Assign.
>> Okay. I'm the host. I will assign the people now. Menno, for example, I'm going to transfer you now to room three, okay?
>> MENNO ETTEMA: Thank you.
>> Please keep your comments coming in the chat section. We have the three groups: groups one, two, and three. Thanks.
>> So we are assigning at the moment. You should be assigned shortly, okay? Most of the people have been assigned already.
>> Yes, I can confirm that even in group two there are around 20 people.
Okay. We have groups one, two, and three. Group one is prevention, measures to address hate speech. Group two is protection, self and co‑regulatory approaches. And in group three, we have the issue of national criminal and administrative legislation. Please keep them coming.
Feel free to join any group. Okay. Thank you. Thank you so much for your cooperation.
(Breakout groups).
>> Thanks, Patrick.
>> Okay. Is there anyone who still needs to be assigned to a group?
I'm seeing Steven and Ivana are still unassigned.
Alexandra and Maritta, please write in the chat. The rest of the people have been assigned to groups.
So if no one says anything else, let's keep it like it is now.
>> MENNO ETTEMA: Hello, everyone. Yes, I see breakout group two is back. Breakout group three is back. Sorry for taking one extra minute.
>> That's all right.
>> I just want to close the groups, is that okay?
>> MENNO ETTEMA: Yes, they are.
>> They should be joining shortly all of them.
>> MENNO ETTEMA: Mm‑hmm.
>> They need to be back. In the next few seconds, they will be back.
>> MENNO ETTEMA: Okay, everyone, welcome back to the plenary. We are joined also by group one. Welcome back. I hope it was an enriching half an hour, or just a little bit over half an hour. I think our group three couldn't quite manage to wrap up, so we were one minute late, and I think group one had the same problem because you are two or three minutes late. I compliment everyone on participating in the discussion.
We have ten minutes left, and for purposes of recording, I think it's good to hear from the three different breakout groups, and if we have a few more minutes we can discuss the overall themes. I encourage people to use the chat, so if there are follow‑up questions or if you want to share data or links to reports, please use the chat effectively.
Working Group one was on prevention, and I would like to give the floor to the rapporteur for group one, please.
>> MARTIN MLYNAR: Okay. Thank you, Menno. In group one, we had really fruitful discussions. Firstly, I introduced my points on encouragement, education and the resilience that matters with regard to hate speech.
Then Albin introduced his points on Anti‑Semitism and the importance of media literacy and education, even in times of the COVID‑19 crisis and its impact on societies.
Then we got to a few summary points. Carine suggested that pointing to great examples is important in the peer network where the hate speech is coming from, and also not only promoting and showing great examples but giving them a voice by pointing to best practices and awarding them; in Germany, for instance, they have the Smart Hero Award. A great point about polarization was also raised: nowadays everything is quite polarized, people are drifting apart, and this results in further fragmentation into smaller and marginal groups, and it's important that we try to build bridges between those groups instead of losing them.
And it happens that we lose people from social media. We lose them to smaller marginalized groups, and it's harder to combat those groups once we lose them.
And one great point was highlighted by Albin that we have to make people aware that what they do online matters and that they bear responsibility for that.
But that's a short summary. So if I should name two or three points: education, building bridges and promoting good examples are the most important ones.
>> MENNO ETTEMA: Thank you very much, Martin. It was clearly a rich debate, and I think someone already commented that the Working Group time was too short; it had only just started to heat up.
>> MARTIN MLYNAR: It was.
>> MENNO ETTEMA: So let's continue this in other settings. For the second Working Group, which was on self and co‑regulatory mechanisms, can I give the rapporteur the floor?
>> SEJAL PARMAR: Thank you very much. There has been an impulse to regulate in Europe over recent years in regard to various types of speech, and I noted that this impulse, or the products of this impulse, have been criticized because of the risks they pose to human rights and freedom of expression. At the outset, I highlighted a couple of specific proposals that have emerged in policy discussions, namely the proposal for social media councils and a system of e‑courts, which could offer independent oversight of content moderation decisions.
Through the discussion, we looked at the German example and the implementation of the Network Enforcement Act there, and the improvements that have been made. But from the floor there were also a number of concerns put forward about the general way in which regulation is impacting upon content.
Some of the problems that were identified include how states have been trying to find solutions on the basis of platforms' terms of service rather than legislation; the example given there was the NetzDG. And often there is a missing link with the rule of law and rule of law institutions, particularly courts and others such as prosecutors, something that the recent and rather embryonic proposal of e‑courts could potentially address.
There was also a great contribution from a representative of Google, who reflected on how Google has been taking its responsibilities concerning human rights seriously and gave some really interesting information concerning takedowns by the company of, for example, content that violates its hate speech policies.
Apparently 151 million comments were taken down in one quarter of this year, and for the private sector, at least for Google, it is a multi‑sector effort to address these issues.
The reflections by the Google representative also indicated how this focus on larger companies needs to be broadened out to think about other, smaller platforms, but also raised the question of whether, through automated systems, it is possible to take down all the content that should be taken down, including from a human rights perspective, and how machines can do this in an appropriate way.
There was some skepticism raised about self‑regulation and also co‑regulation, because the actors we are talking about are businesses, and they are always going to be self‑interested and do the minimum possible to meet the legal standards necessary.
And ultimately, that this is an issue calling for some kind of actual regulation and proper accountability for these organizations under some kind of legal framework.
On the other hand, there was a sense of hope in relation to co‑regulation which was conveyed by another speaker who indicated or suggested that this co‑regulatory approach has to be industry‑wide and has to encompass different companies, large and small, and the key is to look at various business models which essentially monetize incitement and hate speech.
I think there was a mixture of skepticism and hope about various regulatory approaches, and there was an interest in new and emerging proposals for social media councils and e‑courts; I think that's definitely a space to watch, and everyone in the discussion was interested in that.
And finally, there were these long‑standing questions about the extent to which online platforms can practically address hate speech, given the sheer volume of it, and how context can be properly taken into account in their policies and also their practices, in line with international human rights law.
Thanks very much.
>> MENNO ETTEMA: Thank you, Sejal. It's clear that group two had quite a condensed discussion with many different points raised. Bastiaan, I understand you are reporting from Working Group three on prosecution and the role of national legislation in addressing online hate speech. Can I give the floor to you? Thank you.
>> BASTIAAN WINKEL: Yes, thank you very much, Menno. When I hear the summaries from the other rapporteurs, it's always a pity that you have to choose between the sessions and cannot join them all. But I think at least that we should continue our discussion after this session as well, for example within the framework of drafting the recommendation for the Council of Europe.
So on the point of law enforcement, I think the discussion we had was quite interesting because we approached the topic from different angles.
First, the angle of the ISPs, who are telling us that clearly things are happening on the Internet that need some law enforcement. We cannot let it go; something has to happen. But governments, please be clear on what rules we should apply. That was a very strong message from Alexander, from the ISPs: of course we want to cooperate with governments and law enforcement, but that is almost impossible if all individual countries have their own regulations and their own definitions. We really should work towards more European, or perhaps worldwide, definitions so that it is clear for everyone what is allowed and what is not allowed on the Internet.
The other point of view, brought in by some of the people who participated in the discussion, was the angle of the users. We discussed the issue of whether there should be some kind of obligation to make clear who you are talking with, and how to deal with being on the Internet anonymously.
Of course, anonymity touches on human rights, but on the other hand, for law enforcement it is a very difficult issue, and in some cases you do not want people to be anonymous.
And the third perspective was the perspective of the state. Where we first thought the Internet was something the state should not intervene in, we are now confronted with illegal activities, with harmful activities and also with unwanted, undesirable activities on the Internet, and in all countries we really have the discussion going on of where the point is at which government should intervene.
And all of these different issues we are facing need to be addressed in a different way, with different measures.
And I think what was very positive from our discussion is that we all shared this sense of the questions that are on the table. We did not have the time yet to answer them all, but as I concluded in the session as well, I really think we are on the right track to address all of these issues and hopefully come up with a very good recommendation for the Council of Europe.
Perhaps Alexandra and Menno, you have something to add to this short summary, some points I forgot?
>> ALEXANDRA LAFFITTE: I think you summed it up perfectly. We came in with questions. We came up with more questions and I'm excited for the work that's ahead. So thank you.
>> BASTIAAN WINKEL: Okay.
>> MENNO ETTEMA: Thank you very much, Bastiaan and Alexandra; I second what you are saying. We are one minute over time, but the IGF gave us five minutes to wrap up, which is very good. I take that opportunity to inform everyone that notes have been taken in all the breakout groups, and together with the three note takers we will draft a report. The IGF asks us to draft a short report within 24 hours, so we will do that. But if you allow us a week more, we will draft a more detailed report on the various points that were raised, and we will share the chat so that any links shared there can be accessed again in the future.
I also take the opportunity to thank the IGF for hosting us and for dealing with the technical challenges, specifically with the breakout groups and the issues that using different platforms, like Apple or Windows, might have caused, and to thank the team that does the captioning of the session.
So the report will be online at this workshop session on the IGF platform.
And finally, I ask everyone to give the IGF feedback. After this session, you will be guided to a feedback form, and if you are not automatically guided, you can find the feedback form in your personal schedule and give feedback there.
As Bastiaan and others were saying, there is a lot of discussion ongoing in many different areas, and I think this shows the need for a comprehensive approach. Hate speech is a very complex phenomenon. We need to really ensure that human rights standards and the human rights framework are protected, and therefore we need a fairly tailor‑made approach for everyone concerned, taking into account national, local and personal contexts; this needs a very differentiated approach and needs our thinking. I'm very happy that we had the time today to discuss this and share practices, and I'm sure there are many forums where we can continue the discussion and move forward, including through the Council of Europe's expert committee and other forums. So please join us. I shared the links online.
Thank you, everyone. Thank you, speakers and see you in the next forum.
>> Please fill out the feedback form. Thank you so much and have a great morning and afternoon. Thank you.
>> Thank you.
>> MENNO ETTEMA: Bye everyone. Thank you.