
The following are the outputs of the real-time captioning taken during the Tenth Annual Meeting of the Internet Governance Forum (IGF) in João Pessoa, Brazil, from 10 to 13 November 2015. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 

***

 

>> AMALIA TOLEDO‑HERNANDEZ:  Good morning, everyone. 

My name is Amalia Toledo‑Hernandez.  I'm with Karisma, a Colombian NGO that works on the intersection between human rights and technology.  I'm glad to be here and to see that my room and session is full.  This is a session organized ‑‑ co‑organized with ARTICLE 19 Brazil.  So thank you very much for being here, for being interested in this discussion. 

We are going to have a very dynamic ‑‑ or we expect to have a very dynamic session, so we're going to first have short presentations from the panelists on my side, and then we're going to break up into groups. 

So listen carefully to what they have to share with us so you can then choose with whom you want to continue the conversation; and then we're going to come up with some ideas, recommendations on how to cope with hate speech.  So this is more or less the methodology of this session. 

It is clear ‑‑ the topic of this session is hate speech and discriminatory speech and freedom of expression.  We all know that racism, sexism, transphobia, and homophobia have been around for a long time.

But the Internet, which has amplified the voices of women and LGBT people, is far from a free zone for them; it is also a space for those who would attack them.

There is no doubt that discriminatory or hate speech disproportionately affects these groups, and it's really an ugly reality.

But there is still the idea for a lot of people that the Internet is this special place where we go and where we are not people of color, we are not trans, we are not minorities, we are not women.  But we are now seeing that this is a place where the differences do matter. 

Free speech used as an excuse for bad behavior has been conflated with the idea that free expression is an important value.  And there is a big clash with the right to privacy, the right to nondiscrimination, and the right to speech of these same victims of hate speech and discriminatory speech. 

We have also seen that there is a belief the Internet is not real, that we can draw a line between what is real and what isn't.  This has become really scary. 

The level of visceral hate is astonishing, and it is as if there were a kind of manual on how to take somebody's life apart just by tweeting some hateful comments and so on. 

So what strategies, practices, or actions can be implemented to contain, to stop, or to transform these behaviors?  This is what we want to see in this discussion.  Perhaps regulations or norms or more state action, but also from companies, from international organizations, from civil society, from academia, from us as individuals: what actions can we come up with to confront these kinds of hate speech without creating more violence?

So, first of all, I want to present Guilherme Canela from UNESCO.  He's a program specialist. 

So, what can you share with us about this issue? 

>> GUILHERME CANELA:  Good morning to you all.  Thank you for the invitation.  Pleasure to be here.

In some ways, this is an easy session.  I think everyone in the room is against hate speech.  But it's hard because everyone in this room has a different definition of hate speech.  So that's the core of this problem. 

We have this intuition that it is a gigantic problem, and that it's increasing with the expansion of the Internet.  But we also don't know exactly what we are talking about. 

So in my three minutes ‑‑ for which I will probably need a little bit more ‑‑ I will try to give you a sense of what's going on in the multilateral arena in discussing this topic and what the difficulties are in discussing this with the Internet as another platform.

We recently launched this publication, "Countering Online Hate Speech," at UNESCO.  We will launch it here this Friday at some point ‑‑ no, actually, this Thursday.  And we will have another workshop on hate speech, particularly on the issue of youth radicalization.  So you are welcome to keep discussing this and other points with our guests as well. 

But I mean we have in the international arena the ICCPR, with very specific definitions on the limitations of freedom of expression, and this has been used by different jurisdictions to fight hate speech.  But it's one thing to do this with conventional media and another difficulty to do it with the Internet, although the principles are the same. 

You can find other elements in international standards as well, in the convention on racial discrimination and even in the convention on the rights of women and the Beijing platform.  And if you consider soft law, you can find the Rabat Plan of Action, the Special Rapporteur on freedom of expression's special report on hate speech, including hate speech online, and the Special Rapporteur on minority issues, who also recently published a report on that. 

This is to show that the international and multilateral arena is very worried about the issue.  And all those different discussions and reports are, of course, coming to some conclusions on the main problems we are facing.  And all of them start by saying that we have a definition problem: we don't know exactly what we are talking about when we talk about hate speech. 

We also have this problem: is hate speech everything we don't like, everything we feel offended by?  Because if freedom of expression is only about protecting things we are in favor of, that's easy. 

So was what Charlie Hebdo was doing hate speech or not? 

So we need to be clear on the problems we are facing here. 

So in this publication, what UNESCO tried to do, which is interesting, is to identify some of the ways different stakeholders are trying to cope with the problem of hate speech.  For instance, we have met civil society organizations that are monitoring hate speech in the media.  And this is important.  We need to understand the phenomenon before trying to find solutions for it.  What is going on?  What is the size of the problem?  More research. 

There are people who use individual peer‑to‑peer counterspeech, as Amalia said, which is an interesting approach ‑‑ to do that in a free way.

There are also NGOs that are identifying hate speech and informing relevant authorities about the problems. 

We also have people doing campaigns about the problem. 

We also have private companies dealing with the issue. 

And we have public policies trying to develop a media and information literacy approach within or through the educational system, trying to raise awareness about this problem and trying to make new generations more tolerant in this area. 

So I have different things here that we can discuss in the groups, but let me finish with some challenges we are facing and summarize them in one minute.  First challenge: how we move from just the legal discussion of it to public policies.  How can we develop public policies to cope with the issue of hate speech, and hate speech online of course? 

The other problem is the problem of definition.  How can we have more research, more debate, more discussions and find some areas we can agree on, if that is possible? 

The other problem, of course, is the problem of jurisdiction.  Even if a judge identifies speech as hate speech, and even if this judge decides that this content should be taken down, we know it's not that easy; it depends on where the content is, even if the hate speech is affecting a group in this particular jurisdiction or that country.

And then we have again a challenge on research.  There is little research.  We are seeing that we have hate speech in the comments.  But we don't really know the impacts of those things.  We don't really know the real amount of hate speech we are talking about. 

Who is going ‑‑ what is the content of this hate speech? 

So then the other challenge, of course, is to find solutions for this, whether they are public policy solutions or solutions from the other stakeholders.  The best case is that we have a mix of those different things.

And, finally, we have a challenge in training judges and prosecutors on those issues, because they don't really know how the Internet works.  They have an old view on all those issues.  And these are the people who are supposed to implement those processes of fighting hate speech at the end of the day because, as Frank La Rue said in his report, we need to have due process of law in taking down content, even if it is hate speech, and this, of course, needs to go through the judiciary.  And we're relying on players that don't necessarily understand how the Internet works or the key challenges related to this problem.

So that said, it's a very complex issue with many different areas and perspectives; but I'm fully available during the discussions to go deeper into those different problems we have identified in this publication.

Thanks a lot. 

>> AMALIA TOLEDO‑HERNANDEZ:  Thank you, Guilherme, for your insights. 

Now, I want to present Fernanda Rosa, a Ph.D. student at American University in Washington. 

>> FERNANDA ROSA:  Thank you very much for having me.  Thank you for the organizers.  It's a real pleasure to be here and to discuss this urgent matter. 

I'd say that I'd like to discuss this from a policy perspective. 

And I'd say that when I discuss hate speech and ‑‑ we always know that we are talking about minorities.  We always know that we are talking about ‑‑ sorry. 

(Laughter) 

>> AMALIA TOLEDO‑HERNANDEZ:  It's a weird setting.

>> FERNANDA ROSA:  Yes.  So sorry. 

We always know that we are talking about minorities and marginalized people as targets.  And we know this. 

So it's not the thing that ‑‑ we need to conceptualize freedom of expression in terms of that.  This is an issue. 

But we also need to think that we are not talking only about minorities, but we are also talking about women.  And this is a big issue because the paternalistic society that we have ‑‑ I'm talking about Brazil, I'm talking about the United States, and I also invite you to talk about your countries. 

The thing is that we know that many times women are the focus of these actions, and there is a very thin line between hate speech, hate crimes, and sexual harassment. 

And another thing is that the more social markers we have, the more complex the situation is. 

So we are not only women; you are also black, or you are also LGBT.  So all those things together make the situation ‑‑ the identity issue ‑‑ more complex. 

So based on that, I'd say that ‑‑ well, again, bringing the policy perspective, I'd say that to bring this problem to the agenda and transform it into a policy issue, we need to conceptualize it, as Guilherme said, to show its persistence and its patterns.

We need to bring in numbers, because this is the language policy makers speak.  And because policy makers, including legislators, are not necessarily close to these issues, it would be good to bring some solutions to the table as well.

So this is how we should address the question. 

The other thing that I'd raise is: is freedom of expression enough to frame this problem of the continuing repression of some groups of the population? 

So we need to think about it because, again, we are talking about minorities, we are talking about women, and we know that we face this in the offline world as well. 

And as we ‑‑ as Guilherme said, we have different kinds of perspectives.

So, for example, in Brazil, racist discourse is a crime and you can be arrested for it.  But in the United States, freedom of expression guarantees that you can have this kind of discourse without being ‑‑ well, treated like that. 

The other issue ‑‑ and I am almost finished.  As we know, harassers are becoming more sophisticated.  In the United States, there are cases where they install software on the victims' computers to track their activities.  And then they will publicize that.  They also install GPS trackers in the victims' cars to know where they are going and to carry out other kinds of activities based on that. 

Well, they are more sophisticated in their techniques. 

What can we say about the victims?  How can we empower them? 

And my first thought about that ‑‑ and I'd like to discuss this with you ‑‑ is that one important way is to be involved in the design of digital technologies. 

We know that there are a lot of affordances, a lot of characteristics in these technologies that can enable this kind of behavior.  But to address that, we need to be behind the scenes designing these technologies.  We need to change the way they are made. 

And to achieve this result, we need to have these people, these minorities ‑‑ women, LGBT people ‑‑ on the other side making these technologies. 

So those are just some points that I'd like to bring, and I'm happy to discuss them with you.  Thank you. 

>> AMALIA TOLEDO‑HERNANDEZ:  Thank you, Fernanda. 

Actually, it's quite interesting to bring up that there is a need for numbers, for studies that show this.  But it's even difficult to get victims to talk about it, to let people know that they were victims of some sort of harassment or hate speech and so on. 

So there is, within it, a challenge in itself. 

Now, I want Eleonora Rabinovich, Public Policy and Government Affairs Manager at Google, to share what the company is doing on these issues.

>> ELEONORA RABINOVICH:  Thank you for the invitation to this panel.  I share most of the opinions that have been said here.  It's a very complex issue. 

Most of the complexities come from the question of what is hate speech, what is discriminatory speech, what we have to prohibit or allow, what crosses the line between offensive speech and hate speech.  So those are the things that we have to discuss at the global and the local level.

So this is a really complex issue and one of the hardest problems that we deal with as a company.  And it's difficult because we have global platforms.  But we have policies on our hosted platforms like YouTube, where we host content.  And it's hard because we have to balance free speech, even when it's offensive or controversial, while also trying to prevent hate speech. 

So our policies are aimed at preventing the most egregious speech: speech that promotes or condones violence, and speech that has the primary purpose of inciting hatred on the basis of race, ethnic origin, religion, disability, gender, sexual orientation, nationality, or veteran status, for example. 

And we have other tools, like flagging, and community guidelines for cases where you find, for example, speech that is not categorized as hate speech but may be offensive.

And as a company, we are always learning about these issues.  For example, we know that attacks and violence against women are a big problem.  So this year, we introduced a new policy on Search to handle requests from people affected by revenge porn.  We know it's a tough issue for women worldwide, and it's a kind of online harassment.  So we introduced that policy as well. 

I would like to highlight that content removal is not always the best answer, because the content is still out there.  And on the Internet, we see every day that new content is being put up on websites.  So, as Guilherme said, we have to move from thinking only about legal tools to promoting policies, to thinking about cultural and social debates and creating policies to combat hate speech.  For example, thinking about strategies to promote counterspeech: how can we answer that kind of hate speech?  How can we as a community ‑‑ the private sector, the public sector, the NGOs, the organizations ‑‑ mobilize counterspeech? 

And the third and final thing that I would like to highlight is that, as societies, we have to legislate very carefully. 

I've seen many debates on hate speech.  For example, there's now a debate in Argentina, where I come from, about an antidiscrimination law.  And my feeling is that human rights communities ‑‑ groups fighting discrimination and groups promoting free speech ‑‑ should build bridges and try to find a solution.  It's hard sometimes.  But I mean, as a society, we have to speak.  And those groups should try to build new bridges. 

And it's also important when we craft laws ‑‑ as Guilherme said ‑‑ that we understand how the Internet works and what the impact of a specific bill or regulation is going to be, and that we create legal tools that are narrowly tailored so we don't remove more content than what we categorize as hate speech, for example.  But this is difficult because of how the Internet functions.

So we have to take care, be very cautious when legislating, and be able to speak among the communities and with the policy makers to create these strategies. 

So I'm open to questions and to continuing this discussion. 

>> AMALIA TOLEDO‑HERNANDEZ:  Thank you, Eleonora. 

Very interesting, what you say about trying to bring together different human rights advocates to work on the same issues. 

We were not planning on this, but Frank La Rue is here.  Would you like to join us on the panel and talk for two minutes about your report before I give the floor to Anna?  I would appreciate it if you joined us here on the panel. 

Thank you very much, Frank.  Sorry for bringing you up here when you were not directly invited to this; but I'm very glad you can join us. 

And if you can briefly talk about your report, the one that you wrote on hate speech, that will be great. 

>> FRANK LA RUE:  Well, thank you very much. 

I was planning to be in the audience and not on the panel.

But, anyway, I'll do it gladly and quickly say a few words. 

I agree with the panelists that, number one, hate speech is a serious problem, but it does not always have to be solved with legal measures, as Guilherme said, and not always with forms of elimination. 

I think states have an obligation to define policies to confront it ‑‑ actually, Article 20 of the ICCPR says prohibit.  It doesn't say criminalize.  There was no reference to using criminal law or having any criminal prosecution per se for forms of hate speech. 

So the idea, I think, when it was drafted was to have a preventive approach.  And this was what was used by the High Commissioner, Mrs. Pillay, when, on the issue of religion being used as a confrontation against certain religious groups, she developed this sort of global consultation, and the Rabat Plan of Action came out of it, which is essentially a proposal of prevention: counterspeech but also prevention, education and, actually, a better understanding, more knowledge amongst the peoples of the world, which is one of the principles of UNESCO in its constitution ‑‑ the free flow of ideas and knowledge between the peoples of the world to generate peace. 

So, number one, I would emphasize that I think hate speech is a problem, but the biggest solution is in prevention and education, to break the stereotypes of discrimination and the xenophobic education or cultures that we have in many countries around the world. 

Secondly, I think that hate speech as a concern of the State should be kept for the most serious cases.

And there is a debate, as Eleonora said, about what is offensive speech, which is within the realm of freedom of expression, and what is hate speech that can generate harm. 

I have always said that the State has an obligation to protect us from harm, not from offense. 

And in protecting us from harm, I was trying to identify from Article 20 the two forms of harm that can be produced, one of which is violence itself, in its several forms.  And women who have struggled on the issue of violence against women have insisted that violence not only has to be seen as physical violence; consistent harassment can also be psychological violence and intimidation.

One is to prevent violence.  The other is to prevent massive forms of discrimination ‑‑ speech that generates and actually provokes discrimination.  These are, for me, the elements in interpreting Article 20. 

So it goes a little bit beyond what some people understand under the First Amendment, where they only accept physical harm and immediate harm; I added the element of harm understood in different ways.  Like I said, acoso sexual ‑‑ a woman being sexually harassed ‑‑ is a form of psychological harm and intimidation.

And, secondly, the element of discrimination: when speech is actually promoting, in society or to policy makers or to the State, the massive discrimination of a certain social group, whether for race, nationality, religion, gender, sexual orientation, disability or whatever else.  The idea that there is speech that would, for instance, prohibit women from having equal rights or education or driving a car because otherwise it leads to forms of impurity, and that promotes violence ‑‑ that type of speech, for me, already falls under the terms of hate speech.  Thank you. 

>> AMALIA TOLEDO‑HERNANDEZ:  Thank you, Frank.  It was good to frame a little bit more what can be the definition of hate speech.  So thank you for that. 

And now, finally, I want to present Anna Frieda.  She's a Brazilian journalist.  She's speaking last because she has had the experience of being harassed and threatened online because of her job. 

And as a reporter, she writes on gender and feminism and so on.  That's why I want her to talk now, at the end of the short presentations, so she can share her personal experience. 

>> ANNA FRIEDA:  Hello, everyone. 

So I'm going to begin by telling you about the experience I had with online harassment.

I used to work ‑‑ I currently work for a newspaper, but I used to work as a freelancer.  I worked as one for three years.

Earlier this year, I submitted an article to one of the main editors I used to work with about how women and minorities were usually not welcome on online boards related to pop culture ‑‑ comics, games, music, movies ‑‑ why and how that happened, and how women and minorities were harassed in these online spaces.

I wrote from my own experience as a teenager taking part in these spaces, and I also talked to men and women who were users on these online boards. 

I wrote the article.  I was praised by my editor for the quality of the article, but they refused to publish it because they didn't want that kind of attention. 

I went to another editor.  I managed to publish it on Brazil Post, which is the Brazilian chapter of the Huffington Post.  And I immediately started to get threats on social media.  I got thousands of offensive messages on Twitter, Facebook, online boards ‑‑ rape, murder, beatings, kidnapping. 

And quickly the threats went beyond the online environment.  My personal data was shared.  My home address was shared.  My relatives' personal data and home addresses were shared.  They started getting threatened, too. 

And I started getting packages at home that were meant to make me feel psychologically shaken. 

I got worms, sexual items, and T‑shirts with my photo and unflattering sentences on them. 

The editors I worked with, because of these repercussions, stopped working with me.  So I had to leave my freelance career and look for a 9:00‑to‑6:00 job.  And I had to leave home for a few weeks because they had my address and were sending things to my home. 

I also had to look for help from a professional organization.  That was ARTICLE 19.  And they helped me manage the situation and get my life back. 

It has been a few months since this happened.  I still get attempts to hack my accounts ‑‑ email, Facebook.  It still happens sometimes.  I still get messages sometimes. 

That story happened to me, but because of it, I got closer to other women and other LGBT people who were in similar situations, who faced this same kind of harassment on the Internet.  And it's not that uncommon. 

So it got me closer ‑‑ I used to cover social issues before.  But it got me closer to those kinds of stories. 

But even though I guess I came out of it stronger and braver than before, it still makes me think twice before I pitch an article that talks about minorities.  And that shouldn't happen. 

So that's why I'm here. 

(Applause)

>> AMALIA TOLEDO‑HERNANDEZ:  Thank you. 

Thank you very much, Anna. 

You certainly raise very good points about how you connected with other groups that had suffered the same.  I'm pretty sure that helped you get through the situation. 

It's very important to note that hate speech has serious consequences for the victims, like the chilling effect you're experiencing now: you don't even want to publish articles; you have to think very carefully about whether you want to. 

But thank you for sharing your personal experience. 

The idea right now was to break up into groups, but this room is so full that I don't think that's feasible at this moment ‑‑ do you still want to do that?  But I don't think it's a good idea. 

So I prefer to open the discussion with you now.  I prefer you to ask the questions, to come back to us with questions and comments.  So if anyone has a question, please ‑‑ do we have microphones?  Yes. 

Thank you. 

Can someone help us?  Technicians, can someone come and help us with the microphones? 

>> AUDIENCE:  This is uncomfortable. 

I just wanted to ask:  You kind of ‑‑ the panelists kind of ‑‑ thank you. 

It's kind of assumed that harassment and violence against women is hate speech ‑‑ perfect.  Thank you. 

I thought it was kind of assumed that harassment and violence against women is hate speech, when there are very clear international standards on what hate speech is in terms of intent and context.  So I was just wondering if you could talk a little bit more about that. 

And I'm taking into consideration what Mr. Frank La Rue just said ‑‑ that maybe hate speech is a great problem but not all hate speech has to have a legal response ‑‑ so what does that mean in the context of violence and harassment against women?  That would be my specific question.  Thank you. 

>> AMALIA TOLEDO‑HERNANDEZ:  Shall we take a few more questions. 

Luis, can you help me with the microphone going around the room, please. 

Now, we're going to pass the microphone.  So don't worry. 

First one, the guy with the great jacket. 

>> AUDIENCE:  Yes.  I'm Matjaz Gruden from the Council of Europe. 

I'd like to ‑‑ first of all, I'd like to say I think it's important, and I'd like to thank for the testimony of Anna.  I think it's important when you discuss the issues in a general theoretical way to be reminded that we are talking about people facing their difficulties, and I'd like to express my solidarity and thank you for sharing that with us.

On the definition, I'm getting more and more confused, because we now have a difference between hate speech and discriminatory speech, we have other types of speech, and then we have different categories of hate speech, some of which enable or require legal intervention from the State and others that don't.  I think we have to be a little more disciplined here.  And I think, from the Council of Europe perspective, labeling is also sometimes difficult. 

There's a category of speech which is not covered by the protection of Article 10 on freedom of expression, which means it allows or even calls for some sort of intervention from the State.

And then there's everything else.  That doesn't make it nice speech, but it is speech that doesn't meet one of the criteria that would trigger State intervention. 

I think it's very important to say there is a category of speech where that intervention is necessary.  But because this is so, the threshold has to be very, very high. 

And the European Court of Human Rights has clearly identified some of the parameters on the basis of which you can define whether speech is covered by the protection of freedom of expression or not.  And I could quote, but I will not.  First: if it is seen as a direct or indirect call for violence, or as a justification of violence, hatred or intolerance; and if the speech is attacking or casting a negative light on entire ethnic, religious or other groups.  That is identification point one. 

Identification point two is the manner in which the statements were made and their capacity, direct or indirect, to lead to harmful consequences.

So there are ways to identify that category.

Now, the big question, of course, is what happens with speech when it doesn't fall into that category?  Because if it does not, for one reason or another, constitute that ‑‑ and I think it's important that it doesn't, because that is essential for the protection of freedom of expression ‑‑ does that mean that we don't do anything about it? 

And the big question, I think ‑‑ the shortcoming we are facing across the world, in all our societies ‑‑ is that we don't know how to react properly to speech that doesn't fall into the category of hate speech.  And I think that's one area where we should really be focusing, because it is speech that can do a lot of damage but that cannot and should not be sanctioned or censored, because it's essential that freedom of expression is exercised.  But without any response at all, it will continue to have a negative influence. 

So I think that's where I think we should also be discussing and reflecting.

That's something the Council is also doing, because it's not only engaging in a legal approach; a lot of our efforts go into policy building and especially campaigning, like the No Hate Speech Campaign, which is a grassroots effort starting with young people.  And I think we should all collectively put more effort there.  Thank you. 

>> AUDIENCE:  I'm Courtney Radsch, Advocacy Director for Committee to Protect Journalists.  Thank you for sharing your story.

It's something we see happening around the world to women journalists.

I'd like to hear from you specific solutions that you would propose.

But then I'd also like to ‑‑ I don't know.  I have a question about some of the things I'm hearing now, because you're mentioning any minority groups and women and religious groups, et cetera, et cetera.  So how do you express yourself and your opinion against groups? 

For example, Scientology ‑‑ they made a documentary about Scientology, and there's a lot of discussion about whether that's maybe a fringe religion, but it's still considered a religion. 

And then I think about the issue of ISIS and countering violent extremism and the attempt to prohibit any speech in support of that ideology, which I'm sure every person in the room agrees with; but at the same time, throughout our history, we see there have been struggles by different groups to become a leading ideology ‑‑ again, we had Martin Luther King, who was being surveilled by the FBI ‑‑ so I feel like we're not able to really figure out solutions when we talk in these very broad swaths about speech that should be prohibited.

And I think it would be very helpful to take specific examples and talk about specific solutions.  It's good to hear people say that legal solutions are not always the right answer, but I think in most cases, they're rarely the right answer.  Right? 

And also what about States and the State's speech that constitutes hate speech and harassment? 

We have more than 200 journalists in jail around the world, not to mention who knows how many social media users who are in jail for expressing their opinions about the general lack of democracy in their governments. 

And so, you know, this is a very challenging subject.  But I do think it would be helpful to talk in specifics about specific solutions.

Too often, international law is removed from the facts on the ground.  If we look at places like the Middle East, for example, there are no strong regional institutions like those that Latin America thankfully has, so what to do in those cases as well. 

>> AMALIA TOLEDO‑HERNANDEZ:  Thank you very much. 

We're going to take these three questions first and then we go back to you to ask more questions or comments. 

Eleonora, do you want to answer on any of the questions? 

>> ELEONORA RABINOVICH:  My answer is applicable to all the questions.  It's about what is hate speech, what is discriminatory speech. 

And it's interesting that another challenge we have is that we have different legal systems.

I mean you mentioned, for example, the European Convention of Human Rights. 

I work in Latin America, where we are under the umbrella of the Inter‑American Convention and the Inter‑American human rights system, with its Special Rapporteur for Freedom of Expression; and the definition of hate speech and the boundaries of what's allowed and what's forbidden are not exactly the same in our systems.  So it's a challenge. 

In the Inter‑American system of human rights, which is the one I know best, we have a high protection for free speech, and the category of speech that we consider hate speech under the convention is the one that incites violence. 

So it's very limited. 

And the other one, which is discriminatory speech ‑‑ maybe it's horrible and we don't like it, maybe it's harmful, it creates social problems, but maybe it's not forbidden.  So we are also discussing that under the Inter‑American Convention on Human Rights. 

So it's a challenge, because we live in a global society with a global Internet, but we have different systems and they reflect different values.  How we harmonize that is a challenge. 

And that also relates to your question about violence against women and the scope of the discrimination that falls into the category of hate speech. 

But I mentioned, for example, our revenge porn policy, because I think it's important to learn from what's happening and, as a company, try to offer solutions to it. 

So I think we agree that revenge porn is a problem.  Harassment against women is a problem.  I don't know if you can say it's hate speech or not; but it's a problem, and we have to offer a solution without diminishing freedom of expression, which is another important value.  So that's the balance. 

>> GUILHERME:  I think in general terms we agree with the way you framed the point.  The question, as the lady said when talking about minority groups, is that the groups themselves ‑‑ people with disabilities, women, ethnic groups, you name it ‑‑ are facing the problem.  They feel hated and offended.  Sometimes it's not hate speech. 

But here we need to deal with the real world.  We can't just say, "Look, it's in the convention.  It's your problem that you feel hated."

I'm not saying ‑‑

>> AUDIENCE:  (Speaking off microphone).

>> GUILHERME:  No, I'm not saying you said that.  I'm saying to the general audience that we have a complex matter in the sense that some groups, they feel offended and the Internet is escalating that.  Sometimes it's not hate speech. 

Our problem here ‑‑ I was in a meeting a few months ago with 1,000 feminists; and, of course, they are worried about revenge porn.  They are worried about many things. 

And in that particular meeting, they were suggesting that one of the solutions is to kill anonymity.  And it's a wrong solution, because it would create other problems for other human rights defenders. 

So I think the thing here is how we can discuss with all those groups and try to find the limits.  And, of course, Europe tried to find some of those solutions, but they're not consensual if you go to the American system.  In Europe, you can't say that the Holocaust didn't exist.  In the U.S., you are protected if you say that.  It's not consensual. 

And in the multilateral system, we need to deal with all the different systems ‑‑ even among consolidated democracies. 

So I think the real problem here is that more and more groups are feeling this problem.  But, of course, they don't have all the tools to deal with it in different ways; and I think our job here is exactly to try to find them ‑‑ and, of course, to measure the consequences. 

Anna mentioned her problem.  And one of the huge consequences of this kind of hate speech is self‑censorship for many people, not only journalists.

This is a gigantic problem for democracies.  If we start to self‑censor because we are afraid of the things we are saying, this will be a big problem. 

So I think one huge issue here is to try to show people that killing anonymity, for instance, is not the solution.  But I can understand why they are asking that. 

So I think we need to try to be coherent and comprehensive about the problems these groups are facing, which are, of course, escalating with the Internet.  Before, with only broadcasting and a system of independent regulators, regulation was easier than it is for the Internet, and this was not such a gigantic problem.

So I feel that we have a very complex issue on our hands.  Some of the principles we dealt with before the huge growth of the Internet are still applicable, but for others we will need to figure out how to face the real demands and the real fears of those groups. 

Although I agree with her that if we start putting all the groups that feel offended into this basket, we will just keep growing the complexity of the problem.

>> FRANK LA RUE:  I think some of the comments and the questions in a way complement what we said and do not differ too much. 

The first principle we have to apply concerns any action by the State ‑‑ in this case on hate speech ‑‑ which could become an abuse.  And I think "hate speech" is a bad name, because there can be many expressions of hatred that should not necessarily be banned by the State: they are individual expressions that fall under horrible speech, but it is legitimate speech. 

But in any case, under Article 20 of the ICCPR, the one principle we have to insist on is that freedom of expression has to be understood as an opening to all expressions in principle.  It is the broadness of this principle that we have to apply, and those expressions that really fall over the limit are the very small exception.  We don't want the State to intervene in content, because this will limit exactly what you were saying: criticizing religions, by a journalist or by anyone, is legitimate.

Most universities have comparative religions, and one can have an opinion about religions, philosophies, ideologies; and this is absolutely legitimate.

So none of that should give an excuse to States ‑‑ although they take the excuse ‑‑ or to those who, like with the Charlie Hebdo cartoons, supposedly felt offended and took that to justify a horribly violent act.  This should never happen in the world.

So, number one, freedom of expression has to be understood broadly.  We're trying to open expression for everyone. 

Now, what are the few instances in which the State can intervene? 

The exceptions are there in Articles 19 and 20.  And restrictions have to be established by law, necessary to protect the rights of others, and proportional. 

And in the case of Article 20, which is the so‑called hate speech provision, I think it's a very clear definition: speech that will immediately generate a form of violence ‑‑ physical or psychological, I added, because of the harassment of women ‑‑ or that will generate a systematic policy of discrimination.  This is my position. 

Not discriminatory speech as such, because that happens all the time and still falls under free speech.  Tragically, it's a bad habit, but it falls there.  But systematic incitement to policies of discrimination falls under Article 20. 

Now, we have another category that we don't mention, which is internationally recognized crimes.  For instance, the convention on the prevention of genocide includes in the convention itself the obligation of all States to prevent incitement to genocide.  And no one challenges that, because that is a fringe area ‑‑ an area that will provoke immediate violence, the violence of genocide. 

And the same thing happens with the optional protocol to the Convention on the Rights of the Child ‑‑ the one on trafficking of children and sexual exploitation ‑‑ which talks about prohibiting child pornography.  And not only prohibiting child pornography: it also talks about those who commercialize child pornography, those who promote child prostitution and those who traffic children.  So it places a whole obligation on the State to investigate the criminal gangs that do this. 

But let's understand it in a very, very limited way.  We do not want the State intervening in the content.  And this is why we only qualify those that are the most egregious, the most extreme cases where definitely the State should intervene.

But beyond that, society has an obligation.  I say that the same way we don't want the State to intervene in content, we do want society to engage with content.  We want society to have criteria to discuss the media and the press ‑‑ what they like and don't like ‑‑ to discuss speech, to criticize their politicians, and to say whether they agree with policies or not.  And this is the part we forget. 

So we must simultaneously encourage more speech, more freedom of speech, and a healthier debate in society to prevent these things from happening as well.

>> GUILHERME CANELA:  Another point, which Frank was mentioning: in some countries we are now watching members of parliament use the idea of hate speech to create new regulations, because they feel hated ‑‑ there are horrible things they do ‑‑ and they want that criticism classified as hate speech so they are protected.

So we know this is part of the equation here: public figures are less protected than others when we talk about freedom of expression and regulated speech. 

So, again, I think the point here is helping people understand this, separating the different kinds of speech and making very clear, as you said, what narrow definition we can use.  And we, again, fully agree with you. 

But the point is that the different groups don't agree with our definitions, so we need to start the dialogue in a much more intense way to avoid these problems. 

But in this region and in many countries we're facing this: politicians passing laws using the concept of hate speech, because people, especially during election times, go a little bit over the top in the discussions ‑‑ and this is natural in a democracy.  That's life. 

>> ANNA FRIEDA:  Yes, so you asked about specific solutions.  I guess I can talk specifically about my case of being a journalist.  It wouldn't apply if I were a blogger, for instance; but being a journalist, I'm always publishing for someone, so there's always a publisher. 

And I guess in my case, specifically, first of all, I got no backup from this publisher.  So the publisher ran away.  When they realized things were getting ugly, they were like, "That's not my problem." 

So first of all, I guess there should be ‑‑ especially in Brazil, with the way journalism and media companies work here, work relations are very weak.  The union has little or no power.  And we are facing an economic, ethical, existential crisis about what journalism is in Brazil and in media companies. 

So I guess it would be great if, during this process of rethinking what their role in society is ‑‑ which is what they're going through right now ‑‑ media companies, especially in my region, realized that it's really, really essential that they back up the people who are working for them, whether they are freelancers or not.  Because this wasn't the first time I was harassed online for something I published. 

The first time was in 2011, because of an article I published in a newspaper.  But I used to work regularly for them as a reporter, so they backed me up in every sense I could think of.  I felt much more protected legally, because they had a team of lawyers, and also in terms of speech, because they published an editorial backing me up, and that pushed the harassers away.  They felt threatened and stopped harassing me within a few days, and things didn't escalate as quickly as they did this time. 

Besides that, I guess we are talking about a country where there are little or no data protection laws.  So my personal data was available for anyone to see if they paid a little money online. 

So we should have ‑‑ not only for me but for everyone, not only journalists ‑‑ stronger data protection laws.  Companies that analyze credit, for instance, are able here in Brazil to make everyone's data available.  That shouldn't happen, in my opinion. 

And the judicial system, of course, is another problem I'm facing, because it's too slow and there's no structure or people to actually investigate and punish the harassers.  What happened to me is a proper crime here; it doesn't fall in the gray area of hate speech.

Because I was threatened, but it's like nothing happened, because the case is probably going to be archived ‑‑ is that how you say it, archived?  They can't find anyone to blame it on, because we don't have the officers, the team; they're not prepared to deal with online crimes in that sense.

So when I went to the station to report the harassment, they were very kind, but they looked at me like I was an alien.  Like, what the hell are you talking about?  We never heard of this before.  We have no idea how to investigate this. 

And I was told that. 

So I guess these three areas, in my case ‑‑ a journalist's case ‑‑ specifically, would help to keep us safer in these kinds of cases. 

>> AMALIA TOLEDO‑HERNANDEZ:  Thank you. 

>> FERNANDA ROSA:  Okay. 

Thank you, Anna, and everyone, very much for sharing these experiences.  I think that Anna's case shows how sophisticated harassers are.  If you understand how Google's algorithm works, you know how to make the results for her name appear on the first page of Google, because Google ranks on links.  So if the harasser puts the link to that information on many pages, the case will surface much more easily whenever an employer, an editor, or someone else searches for her name. 
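The link‑based ranking being described can be sketched with a toy PageRank‑style computation.  This is a minimal illustration under an assumed, hypothetical link graph ‑‑ it is not Google's actual algorithm, which uses many more signals ‑‑ but it shows why a page linked from many places rises in the results:

```python
# Toy PageRank via power iteration: a page that many other pages link to
# accumulates rank. Illustrative only; real search ranking is far richer.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page gets a base share; linked pages get extra rank.
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            if not outgoing:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / len(pages)
            else:
                for target in outgoing:
                    new_rank[target] += damping * rank[page] / len(outgoing)
        rank = new_rank
    return rank

# Hypothetical graph: ten forum posts all link to the attacking page,
# so it outranks the victim's own page in a search for her name.
graph = {
    "victim_page": [],
    "attack_page": [],
    **{f"forum_post_{i}": ["attack_page"] for i in range(10)},
}
ranks = pagerank(graph)
assert ranks["attack_page"] > ranks["victim_page"]
```

All page names here are invented for the sketch; the point is only that inbound links, not content quality, drive the ordering the speaker describes.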

So this is, I think, an issue where we should think about privacy and freedom of expression together with public understanding of technology.  It's very important in our era to think about these together, because we know how painful it is to have this kind of result.  And it's very difficult to forget these cases now, because they're on the network; it's totally different from an offline case. 

I'd like to add that, according to some interviews I did with policy makers, they have a very limited understanding of technology.  And because of this, the answers to that kind of problem tend to be conservative and sometimes biased, so that's also important.

Someone asked about answers to this problem, how we can solve it.  I think one group we need to dialogue with is policy makers and the people who advise and assist them.  We commonly leave those people out when we are discussing this kind of issue.

And my last point goes back to what Mr. La Rue said: how can we engage people when we are talking about this?  I'm thinking, again, about how technology can do that. 

So imagine that when you see a post that is hate speech on Facebook or another social media platform, instead of reporting that case to the company, you had a tool there where you could discuss it with other people, and we could ask people to join that discussion. 

Why do we need to have only comments, only that report button?  This is what I'm trying to say: we need to recognize that technologies have politics.  How we design them totally impacts the way we face these problems. 

And if we are not there redesigning them, we will not have the answers that we want.

And as someone said, companies are behind this.  Their interest is money ‑‑ that in itself is not a problem for me.  But when we are discussing human rights, we need to have other things on the table: how do we understand these algorithms in a way that reinforces human rights?

So that's it. 

>> AMALIA TOLEDO‑HERNANDEZ:  Thank you. 

I think that discussion is showing that this is a very complex issue.  The definition problem is coming up all the time.  Solutions ‑‑ we can think of several solutions, but definitely when you are working with the media and the media is not backing you up, there is a big issue.  And society has also to respond in a way. 

We're going to take some other questions.  The one over here.  Edison was asking me for one question.  So the second one.  And the lady in the red ‑‑ in the black jacket.  Yes. 

>> AUDIENCE:  Hello.  My name is Shreedeep Rayamajhi.  I'm an Internet Society ambassador.  I'm a blogger and activist. 

First of all, I'd like to praise you for taking a stand, because your taking a stand matters.  It matters for me.  It matters for all the people who are here. 

I would also like to highlight that most of the time this happens due to lack of awareness.  People don't know the reality and meaning of what hate speech is.  They just share it. 

In Nepal, we had a situation like this where one of our popular singers made a comment about a religion.  It was after six months that the video started going viral and people started sharing it. 

They didn't know what the consequences were or what issues she was going to suffer.  But later on, what happened was she was completely ‑‑ she had to refine herself. 

So I believe we all need to step up and talk about such issues in our communities, regarding the role of media. 

I completely feel you.  I completely feel you when you say media don't back you up, because I also went through the same process in 2011.  I was a victim ‑‑ not of hate speech, but I was attacked for posting a certain report. 

And, you know, for three months there was no one.  The police could not identify anyone.  There was no one. 

I know CPJ should have ‑‑ they tried to help me.  But the reality was my government didn't understand, and I was bloody pissed because, you know, I served the country.  I was reporting.  I was doing my job. 

So I believe a lot of awareness campaigning needs to be done.  And people like you and me, we all need to talk about it.  We need to talk about it out loud, to say these are the consequences, because people didn't react.  People didn't stop.  People didn't report. 

So I think a lot of this issue is about awareness.  Like he said, awareness work needs to be done.  Awareness campaigns have to be done.

Thank you. 

>> AUDIENCE:  Hello.  (No English translation).

I just want to say that I am Edison Lanza, Special Rapporteur for Freedom of Expression of the Inter‑American Commission on Human Rights. 

I agree with what Eleonora said about the different ways the Inter‑American system and the European system understand freedom of expression and its limits. 

But I just want to say that toward the end of the year, the Inter‑American Commission on Human Rights will release a special report about violence against LGBT persons.  And it has a chapter ‑‑ the first time the Inter‑American system builds standards on freedom of expression and hate speech ‑‑ in this case about this particular group, but I think it applies to all the issues you mention. 

And the first thing we said in this report is that freedom of expression and equality are almost always understood as contradictory terms.  We think they are complementary, and we have many tools to promote this vision, so we should stop looking at them as contradictory terms.

Second, we tried to arrive at a definition of hate speech and incitement to violence, because it's a fact that the Inter‑American system doesn't have jurisprudence or standards to define Article 13.5 of the Convention, which defines in broad form the discourse that is not protected by freedom of expression ‑‑ the speech that incites violence against different groups. 

And then we tried to harmonize this article with the principles of the Convention, because we need a context for this definition.

And then there are some tools that the region is starting to develop.  For example, the public defenders of audiences in the media, in the case of Argentina and Uruguay, don't use criminal law against these cases but promote public debate about these issues ‑‑ that means more freedom of expression and more discussion in the media.  And in a lot of cases that is very important to change the culture of discrimination.

And then ‑‑ I don't agree with this myself, but we have Article 13.2 of the Convention: instead of criminal law, we can use, for example, administrative or civil sanctions against systematic discrimination.  I don't agree with it, but it's possible to use that. 

And then what is sort of the ‑‑ thank you. 

>> AUDIENCE:  Hi, Anna. 

When I saw your article on the Internet, I shared it on my Facebook account.  And I shared it not only because I'm in this like you, but because I had a similar experience in 2012. 

I have a hobby that is cosplay.  I think you know what it is: you dress like a character that you like. 

And I went to a convention of this pop culture world; and a guy took some pictures of me wearing this cosplay and posted the pictures, not in a chat room but in a Facebook group.

And as he posted these pictures, the guys in these groups and chats started to ‑‑ how can I say? ‑‑ to search for my Facebook page.  And like yours, they found it. 

And when they found it, they started to send a lot of images altered in Photoshop ‑‑ they made my breast size bigger; they put my face on porn actresses.  They started to call me whore and bitch and these things.  It started to ‑‑ how can I say? ‑‑ ruin my life; they tried to make my life as much hell as yours. 

I didn't receive any package at my home, thank God.  But after that, I became very active in feminism, and after that there was a lot of other ‑‑ how can I say? ‑‑ hate speech against me. 

Here in Brazil, abortion is illegal, and I am in favor of it.  And every week I receive a lot of hate speech on my Facebook account or on my Twitter or even in my email, saying that I should never exist, that my mother could have killed me when I was in her womb.  And this is so hard when you are a teenager, because when you are a teenager, you are overwhelmed by changes.

You're always changing.  Your body is a stranger.  The world is starting to be new to you.  You discover a lot of things.  And when you are a teenager and receive this hate speech, it's hard to accept it, to accept yourself, because you have your ideas, you have your tastes, you have your hobbies.  And because of that, because you are a woman, you don't have the right to be part of the circle.  You don't have the right to be a nerd.  You don't have the right to be a feminist and fight for your rights.  You don't get those rights, because you are part of a minority. 

I have friends ‑‑ I have friends who receive the same hate speech.  And now I'm used to it.  When I receive one, I think, okay, one more for the list (laughter).  You start to live with this. 

And you think, well, about all these U.N. treaties about cyberattacks, about cyberbullying, and you start to wonder why they don't work in real life.  Why we as women, why we as lesbians, why we as people who are not straight, have to go through these attacks. 

Why do they still have this discussion about the definition of freedom of speech?  Because freedom of speech is not only saying what you think.  You can't say you are exercising freedom of speech when you are attacking the dignity of someone. 

That someone has a life.  That someone has dreams.  That someone has problems that you don't know.  And you are attacking him just because he's not normal, just because he's part of a minority that you don't like, that you think doesn't have rights. 

And we are in these discussions of what is freedom of speech, hate speech, harassment, harm; and I think that on these occasions we forget to talk about solutions ‑‑ not only solutions, but solutions that we can bring to real life. 

Because when we are in these events ‑‑ for example, this event of (inaudible) ‑‑ we are talking about things that we want for a future ideal world.  But that ideal world doesn't exist yet, and we have to make that ideal world come closer to us. 

And we're not doing this while, in the discussions, we stay on definitions; we have to talk about solutions. 

And I think not only me, but other people here have gone through these harassments, these hate speeches.  I remember I was 14 or 15 years old when ‑‑ how can I say? ‑‑ they posted those pictures of me and posted my Facebook account in all the groups and the chat; and I started to receive pictures of the sexual organs of old men from other countries.  And I was a virgin and had never seen that.  And, oh, my god, what had I done to deserve this?  I was only expressing myself. 

So I had my freedom of speech, and it was not respected.  But their freedom of speech was respected.

Why was mine ‑‑ a freedom of speech that doesn't hurt anyone ‑‑ violated like that and not respected?  And why was the freedom of speech that they say they have ‑‑ which for me is actually hate speech ‑‑ fully respected by society? 

So that's my questions.  That's my experience. 

You are next.

>> AMALIA TOLEDO‑HERNANDEZ:  Thank you very much. 

I have seen in these kinds of discussions that it's very empowering for victims to talk about this.  So thank you for sharing your experience.

(Applause)

>> AMALIA TOLEDO‑HERNANDEZ:  Eleonora has to leave but wants to say a few words before.

>> ELEONORA RABINOVICH:  I have to speak at another workshop now at 12:30, but I'm here the whole week, so I'm happy to continue discussing all these issues during the IGF.  Thank you, everybody. 

>> AMALIA TOLEDO‑HERNANDEZ:  Frank.

>> FRANK LA RUE:  I have to leave, too.  But I just wanted to remind everyone that there are five specific criteria that we worked on with ARTICLE 19 ‑‑ I used some of the research they had established ‑‑ to define hate speech, which helps.

One is the question of intent.  There has to be the intentionality to harm, and that is very important.

Secondly, the content has to be broad‑based, reaching many people, to actually produce a big harm.  It's not a single comment, or one addressed to one person. 

Thirdly, it has to be incitement to a specific harm, a very direct form of violence. 

Fourthly, there has to be immediacy to that harm.  It has to be something that can happen very soon, not just a very vague general statement. 

And fifth is the context.  In a context of discrimination or in a context of hatred, any speech can have a bigger effect.

So there's never an easy solution.  I think this will have to be done on a case‑by‑case basis with court decisions; but effectively, yes, there are elements to define hate speech. 

But, secondly, I think there is a call ‑‑ and this is the one reaffirmation ‑‑ to generate more speech, but also to generate positive speech.  I would say that especially given the examples we heard.

Society has a responsibility which should not be passed to the State, because then it becomes an authoritarian decision.  Society has a responsibility to challenge the wrong forms of speech amongst ourselves, as social practice, to debate and to criticize. 

So the same way we criticize politicians ‑‑ legitimately so ‑‑ we can criticize ourselves: the way we speak, the way we relate, the speeches, the coverage of the press, the debates, the terms people use.  And I think it is this type of positive speech and engagement in building a culture of peace that we really have to enhance.

Thank you. 

>> AMALIA TOLEDO‑HERNANDEZ:  Thank you very much, Frank and Eleonora, although she's not here anymore. 

One more comment from the woman there, and then we're going to close the panel. 

>> AUDIENCE:  Hello.  Thank you.  I'm Eve Salomon.  I'm an independent expert from the U.K.  And people have been asking for solutions, and I just wanted to share two little things. 

One is an example from the U.K., which I'm not going to defend, because the U.K. does not have an excellent record on freedom of expression.  But we have done one good thing, which was the result of society stepping up and complaining about individual harassment, individual trolling ‑‑ the examples that Anna and our previous commenter described. 

And in the U.K. now there is a specific law which criminalizes online harassment, because the U.K. realized that online was a bit different.  The court can order an end to anonymity in cases of online harassment, there's a penalty of up to two years in prison, and people have been prosecuted successfully under this new law.

So that is something that civil society could urge their governments to do if appropriate. 

Just another comment.  I think Frank outlined the definitions of hate speech as set out in international law, and we heard the same from the Council of Europe.  And there's this big gray area in between. 

I want to tell you a story: in 1998, a very right‑wing group in the U.K. published a pamphlet, both in hard copy and online, about how Jews control the media in the U.K.  And I was named as one of the Jews that controls the media.  If only. 

And that document is still online.  If you search under my name, you will find it there. 

It is not hate speech, because it doesn't say to cause any harm to these Jews who control the media, but it's certainly unpleasant.  And although I'm named, I would defend the right of those people to publish it, because that's what freedom of expression is. 

Thank you. 

>> AMALIA TOLEDO‑HERNANDEZ:  Thank you very much. 

Just a few short ‑‑ very short remarks because we only have four minutes to finish. 

So, Guilherme, if you want to start. 

>> GUILHERME CANELA:  Briefly to tell you two separate stories. 

A few months ago, Frank and I were moderating a debate with Riss ‑‑ who, as many of you know, is the cartoonist of Charlie Hebdo ‑‑ before a very qualified audience.  It was about 600 people.  And I was the person who had to sort the written questions from the audience to put to Riss. 

And I was terrified by the kind of questions people asked.  Basically, most of the questions were saying that what Charlie Hebdo did was hate speech.  And I'm talking about a very qualified audience. 

So one example to say that we have many things to do here. 

The other example: I'm sure that all of you, in those family groups, have a cousin who is, oh, my god, my cousin is a fascist, making those comments.  And this is, again, to say we need more research to understand the size of the problem.  We are here saying that we need solutions and we need policies, but it's very difficult to build policies if we don't have data, if we don't know exactly why these people are engaging in this. 

Because these people in our family groups, they're not connected to any formal neo‑Nazi groups.  They're just our cousins from twenty years ago, and they're fascists and saying horrible things.  And fighting these kinds of things is more difficult than fighting an identifiable neo‑Nazi group. 

And building policy for this is much more difficult when you don't have an identifiable group that is behind this and that and whatever.

And we don't have this data, and we don't have this data in a way that we can use to build public policy.

So one of the things that we need to urge different governments and universities and researchers to start doing, on a more permanent basis, is to have a better diagnosis of the problem.  We know we have a problem, but we don't really know the characteristics of this problem.  And it's a growing issue. 

Thanks a lot. 

>> AMALIA TOLEDO‑HERNANDEZ:  Fernanda. 

>> FERNANDA ROSA:  Okay.  Thank you very much for the opportunity.  I learned a lot here. 

And just to respond to the comments of the previous speaker, who shared her experience with us: there is a trend that we can notice now, which is to say that this kind of hate speech is part of the culture of the Internet. 

This word "culture" has a lot of meanings, as we know.  And, for example, in some communities like gaming communities, they say that they need to use those terms, they need to talk to women in that way, because it's part of the game community; so if you change this, you would change the way that they do things. 

Yes.  And that is exactly what we want. 

But the thing is that when they say that, they are also saying that women need to be stronger to face that kind of hate speech.  So they are putting the responsibility for all this mess on women again. 

I'm talking about women, but we can talk about other minorities. 

So only to say that we have a lot of challenges to face, and thank you very much. 

>> AMALIA TOLEDO‑HERNANDEZ:  Anna? 

>> ANNA FRIEDA:  I'd like to thank ARTICLE 19, Karisma, and you all for giving me the opportunity to talk about this and hopefully bringing light over the subject and making people aware and braver when they need to share their experiences. 

Thank you for sharing your experience. 

And thank you for the guy that's gone. 

And just one more thing I'd like to say.  Being a journalist and being harassed online, there's a thin line between freedom of speech and facing attempts to shut people up. 

And I guess we should always have freedom of speech as our goal.  I mean, it's fundamental for democracy.  And it's not only something I need to do my job as a journalist.  It must be our main goal, even though some people are abusing this right and we should find ways to cope with this.  But freedom of speech should be our main goal, I guess. 

>> AMALIA TOLEDO‑HERNANDEZ:  Thank you. 

Just to finalize, I invite you tomorrow at 4:00, when we're going to have a short session to present a tool for journalists, a mobile app for protecting journalists' communications.  I invite you to the workshop. 

And we can remember that activist movements before us, in the offline world, created structural changes that forced the way we communicate to change. 

So it became unacceptable, for instance, to use racist language in front of others.  We cannot change everybody's mind individually, but we can make it so they cannot come after people as easily. 

So as a society, as citizens, we can do some things, and we can do them without creating more violence and more hate speech.  So I invite you to look for actions and initiatives to do that. 

And I want to thank the panelists for being here, for sharing experiences and insight, and all of you for joining us today in this session.  Thank you very much. 

(Applause)

 

(Session concludes at 12:36.)