OPEN FORUM BY THE ORGANISATION FOR SECURITY AND COOPERATION IN EUROPE (OSCE/FREEDOM OF THE MEDIA) IN COOPERATION WITH THE COUNCIL OF EUROPE
SESSION OF THE OSCE
15 SEPTEMBER 2010
Note: The following is the output of the real-time captioning taken during Fifth Meeting of the IGF, in Vilnius. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the session, but should not be treated as an authoritative record.
>> ZENET MUJIC: Hello. Can you hear me? Good morning and welcome to this session. We had been warned that it would be packed. Well, they probably chose a room a little bit too big.
Let me welcome you to this Open Forum, which is the first Open Forum jointly organised by the OSCE and the Council of Europe on this very ambitious topic of how to balance governance of hate speech with freedom of expression.
I'm Zenet Mujic. The Deputy Secretary General of the Council of Europe, Maud de Boer-Buquicchio, and Dunja Mijatovic will set the scene for the following discussion.
Lee Hibbard and myself will host the session.
>> LEE HIBBARD: As said, this is a unique event. This is the first time we have had an IGF forum with freedom of expression and hate speech in the same sentence. It's a good cooperation between institutions. From what I know about hate speech and freedom of expression, the documents, the resolutions, the texts which address hate speech are normally produced by one stakeholder group. But the unique thing about this event is that it's a multi-stakeholder dialog. So we very much count on you to ask questions, to dialog with the experts on the panel, and to have a real discussion.
I'd like to welcome those who are connecting remotely. And we have -- we will take the remote feed during the course of the discussions.
Just to lay out the structure of the session, we are going to break it into three parts. The first part is going to set the scene, with the two ladies in the middle. Then we will go to a series of questions, which are going to be introduced and led by different experts on the panel. Once we have introduced them and set the scene, we hope there is going to be a dialog. At the end we will go to a wrap-up session with, I hope, some feelings from the audience, some takeaways: what did you feel, what did you retain in terms of messages? Then we will have a wrap-up by the Rapporteur, Yaman Akdeniz, on the far right-hand side. So without any further ado, the panelists will introduce themselves and then we will start with setting the scene.
>> YAMAN AKDENIZ: My name is Yaman Akdeniz. I'm an associate professor in law at Istanbul University. Previously, I was at the School of Law at the University of Leeds, in the UK. My work predominantly concentrates on Internet content regulation, censorship, and freedom of expression issues.
>> MERYEM MARZOUKI: I'm Meryem Marzouki. I'm an academic researcher, and I also wear an activist hat. I run an NGO in France, IRIS, dealing with the promotion and defense of human rights in the information society. IRIS is a member of the European association of digital rights organisations, European Digital Rights.
>> SUSAN POINTER: Susan Pointer. I look after public policy and government relations for much of Europe and Africa for Google.
>> THAIMA SAMMAN: I'm based in Paris. I work on a lot of Internet-related issues. Before that, I worked for ten years with Microsoft, and I have worked a lot on the issue of Internet and freedom of speech versus the application of other regulations.
>> MARK WEITZMAN: I'm Director of Government Affairs for the human rights NGO the Simon Wiesenthal Centre. I've been tracking hate speech online since the 1980s, and I was a speaker at the UN conference on cyberhate in New York.
>> STEFAN GLASER: My job is the monitoring of neo-Nazism. We do a lot of research and we try to develop countermeasures.
>> ANDREI RICHTER: I'm head of the Department of the Moscow Media Law and Policy Institute.
>> ZENET MUJIC: We tried to get a wide range of experts working in this field, to have a multilevel view on the problem we would like to discuss.
But before I give the floor to Dunja, I want to explain something about the discussion itself. We don't want the panelists and key participants to speak one after the other. The way we envisioned it, each of them carries a question and then engages in a discussion with you and the participants outside of this room. So we hope to make it a little bit more interactive.
And I'd like to give the floor now to Dunja Mijatovic. Please.
>> DUNJA MIJATOVIC: Thank you, Zenet and Lee. I'd like to welcome you to Vilnius. I'm happy to be here, and I'm happy to see such attention being paid to freedom of expression, which for my office is a very important signal. I will not give a long speech; I'm not going to bother you with too many details. The first thing I would like to say is that I'm extremely happy to do this together with the Council of Europe, and I do hope that we will have the opportunity to continue this practice more and more on the issues that are common for all of us and for the different international organisations tackling the same problems.
We only have two hours to elaborate, and we have such a wonderful moderator and experts, and perhaps not too many of you interested in this topic, but that should not in any way discourage us from discussing this issue further.
The first meeting organised by the OSCE to explore the issue of online hate and its relationship to hate crimes was convened in June 2004, in Paris. That meeting was followed by an OSCE Permanent Council decision entitled "Promoting Tolerance and Media Freedom on the Internet." The decision asks the participating states to fight intolerance and hate speech and also to ensure that the Internet remains an open and public forum for freedom of opinion and expression, and the states agreed to foster access to the Internet.
States have agreed on many issues on many occasions. But when we look at the actual situation, and I'm talking about the 56 participating states of the OSCE, it did not look that rosy, not just on the Internet but for media freedom in general. A balance has yet to be reached, as there are major disagreements as to whether or how to regulate online information and hateful content, and also as to what kind of content should be regulated, meaning that there is no standardized categorization of what hate speech is.
The mentioned OSCE decision also tasked my office to promote both freedom of expression and access to the Internet and the free flow of information. This included a task to issue early warnings when laws or other measures prohibiting speech motivated by racist, xenophobic or other related biases are enforced in a selective manner for political purposes. The aim is to prevent the freedom of expression of alternative opinions and views from being impeded.
All in all, the OSCE and other international organisations active in this field have identified effective approaches for addressing the issue of hate speech on the Internet that do not endanger freedom of information and expression.
This OSCE decision is one of many internationally agreed tasks aimed at tackling the problem. Countries have adopted legislation to protect users and societies from potentially harmful content, to protect freedom of speech, or both. The effectiveness of Internet legislation aimed at fighting the dissemination of hateful and racist content is being questioned, however. And despite all efforts undertaken so far, the problem persists, and solutions, at least global or regional solutions, seem not to have been reached.
For me, bearing in mind the role I have, which is promoting and ensuring that freedom of expression is not endangered in any way, it is quite difficult to say that there are exceptions. But needless to say, there are issues that we need to tackle and pay attention to, and that is clear for all of us: child pornography, trafficking, data protection and all the issues that should be tackled in a legal way.
There are several issues and several questions that I would also like to raise here. I think it's important to say that the legal frameworks needed to protect these rights should be designed in a manner that furthers freedom and should not go beyond what is necessary in a democratic society. These are doubtless challenging and sensitive topics in many parts of the world. But how do we tackle these challenges without them becoming an excuse for governments, and we see that in many countries, to violate the rights and privacy of those who use the Internet for satire or political criticism? Where and how do we apply the proportionality principle? The last years have shown that a number of questions still remain open, and have not yet been asked, addressed or identified.
It is also to be studied whether the adoption of new hate speech legislation has led to a decrease in online hate. This still needs to be critically studied. The questions to be asked are in particular: To what extent do the recommended measures specifically addressing hate on the Internet fulfill their purpose, and what is their possible impact on freedom of expression?
One must also assess the value and consequences, perceived or actual, of suppressing harmful content on the Internet, and ask whether, in cases of suppression, meaning blocking rather than deleting, such content moves underground, thus making it more difficult for law enforcement agencies to track and monitor and for society to challenge.
Of course, these questions are not entirely new. Nor will we comprehensively answer them here today. But I do think that besides trying to find solutions, we should take stock of what has been achieved so far, how and with what results.
In this respect, I would like to see this forum, this gathering, also exchange successful examples of self- and co-regulation and of common and practical approaches to combating hate speech. All the efforts of the last years have shown, to a certain degree, that a multilevel approach is needed. It will not be governments or industry or hotlines alone that will be able to fight this problem successfully. In this respect, we need to see what forms of self- and co-regulation exist and what their advantages and disadvantages are.
Action should also be taken to reduce the threat posed to social cohesion by the creation of a divide between countries, regions, and communities with and without access to new technologies.
We also have to note, over and over again, that there are parts of the world, also within the OSCE region, where there are problems with the telecommunication infrastructure, not to mention the level of Internet penetration.
It is very important to have coordinated policy approaches.
So with that in mind, I would like to pause and to end it here and to hand it over to Maud de Boer-Buquicchio and to thank you very much for organising this event jointly.
>> MAUD de BOER-BUQUICCHIO: Thank you very much, Dunja. Ladies and gentlemen, welcome indeed to this joint OSCE and Council of Europe Open Forum, which indeed reflects our organisations' joint efforts to promote human rights and democracy across the European continent.
The Universal Declaration of Human Rights starts by stating that "All human beings are born free and equal in dignity and rights." Freedom and equality are indeed two fundamental values that nurture our identity as individuals and as members of a group. It is not by accident that the worst pages of our history stem from people's refusal to accept difference. The fear of difference is sometimes cultivated to create the anxiety that will ultimately justify the domination or even elimination of the other as an act of self-defense.
Hate speech is a direct attack on the right to be and to think differently. It doesn't stop at rhetoric, as it has the potential to shape the minds of individuals who believe that they have the right to undermine other people's rights. It can encourage people to enroll in terrorist groups, to alienate a minority, to exterminate difference. For the Council of Europe, hate speech is something to be monitored and eliminated, including online.
This statement may offend, shock or disturb those who prefer to preserve a boundless freedom of expression, but democratic societies cannot afford the freedom to oppress. Instead, we have to strike an adequate balance between rights and freedoms.
With the Internet, hate speech has acquired a huge potential outreach without interference and regardless of frontiers. International cooperation and common minimum human rights standards must be our starting point in balancing the governance of hate speech and freedom of expression on the Internet.
In 2009, the Council of Europe marked 60 years of defining the boundaries of the rights and freedoms spelled out in the European Convention on Human Rights. Already in 1997, the Member States of the Council of Europe agreed that hate speech, on and off the Internet, should be, and I quote, "understood as covering all forms of expression which spread, incite, promote or justify racial hatred, xenophobia, anti-Semitism or other forms of hatred based on intolerance, including intolerance expressed by aggressive nationalism and ethnocentrism, discrimination and hostility against minorities, migrants and people of immigrant origin." The European Court of Human Rights has since applied this term in its judgments.
In 2003, the Council of Europe adopted the protocol to the convention on cybercrime, which criminalizes the dissemination of racist and xenophobic material via the Internet as well as racist threats and insults.
In 2007, the Parliamentary Assembly proposed the criminalization of statements which incite hate, discrimination or violence against individuals or groups for religious or other reasons. It concluded that in a democracy, religious groups must, like any other group, tolerate criticism in public statements and debates related to their activities, teachings and beliefs, as long as such criticism does not constitute deliberate or gratuitous insult.
Also in 2007, the Council of Europe's European Commission against Racism and Intolerance recommended criminalizing expressions which can be considered racist speech, notably when they intentionally and publicly incite violence, hatred or discrimination on the grounds of race, color, language, religion, nationality or national or ethnic origin.
In 2010, the 47 Member States of the Council of Europe adopted a common position to combat expression likely to incite, spread or promote hatred or other forms of discrimination against lesbian, gay, bisexual and transgender persons. They stated that such hate speech should be prohibited and publicly disavowed whenever it occurs.
So, ladies and gentlemen, we can see that freedom of expression is not an absolute right for the Council of Europe. We are, of course, convinced that freedom of speech and expression protects freedom in Europe. It is the very essence of our European identity. Our liberal and secular cultures demand discussion of differences. Freedom of speech and expression is the means by which we are able to integrate different cultures, traditions, religions and beliefs into a common European home. But at the same time, we cannot allow incitement to violence and hatred. I vigorously defend the right to criticize governments, political leaders, religions, their values and their ideas, whether in articles, television programmes, art, cartoons or on the Internet, as much as the right of people to protest peacefully against such criticism.
As a watchdog for our democratic institutions in Europe, it is the business of the Council of Europe to make sure that the Internet is a space in which we can work, learn and play with confidence and trust, a space where people are neither threatened nor discriminated against. So, in striking a balance in governing hate speech and freedom of expression on the Internet, let me leave you with a few questions to consider.
What responsibility do private sector Internet intermediaries have in the provision of communication services, such as social networking sites?
Do we need a code of ethics for private sector Internet intermediaries, just like we have for the traditional media?
Is the protocol to the convention on cybercrime, which criminalizes the dissemination of racist and xenophobic material via the Internet, the benchmark? And if not, what do we need to make it so?
The last question, are we doing enough to educate our children and adults alike to use the Internet in a responsible way, and to identify and combat hate speech?
Ladies and gentlemen, the first article of the Universal Declaration of Human Rights states that all human beings are endowed with reason and conscience and should act towards one another in a spirit of brotherhood. Let's use our reason and our conscience to make the Internet the space we need, a platform for a universal brotherhood that guarantees both freedom and equality for all.
Thank you very much. (Applause)
>> LEE HIBBARD: Thank you for your opening statements. In setting the scene for the discussion, I want to pass the microphone to Mark Weitzman, to pick up some of the questions that were raised already and to tell us what he considers to be the nature of online hate speech.
>> MARK WEITZMAN: Thank you. First, I'd like to commend the organizers of this session for coming together to work on this as a topic of mutual concern. I think one of the things that those of us in the field often talk about is the globalizing effect of the Internet in transmitting the messages of hate and extremism. And one answer, one solution, is obviously to globalize the responses and the reactions to that.
And I think the effort of the OSCE and the Council of Europe coming together, and the platform offered by the IGF, is a very important example of how we can begin to respond to this issue.
I was asked to help set the stage a little bit by looking a bit backward, even though the Internet is something that we look at as going forward. I want to begin by taking a few minutes to sketch the history of extremism online. I think one of the things we don't realise is how deeply entwined the technology and the hate are. For those of us who have been tracking it for quite a while, we see that it began as far back as 1983, in the early dial-up BBS systems, when a neo-Nazi in West Virginia in the United States put what we would now call the first Web site online, and it became a model, because it included both his own writings as well as others, such as Mein Kampf and other tracts.
And that began the pattern that we saw. In 1983, '84, '85, another prominent neo-Nazi in the United States already began to use the BBS systems to post lists of what he called race traitors. These were people that he considered traitors to the white race: people in mixed marriages, prominent liberals, and so on. He also used the systems to disseminate notices of meetings and material.
He had received seed money for this from armed robberies carried out by a major neo-Nazi group in the US in the mid 1980s, and the court testimony at the trial of those involved showed the trail of the money that was used to develop these BBS systems.
The person involved was one of the focal points of the movement at that time, bringing together various arms of the neo-Nazi movement: the skinheads, the militia movement, the radical Aryan Nations and so on. And that showed how the Internet was being used as a means not just of communication and recruitment, but also of bringing together and unifying disparate segments of a movement that was united only by its hate and by its embrace of violence.
That continued. Those of us in the field really begin counting from April of 1995, when the first Web site in this regard came online. This was a site created by a man named Don Black, now based in Florida. Black had a history of being part of the armed neo-Nazi movement. He was arrested as part of an attempt to take over an island in the Caribbean for a paramilitary training group to be used against the US government. While in jail, he was taught a trade, and that was computer programming. When he got out, he used that to set up the first hate Web site, Stormfront, which is still online today. The last time I looked at it, there were over 150,000 members, and it was appearing in more than ten languages.
The 1990s saw the mushrooming not only of neo-Nazi sites but of a vast number of conspiracy sites as well. I use the broad term conspiracy sites because they took on everything: from the Protocols of the Elders of Zion, the Bible of anti-Semitic conspiracy theorists, which claimed a small group of Jewish elders was plotting to take over the world; to the people worried about the so-called new world order, the sense that there would be a world government, particularly aimed at taking over the US government and manipulating and controlling US citizens; to people worried about the millennium bug. You may remember that: there were people buying cans of tuna fish and setting up enclaves to survive, worried about black helicopters taking over the world and all that.
It was a means of communicating all the messages to people who were not linked geographically or physically in any other way.
What you may have noticed in my remarks at this point is that I'm talking about an Anglo context, particularly a North American context. Even though the Web had its beginnings in Switzerland, at that time the majority of the focal points were based in North America. There was outreach to European neo-Nazi groups and far-right extremist groups, and we saw that in the late 1980s, continuing to this day, particularly through video and computer games aimed at kids. Many of those games circulated in Germany and were aimed at Turkish guest workers, and it became an issue that we saw spreading. Some of the games continue: "KZ Manager," for example, you can still find online. The model was taken from the Holocaust, but the victims were Turkish and other workers.
That showed the increasing focus on targeting youth: realising that the Internet was youth-based and youth-oriented, and using it as a means to recruit people into these groups at an early age, before they had the knowledge or the sophistication to realise that they were being manipulated in such a fashion. There were increasing calls for violence at this point. There were radicals who posted what they call the lone wolf theory: the idea that one person who was not well known, not a public figure, not connected in any way with a group, could become an adherent of the group's philosophy and act, because he wasn't being targeted by law enforcement, media, researchers and so on. He would have the freedom to act and carry out an attack that would gain publicity for his cause.
Timothy McVeigh, who was responsible for the Oklahoma City bombing in the US, was a classic example of the lone wolf. Someone put up a point system targeting federal judges in the US who had issued civil rights opinions, worth X number of points if they were killed; religious leaders and Jews were assigned other numbers of points. Someone put up a Web site that targeted doctors who performed abortions; every time one was killed, a red line dripping in blood was drawn through their name. Addresses, phone numbers and private information were put online as well. We began to see direct attempts to urge violence aimed at targeted audiences.
Current conspiracy theories include the Protocols of the Elders of Zion, anti-Obama theories in the U.S., denial of the Armenian genocide or really of any historical event, and things of that nature. Post-9/11 also saw a tremendous upsurge in the growth of radical jihadist Web sites. They were barely present in any fashion up until 9/11. 9/11 itself was planned using computer technology; there was encryption and other forms of communication methods. But since then there has been an explosion, as the technology has become more available and accessible to those people and communities, and they have really taken to it tremendously. At this point, they have gone from simple use of it to much more sophisticated use. It has become an extremely popular tool of radicalization. I just read a report in an American newspaper about the recruitment of Somali young people living in the United States for radical purposes, and one of the points, coming from one of the Somali leaders living in the United States, is that the Internet is one of the prime tools for radical recruitment in their community. There are calls for electronic jihad. There are manuals that teach how to use cell phones for GPS tracking, how to create weapons, RPGs, and so on. There are religious calls for violence, for jihad, and for its use by women and children, legitimizing that use. There are games. And again, this is a segment of radical Islamism that very often targets Islam itself; there is a lot of criticism aimed at mainstream Islam and Islamic institutions and leaders.
And then finally, we have seen a huge growth with social networking and social media sites, taking it to a new stage with user-generated material. These sites broke down the barrier that existed in the past, where material had to be posted on platforms to be used, and created, as we all know, the user-generated material of Facebook and YouTube, which allowed viral spreading of material with no controls, no editorial fact-checking or anything of that nature, and that took it to another stage. You can have games, you can have an avatar and be an SS officer and reenact whatever you want to reenact.
From what our research has shown over the past two decades, the extremism that we find online is, proportionally, predominantly anti-Semitic, racist and homophobic, perhaps in that order in terms of the majority of content. But really, every group is targeted and every group has its own extremists online. No one is exempt in any way from this issue.
You can find it in the most obscure types of places. You can find groups targeting each other, even within the same community. Essentially, as I said, it is an issue that we all have to deal with in many ways, particularly those of us concerned with its effect and impact on youth and the future of society.
So, we were asked to frame this in the form of a question. I'd like to bring it to that point now and basically throw it back to you, the audience, because this is an issue that we're all involved with. And while some of us may have the expertise of studying it from different angles and perspectives, we really still have questions more than answers.
My question is this: Having been involved in this debate for over a decade, whether at the OSCE or the UN or other international bodies, where very often I've been the only American speaking about this issue or representing it, I've often seen the issue framed in terms of a polarity, a debate between two absolute poles: one of absolute freedom of speech and one of regulation.
So my question is: If that has proven to be counterproductive so far, because we haven't reached any kind of consensus, is there perhaps a third way between the two that we haven't yet explored, or explored sufficiently? Is there a way we can bridge the gap between them? Is there something that we should be doing that we're not doing? From your own experiences, have you found ways of handling this that perhaps none of us have thought of or broached yet? I'd like to suggest that as the first question and pose it to the people in this group and this audience, and see if there are responses or questions.
>> LEE HIBBARD: Let's open it up to you, the audience, and also to those who are remotely connected, on the question of the absolutes: you're talking about the gray area between the absolutes.
>> Sally from Fiji. Just to comment very briefly: first of all, Mahathir Mohamad from Malaysia talks about human rights and responsibilities, because in terms of human rights jurisprudence, sometimes people take it to the extreme: yes, I have the right to express what I want to express, but in terms of the balance, how far can I go? And of course hate speech is just intolerable. But what I really found interesting about the presentations was something I thought was absent: you have jurisdictions, even in the Pacific, where minority rights are important, yes, but there are also instances where a minority, if they are the elite in a society and have power or clout, is also able to oppress, to manipulate content and to change policies.
In fact, there are instances where we are actually witnessing this, and people can speak out, but there are subtle degradations and impacts.
I particularly liked the first speaker's presentation, because one of the things that came out strongly was the coordination of policies, which I think is critical. And just to go back to the third speaker: I liked the entire presentation, but I heard about groups having the tendency to conspire and whatnot, and nothing about governments. I think that's a critical point too. Consider Guantanamo, other human rights issues, and other jurisdictions which I'll leave nameless, but which have impacted content on the Internet.
And I think that is something that shouldn't be missed in a presentation. Thank you.
>> LEE HIBBARD: Thank you very much. Does anybody else want to comment before we -- you raise an important point about international harmonization and those issues to coordinate. And we're going to address that question a bit later. But are there any other comments?
>> ZENET MUJIC: I suggest that we collect three or four more questions.
>> Katherine, I'm from the University of Leeds. Hello, you're my former colleague. Just a couple of brief comments and a suggestion in response to your question. I noticed that gender is missing from discussions of hate speech. This is an omission that has historical reasons, and to many of us it is unacceptable. So that is one thing.
The other thing is the assumption that not acting, in an anti-regulatory manner, is equal to non-regulation. The truth is that non-policy is already a policy, in that it maintains the status quo.
The third point is a suggestion: to give users more capacity, technological capacity on the one hand, but also education, so that those who stumble across such Web sites are able, through peer pressure if you like, to report sites and incidents directly to authorities, to organisations, and to other peers.
So this is an element that has not been strengthened at all, because we tend to think top-down. Thank you.
>> ZENET MUJIC: Thank you very much. I think the topic of education is very important to mention. If any of you has examples of successful projects, it would be nice to hear them as well.
>> Yula Mores from the organisation against cybercrime, based in Strasbourg.
I have a question for the OSCE and the Council of Europe. Have you developed specific approaches or programmes concerning vulnerable groups of the population? I am thinking about people with a migrant background and minorities. And do you think we need these programmes? Because they are often targeted groups and there are concerns about this population. Thank you.
>> LEE HIBBARD: Can we stop the questions there, and can we ask -- do you want to respond panelists regarding those questions now? Or not? Dunja.
>> DUNJA MIJATOVIC: I can only briefly reply to the last speaker, just to say that the OSCE, or rather the Office of the Representative on Freedom of the Media, has a very unique task. But in that task, and in the mandate, there is absolutely nothing that could or should in any way interfere with the content of a certain programme, whichever platform is used. So we are promoters, we are protectors of free speech. We act, and we have direct contact with the governments in order to remind them of their commitments, but we do not interfere with that area.
There is another office that is tasked to tackle hate speech on certain levels, especially during elections, and the issues that you mentioned: that is ODIHR, our office based in Warsaw, which works a lot on hate crimes and hate speech. We of course do cooperate, but we are very careful not to interfere with the content of certain media.
>> ZENET MUJIC: I think you had a question?
>> YAMAN AKDENIZ: I want to ask a brief question to Mark. I'm very interested in the impact of 9/11 on the spawning of hate organisations on the Internet. Could you give us some statistics, or quantify how many you came across before 9/11 and what sort of numbers you come across now, in terms of hate organisations and hate speech online?
>> THAIMA SAMMAN: I just wanted to give my contribution to your question about education. Some companies, including the one I worked for for a long time, Microsoft, have education programmes that they try to carry on, which isn't easy because of freedom of speech on the Internet. But they at least educate parents and kids and train them on hate speech. I'm not a specialist myself, but on hate speech there is a code for people to organise themselves, and at least to educate people so that they understand when they are in front of hate speech content, so they can make an informed choice.
There are also a lot of programmes, and maybe we can link you to these kinds of programmes, the easiest ones being aimed at parents of kids and directed to kids. It doesn't mean that you make a value judgment on what is going on, but at least it gives recognition of what is going on, and a clue to make an educated choice.
>> LEE HIBBARD: Thank you. Just before you speak, Mark, I want to connect with somebody who is joining remotely: Peter Melner from the Central European University. He has asked: what is the context of hate speech? What are some examples post 9/11? He is referring to the 9/11 situation; that is one context which is really topical. Mark?
>> MARK WEITZMAN: All right. In terms of statistics, I can give you some statistics dealing with the subject. It's hard to deal with, because it's so fluid; it's difficult to get an idea of it statistically. What we found: the first Web site that most researchers count is Stormfront, in April of 1995. At the time of 9/11, there were a few thousand. They were mostly concentrated in Western society, mostly right-wing and other types of extremist groups. Post 9/11 there was a huge surge. Our latest count is over 20,000 Web sites currently. The majority of those are radical jihadist terrorist sites that have overtaken the, you know, the more traditional forms of it. As I mentioned earlier, you can find examples of every kind: anti-Baha'i, anti-Islam, anti-Christian, anti-gay, the predominance of anti-Semitism, racism and homophobia. There is no group -- possibly except the Baha'i -- that doesn't have its own targets, its own extremists. That is the best we can sketch it out. But the figures do remain sketchy because, particularly with the growth of social media, there are thousands in those categories alone, on YouTube and Facebook. And sometimes they are taken down as fast as they are put up, or they reappear in different forms. There was a boxer in the UK named Ricky Hatton who has no connection with any of these groups or movements, as far as I know. Yet at one point we found that people were hacking into one of his sites and posting this material there as well, without his permission, without anything. So we had to start counting that.
But that is an example of how difficult it is to track and to monitor and to count as well.
The context of hate speech: really, any issue can become a context. There are hot-button issues. One of the things the Internet does is ratchet up the volume, because for people who are posting on it, first of all, there is no super-ego online. You can post your darkest thoughts. Before, in a social context, you had to keep them quiet, let's say in the privacy of your own bedroom. Now you post them, someone responds and ratchets up the conversation, and it goes higher. Sometimes people don't know how to get out of that, and they get sucked into radicalization of various forms, whether intellectual, emotional, or leading to acts of violence. It's hard to give you a statistical analysis of that.
What I would like to do is respond to the question I raised earlier myself, as well as the comment made about education, and say that there are, in some ways, paths we haven't pursued adequately yet. One of the things I've been convinced of, and I've been arguing this for a while, is that there is a third way, a moderate way, between the two extremes and polarities I mentioned earlier, and that is the responsibility of civil society. We tend to say either that it's all regulation or that it's all freedom of speech.
First of all, to point out that even in the US, there is no absolute First Amendment right online. There have been prosecutions in the United States of people who used the Internet to post terrorist threats against individuals.
One quick example would be a threatening e-mail sent to students, or anyone on a college campus in California, who had an Asian last name. The person who sent those was prosecuted and found guilty, because they were sent to individuals, targeting individuals by name. So there are no absolutes either way. Certainly, even in the European context of laws regulating speech, we still find sites, as we will hear about Germany, and I learned of a site in Lithuania that someone told me about. So there are no guarantees that those methods work 100 percent. I do think that when we rely on government or international bodies to regulate, we tend to abdicate our own responsibilities. We say the solution lies with somebody else and they will take care of the problem. And I think the point about education is critical.
In the same way that we educate people to use the Internet in all sorts of ways -- taking out financial loans, commerce, business, social interaction and so on -- there has to be knowledge of how to use it critically. How do we process this information critically? How do we get past the point of just believing everything that we read? That is something we sometimes tend not to do, or not to do well, or not to do enough of. There are also forms of social activity, of coming together to contact ISPs. I don't think there is anything wrong with people organising a movement to convince an ISP that it doesn't want to be known as a home for extremism, let's say. Those fall between the poles of letting it all go and relying on government to take care of everything.
I think there are a lot of ways and paths that we have to explore to take the responsibility ourselves, and I suggest that that is something that we have to make a critical priority.
The problem is not going to go away -- it has not gone away. It has mushroomed, and the implications continue to grow for all of us.
>> ZENET MUJIC: That's a good point that you made: civil responsibility, and not just relying on someone else to fix the problem. Madam, you had a question or comment?
>> MERYEM MARZOUKI: Still on statistics and the different reports, because I think this is a critical issue: how can we quantify the importance of hate speech online? What is the exact nature of this kind of speech online, and do we observe any evolution? Is there more hate speech online now than in previous years?
And we have different reports and different statistics, and it's a bit difficult for one to find one's way among all these reports.
In France, in my country, an official report was published last year with some interesting observations. The first observation is that there has been no real evolution in the quantity of hate speech online. But what is clear is that there are peaks of hate speech online at certain times, and these peaks are related to political events in the world. For example, the Middle East situation: when it is exploding, as it unfortunately is regularly, we have such peaks. When the President speaks on certain sensitive issues, we also have such peaks.
So which lessons should we learn from this kind of evolution in peaks, without a real evolution in quantity?
Also, what are the indicators to really measure this quantity and to really know its nature? Because there are different sources. We have civil society organisations that are working on this and produce some statistics. But we also have figures from hot lines, civil society hot lines or industry points of contact; at least in Europe, the association of ISPs has had to set up a hot line for reporting hate speech. And there is a difference: when some kind of hate speech is reported, is it really qualified by the industry as hate speech, so that they have something to do about it or not?
>> Just briefly to add: one of the difficulties is that hate speech online is multilingual. It's difficult to trace. There are visual aspects to it, but in most cases it's text based. So it is very difficult to monitor globally. I think this is a problem for Mark as well. That's why there needs to be much more local monitoring taking place, to be able to assess the nature of the problem and how to quantify it, as was stated.
>> LEE HIBBARD: There are many more questions than responses.
>> SUSAN POINTER: Perhaps to raise the tone of the discussion and give some context about the Internet: I'm just back from having taken part in the global congress of the International Press Institute, a 60-year celebration of defending free media and expression. It's humbling to be in a room surrounded by individuals who put their own lives on the line to defend the right to receive and impart ideas, in line with Article 19 of the UN Declaration.
Without rallying cries: yes, this is an important discussion, and I think it's great that we are addressing it, looking directly at hate speech versus freedom of expression.
But what I'd like to remind us all is not to lose sight of the context. The Internet is an incredible, unprecedented platform for enabling free expression, in turn enabling the exchange of ideas, greater openness, and the underpinning of transparency, democracy and good governance. Because of that access, it is inherently an open technology. It's inherently cross-border. It's inherently multilingual. Because of the many, many access points, it requires no prior permission to access the communication channel; there is end-to-end connectivity -- all the great elements of Internet technology that we at the IGF should be celebrating.
On the other hand, you know, what that doesn't mean is an absolute free-for-all.
And I think I should address the comments that were made about YouTube. YouTube is a platform, absolutely, like many of Google's services. We have a bias in favor of free expression, but not total free rein. And in fact, on YouTube -- just a reminder that users upload over 24 hours of video content to YouTube every minute, so that is the volume of content you're dealing with -- YouTube has community guidelines. They are online; do a search for "YouTube community guidelines." Those include a list of things that are unacceptable to us on the site: things like child pornography, things like harassment. I'll read it, because it exists, so why not just read it.
"We encourage free speech and defend everyone's right to express unpopular points of view. But we do not permit hate speech: speech which attacks or demeans a group based on race or ethnic origin, religion, disability, gender, age, veteran status, and sexual orientation or gender identity." And there are flagging mechanisms, and this comes into the education area. Around every video there is the ability to flag content as inappropriate. We look at it on a case-by-case basis. You don't want a flagging system to limit free expression. But where the content is clearly in breach of those community guidelines, we remove that content on a global basis. We are up front about the rules, about what the opportunities are, and then about what is done with that.
In terms of education, there are many, many videos on YouTube about how people can behave more responsibly in using the Internet and how they can be safer online. That information is there, and I encourage you to look at it.
>> LEE HIBBARD: I'll open the floor again. I have a question here from this gentleman. Before we go to that point: I think there is a general drive, in the IGF and elsewhere, for Internet freedom. Hillary Clinton's speech at the beginning of the year asked where we are in terms of connecting, and many cases in Europe are giving a legal aspect to this. So yes, there is a drive: yes, we need that freedom. A colleague and I visited the Googleplex in California and spoke to your colleague Bill Daley at YouTube. It's clear that there is a responsibility there. And at the Sarajevo film festival, we gathered anecdotes: we put up a camera and asked people to go in front of it and say what they thought freedom of expression was for them. One woman said, "My freedom of expression is to speak until you tell me to shut up."
So we have to decide when is that limit? And we're looking at the limits.
I'd like to pass to Michael Rotert.
>> I'm representing the European Internet Service Provider industry and the German ISP industry. The intermediaries -- I guess the ISPs -- and the hot lines in particular were mentioned a couple of times. We run a hot line in Germany, and quite successfully. There is more than one hot line, I know.
But it's really a complaint service for all citizens. If complaints come in about illegal content that people don't like on the Internet, the intermediaries or the hot line can react immediately by passing them on to the police, or noting them and starting take-down procedures. The problem with things like hate speech is where to send those requests. I mean, how should the ISP industry react once they come across those sites? Who do they give it to if it's not really illegal? And this is true for most of the European countries.
>> ZENET MUJIC: I think at this point we can use the opportunity to give you the floor to talk about the difficulties of implementing different rules and standards in different contexts.
>> THAIMA SAMMAN: I was just thinking that I was going to jump in. The discussion is taking a different rhythm than the one expected. I've been asked to give a kind of testimony on the pain of the industry. I know nobody ever complains about the industry, but still, let me tell you how difficult it can be and what solutions could be put in place, knowing that there is little chance that at some point we find an absolute solution, because it would put into question those two fundamental elements: freedom of speech and the fight against hate speech and other such content.
First, let me say that in my experience, the industry in general is pretty committed on those issues, keen to find solutions, to participate in this debate, and to solve most of the issues.
Not everybody comes with the same ideas, but at least the discussion exists. And I'm going to try to summarize the main issues that we are dealing with.
We have, I think, two levels of complexity, the first being simpler than the other. Let's take the issue at the national level first because, contrary to what one might think, the issue is not easy even there. At the national level, as you were saying, in Europe it's probably easier than in other parts of the world. We have a principle of freedom of speech with some very precise exceptions, which correspond more or less to a consensus across the European area, I would say. So you have the right to say whatever you want, but you can't go so far that you undermine people's privacy, privacy being delimited. You cannot put illegal content online.
Child pornography is easy, because there is a consensus on that one. And hate speech is pretty well defined, and the fact that you restrict freedom of speech for hate speech content is accepted at the European level.
When we have a consensus on the substance, on the law, on the regulation, the issue comes from the execution. How do you proceed? How do you make it work so that the law is respected where the law exists, coming back to your point? And that is what is not yet solved, even at the national level.
So one of my suggestions would certainly be to start at the national level. If we can find some ways and some process for execution of the law by the enforcement agencies, if the industry knew whom to address at the national level, that would give us some learning for the next step, the international level, where it's even more complicated.
And I want to refer to the European hot lines; Microsoft has been involved with that, along with other companies. It was one process put in place that was welcomed by the industry as a place to discuss the concrete processes in this area, to make it work.
The second level is the international one, and the international level arrives quicker than one might think.
Let me give you a very well-known example: the Yahoo case in France, years ago. Yahoo was providing a Web site, some tools, content, et cetera. So you could think that the French rules were going to apply, because it was a French Web site delivering to French people. But it was Yahoo, a global company, which was just passing on content that was available worldwide. So it was not an easy one. I was not working on this case at the time, but I have written a lot about it, because that's where it came down to fundamental elements of everybody's culture: the Yahoo corporation was explaining that freedom of speech is a constitutional right. So they were not alone; it was a fundamental element of their culture.
France and Germany, for example, were fighting against Nazism and its promotion; that was a fundamental element of those countries, of their constitutions and of the way those two countries functioned after the Second World War. You couldn't ask either of them to give up on what they believe. You just have to find solutions, and that's where you have to be pragmatic in identifying how far you can go; I'm not sure how far it will go. The French Web site versus the worldwide Web site needs to be defined, to at least find some way towards solutions.
Another example I would like to mention is the Global Network Initiative. I think Google is part of it, as well as Microsoft. It is really a consensus effort which gathers industry but also other stakeholders -- public authorities, international bodies, academics -- to at least discuss and find some way to move forward, coming back to your point that solutions should come from civil society.
As a lawyer, I don't like it when something is what everybody thinks but is not written down somewhere. But this kind of forum is at least a kind of tool for the industry, because you can discuss case by case. If you have to take a difficult position -- what do we do about freedom of speech in China, for example, or what do we do with hate speech in some part of the world -- before taking any decision, companies tend in any case to respect local law, because that is what is required of them. It's easier when we talk about Europe, because we feel we have enough freedom.
But when they have to decide what the applicable laws are in other countries, this at least gives industry the possibility to share with the rest of the industry and the other stakeholders, to find a way or to make up their minds.
And if they have to take a tough decision, to at least have a kind of doctrine from other stakeholders. So that is something we have to use to go forward.
My point is to make you understand that it's not easy for the industry. My message would be -- because this is sometimes the tendency, and it's not a good one -- don't let the industry decide for the rest of the world what is good to see on the Internet. We have to have some process and some place for these kinds of things to be decided. And my question would be: do you have examples, and what would be your position on how to address enforcement, on how decisions should be made and executed? This has been discussed for years, but to my knowledge there is no consensus yet. So is there some successful way to identify the applicable law and enforce it, or some other way to make sure that we respect those two fundamental elements, freedom of speech and the fight against hate speech?
>> LEE HIBBARD: Thank you very much. Clarity in the rules -- clarity in rules across borders as well; it's very important to think about.
Is the private sector doing enough? Is it contacting international organisations, doing more to drive media literacy or to address the questions of jurisdiction? I think we can always do more. Susan from Google has a remark to make before we carry on with the discussion. We should have a dialog with you; it's a multi-stakeholder dialog. Michael wants to talk about the human rights guidelines for European Internet Service Providers, a set of guidelines which guide the industry on how to deal with things like hate speech. Susan, do you want to make a short point?
>> SUSAN POINTER: Thanks, just a very short point. If there is general agreement that the Internet is a gain for human rights -- and I believe it is, as a communication channel -- what we have to be vigilant about is that when we are looking at things on which there is a global consensus, like hate speech and child pornography, those justifications for removing or restricting content don't creep into broader areas. And that's what we are beginning to see, in subtle ways: justification creep, I like to call it. A reason we all agree on is used to remove something on the basis of child pornography or hate speech -- and you probably wouldn't find a person in this room who would disagree with that -- but if you look at it on a case-by-case basis, as we have to, and you look at the detail, and it is in fact an attempt to restrict political expression given a different title, then we can't just let that through. Again, that's why I urge vigilance. This terminology can be used to justify things that shouldn't come under those categories. We have to maintain the communication channel as a channel for freedom of expression and human rights protection.
>> ZENET MUJIC: I have five requests now from the floor. First Michael and then maybe you, you had a question, you, yes. Michael?
>> MICHAEL: Okay. Coming back to notice and take down, where you said it has been tried over the years: there has been a trial in Germany for take down of child pornography, which runs until February of next year. And it has shown that if hot lines speak to hot lines, we can get child pornography taken down within 12 to 36 hours -- let's say within less than a week, even in the US, even in Russia. If police speak to police, it's more than half a year; forget taking it down then, because it has moved elsewhere. That was just a remark on the notice and take down procedures which exist. If industry works together, I think it works better than going only through the police level.
But coming back to the human rights guidelines for ISPs: we developed them together with the Council of Europe, and they address a lot of these questions. And we tried to put them to the industry for sign-up. It has been shown that they are a little bit weak, so we are now reworking them to be more binding for both sides: more binding for the industry, maybe with some kind of trade mark or something like that, but also more binding for governments to act, or to react, or whatever is appropriate.
And of course, it's not a document you develop once, because technology moves on, so we have to redo a couple of the technological aspects.
But it shows that industry is willing to do something whenever it can. We're not just closing our eyes and saying let the Internet run; we are making it our business. But as Lee said in the beginning, it's a multi-stakeholder approach we have to take. And that's not only our industry; it's multi-stakeholder.
>> LEE HIBBARD: Thank you. We have another speaker.
>> Laren, from the Munich University. Since the point was made about education and hate speech, and the example of the Jihadist Web sites was given, I want to point out that the average Jihadist is a highly educated person. He has mostly been educated in Western societies. He knows much about freedom of expression and all those democratic concepts. And that knowledge, let's understand, is not what detaches them from that ideology.
And the point -- and I agree with you -- is that we are dealing with a multifactorial movement that has to do with cultural and economic dynamics, so reducing it to thinking that education will solve the problem would not be enough. And this does not apply only to Jihadists. We have the same problem in Spain. We have four different ethnicities, and right now there is a big debate and a lot of hate speech going on on the Internet from all of the ethnic groups. And again, the people talking on the Internet on all of those sites are highly educated, literate people in Spain.
>> LEE HIBBARD: Thank you for the question. People just hide behind those words. Yaman, you have a quick response.
>> YAMAN AKDENIZ: What Susan said about justification creep: a good example is the current pressure YouTube is facing in Turkey, where I come from. It has been over two years now that YouTube has been blocked by the Turkish government, and the main reason is that YouTube is refusing to remove certain videos which, I would say, are regarded as political speech. But there is so much pressure in Turkey. You wouldn't believe it: the Turkish transportation minister, who is responsible for planes, trains and buses, is responsible for Internet policy, and has been calling on YouTube to set up an office in Turkey so they could be regulated by Turkish law and subject to its dubious censorship laws. And if they didn't remove the content, the YouTube executives would face imprisonment in Turkey.
So why would you set up an office in Turkey under those circumstances? And how can one country try to pressure an international company to set up an office in that particular country?
>> LEE HIBBARD: Does someone have a response?
>> ZENET MUJIC: I suggest that we take more questions. Thank you, Yaman. I think Andrei will tell us about the abuse of hate speech provisions.
>> AUDIENCE: I'm Nicholi, from Trento University, Italy. I think there are two sides to the problem. One side is: what is hate speech? The other is: should we censor hate speech? Censorship is a word we don't use gladly, but in the end, that is what we are talking about.
And I have some problems with hate speech -- I mean, with defining hate speech. Earlier, there was a definition: "likely to incite to violence."
Let's say that there is a John Doe who is a criminal. And I say I hate you, John Doe. You are a criminal. And I want you to die soon.
Is it hate speech? Probably -- I mean, I am saying that I hate someone. And is it likely to incite to violence? If John Doe has many followers, probably so. And probably directed at me.
But I suppose someone is thinking: no, it's not hate speech. So let's say that I'm not dealing with John Doe, but with Barack Obama, the Dalai Lama, the Pope. Is it hate speech? So probably there is a question about values, not just about being likely to incite to violence. Violence from whom? So I think, first of all, we should define what hate speech is. Otherwise we can't answer the second question. Thank you.
>> LEE HIBBARD: Can we just take some quick responses from the experts before we carry on? Because we will run out of time. We have other questions, other speakers, in particular, Andrei. Very, very quickly.
>> SUSAN POINTER: Just a quick clarification of what was said earlier. The interesting thing about the YouTube case in Turkey -- and it builds on the point made earlier -- is that it is a case where YouTube is an entirely US-based entity, and YouTube is banned because it is not applying Turkish law to its global content. So there are very, very interesting issues there that I'd be happy to speak about in greater detail, but really it's more about the jurisdictional issue.
>> THAIMA SAMMAN: I just wanted to make the link with what you said about notice and take down -- and don't try to guess what I'm thinking about notice and take down from what I'm saying.
Mostly, when you talk about child pornography, the issue is easy: the definition of child pornography is pretty easy. You may not be able to define child pornography, but you know it when you see it. The issue is more the monitoring part, and that is also something we need to take into account if we want to somehow make the Internet more moral: the monitoring part, and what to do with what you see, is the bigger issue. For the other issues, where there is no such consensus, it comes back to which law applies; everything turns on the same thing. If I remember well, in France we had this notice and take down within the law, as an element for Internet providers to react to, and at least to prove their good faith through the notice and take down. And if I remember well, your organisation, or an organisation, challenged that, saying that if you want to prevent someone from an element of speech, it has to be a judge's decision. And if you do that, it takes time. If we want to resolve this kind of issue, we have to keep that in mind. It's not that the regulation doesn't apply to the Internet.
It's just that it doesn't give the element of enforcement that we all need to find some solutions. And it was clearly refused by the French Supreme Court that you can remove content just by private decision, if it will have an impact on freedom of speech.
And coming back to the Turkish case: we had a very interesting discussion, in particular with the Russian representative yesterday, about what hate speech is and how people define such speech. It depends on where you come from -- not only on your history and culture, but on how your society is reacting today. And the example from the Russian vocabulary was: if you say the word "Nigger," it's not viewed as an insult, but if you say "black," it is. How can you have an international consensus? That doesn't mean that we shouldn't do anything, but defining hate speech at the international level is really hard. And it's a very difficult issue that we need to understand.
What we need are guidelines, not only on definitions, but also on process.
>> MARK WEITZMAN: To the point that you made about education. The historian Jeffrey Herf wrote about people using the tools of modernity, speaking in a technological sense. There has always been a core of leaders and well-educated people -- not just in Islamic Jihadist terms, but also neo-Nazis who are lawyers or Ph.D.s, some of them prominent -- who use their knowledge and ability to recruit less educated and sophisticated members to carry out the things they don't want to be implicated in themselves. Franklin Littell used the term "technically competent barbarian," referring to the Nazis. Many of the people that we see in the terrorist movements, the Jihadist movement, are coming from an engineering or technical background, less so from what we would call a more humanities approach. And maybe there is a distinction that we need there. But that is the experience that I've seen so far.
So the education I was talking about was an education to critically evaluate the Web sites. And there is clearly a sense in many places that one culture is under attack from another culture, or maybe a majority culture, and that this justifies any response, including the violence that is engendered online.
>> LEE HIBBARD: Technical proficiencies --
>> MARK WEITZMAN: Or more cultural proficiencies.
>> LEE HIBBARD: We have 40 minutes left. We have other speakers to get to, you have a response, and we have lots more questions. So can we be as precise and concise as possible, please?
>> ZENET MUJIC: If we talk about media or Internet literacy, it's not just technological literacy, it's textual literacy as well, which has to be stressed. Please.
>> I'm Christof, from Verlicto. I'm an ambassador this year. (Off microphone.) I'm an attorney, so I have a particular interest with regard to the law and jurisdiction. When we talk about hate speech, we talk about cross-border crime. The Council of Europe has a number of instruments in place -- especially the (Off microphone.) -- but that has been ratified by very, very few countries in Europe and, as far as I know, by almost no country in Latin America. So it would be great to hear from the panel what you think about the jurisdictional issues that all these things raise -- hate speech, racism on the Internet. How could you find a way to really enforce the law, or to make the existing international treaties enforceable? Because this is really a problem.
I mean, when we talk about all those crimes, they are cross-border. So it's very likely to happen as it did back in 1999 in Germany, which had the first case on jurisdiction with regard to the content of speech. Perhaps some of you are familiar with the Fredrick Töben case from 1999, where a professor uploaded content to a Web site in Australia denying that the Nazi Holocaust during the Second World War happened. Germany found jurisdiction in this case and decided to prosecute. But at the time, that conduct was not actually criminalized in Australia. There are a number of cases that I could mention, but I'd like to hear your thoughts on this.
This is a very very important issue.
>> LEE HIBBARD: Thank you, Christof. Before we get to that point, we have to fit in everybody's voice. Harmonization and effective cooperation, clear rules across borders -- we will ask that question. Maybe we will tie that in with your comments, Andrei, in a few minutes.
We have another question. Zenet, you have a question in the back?
>> ZENET MUJIC: I have here to my right.
>> Thank you. I'm from the European Training and Research Centre at the University of (Off microphone.). I'm aware, and the discussion has shown, the dilemmas which are involved in this. And still, I think we maybe have to choose not the easy way. The easy way is the American way: to say everything is possible, but anyway, we do not control it, and we have the First Amendment.
And the not-so-easy way is consensus building. Can we agree on something somewhere? Certainly in the Council of Europe, I think it's easier. The Council of Europe has done a lot of good work in that field; they have guidelines, also with the games providers, and hopefully in the future also with search engines. Because search engines are also an issue which we have not yet touched much upon here, but which is relevant. I'll give you one example. In Austria, we had provincial elections, and there was an Internet game put up where you can shoot at mosques. It's not a gun you're using: you click on "stop" and then the imam turns away and the minaret falls down. And this was in the context of local elections. And then the site, which belonged to a political party, was forced to take it down.
And the next day it appeared on an American Web site, which you can easily Google when you want to know where it is. And that shows how many actors actually are involved in this. And one issue I missed a bit in the discussion was the Euro-Atlantic cooperation. We know that we have the protocol because the US was not able to accept it as part of the Cybercrime Convention. But what happened after? Is there a dialog? Is there discussion in the European/US/Australian context, where most of the problems arise?
And maybe a question in that context to the OSCE, and Dunja is returning: you have the membership from the United States to Kazakhstan, and the Europeans in the membership as well. So how to deal with it, or have there been efforts to deal with it -- racism, hate speech and so on, which is, as we know, very sensitive and difficult to define -- in such a membership context? And for the Council of Europe, it would be interesting to see what kind of cooperation is being sought, on one side with the private sector, but also in the international cooperation field. Because most of the European efforts, as in the example I gave you, do not lead far if they can easily be circumvented through US or Australian Web sites.
>> ZENET MUJIC: I'll give Dunja some time, and I'll give the floor to you now.
>> I'm Ben Vard from the European University Institute. I'm representing the dynamic coalition on freedom of expression, and I think this might be an interesting subject for a lot of people in the room, so you're welcome on Thursday. Two points here. What I noticed most strongly was the title -- it was titled "balancing." And I've been missing the balance a bit, perhaps, because we have been talking a lot about hate speech, and we have talked about freedom of expression, but the difficulty of course is to balance the two. And I'm wondering whether the cooperation, as Wolfgang suggested, or other forms of reaching an agreement on how hate speech can be agreed upon in an international arena, will lead to a further normative drive towards finding consensus on what we consider to be hate speech, which in turn essentially, at the end of the day, incites nation states -- which are perhaps less benevolent than the Council of Europe states we are talking about here -- to control the Internet beyond what they are doing now.
I'd be as careful as possible when discussing this, and as careful as possible about reaching some kind of consensus. Because if the lowest common denominator is then used, as is always the case in censorship issues, countries in the Middle East will say: if European countries are censoring the Internet, why can't we? So we shouldn't agree too much on what we think hate speech is, because other countries will follow, and they will be less benevolent.
>> LEE HIBBARD: Andrei could you bring in a few comments?
>> ANDREI RICHTER: I'd like to comment on this. Yes, Russia and the CIS countries -- I'll mention some of them -- can provide both bad and good examples on definition and jurisdiction. Because Russia is part of the CIS, transborder regulation is also quite important in my part of the world.
9/11 was a benchmark in the sense of definition, because until 9/11, and even during the Soviet times, there was a more or less clear definition of what hate speech is, both in the constitutions -- the Soviet constitution and the Russian constitution -- and in the criminal code. After 9/11, the logic of the government changed. The logic is very peculiar. The logic was: well, hate speech leads to terrorism, so instead of fighting terrorism as such, we should look at the roots of terrorism. And that is hate speech, but not only hate speech. The Russian law at that time, in 2002, defined as the roots of terrorism the so-called extremism, or political extremism, which includes hate speech, which includes hate crime, which includes Nazi propaganda. But the definition was already broad in 2002, and it became broader with each year, because the statute was refined and reviewed several times since then.
They introduced the notion of extremist material, which became a basic element of what the government wants to fight against.
After Russia, a similar law was adopted by Moldova, and other countries adopted similar laws within five years. And the last interesting law, adopted only last year, is a model statute of the CIS on combating extremism. And it gives a wonderful -- in my view, wonderfully cynical -- definition of extremism. I'd like to read it slowly, so that it will be understood well.
Extremism is an encroachment on the foundations of the constitutional order and state security, as well as a violation of the rights, freedoms, and lawful interests of man and citizen, that takes place as a result of denial of legal and/or any other accepted standards and rules of social behavior.
So in other words any violation of accepted standards and rules of social behavior is extremism.
And extremism, again, is hate speech. It's a crime -- a very serious crime. And this model statute, in a legal sense, is like a recommendation of the Council of Europe to the CIS countries. So the model recommends to the national legislators that, first, the courts should decide, and then the Ministry of Justice should compile a list of such extremist materials. If there is extremist material in the mass media or on the Internet, there should first be a warning by the prosecutor, and second, the outlets should be shut down.
If we speak about the global network, then the jurisdiction is simple. The court makes a decision that a particular material is extremist. After such a decision, the Ministry of Justice adds this material to the federal or national list, which is available on the Internet -- it's hundreds of materials now. And the courts order Internet Service Providers to block access to the particular material.
So there were several decisions already on access to YouTube, but all of those decisions applied to the parts of YouTube where the courts considered there was extremist material.
It's also a crime, of course. In Russia at least, if you use the Internet or mass media to call for extremist activity, you face up to five years of imprisonment.
Now, that is an extension of what hate speech is, because the government intentionally mixed together Nazi propaganda and any unacceptable social behavior, to substantiate and justify fighting all this activity as a single crime. In Russia, very recently, one year ago, every interior ministry department established a counter-extremism branch, right after the branches fighting organised crime were abolished. The government said: we don't have organised crime anymore. At the same time, all those officers joined the anti-extremism branches, so they have to produce results. The cases are bizarre and very strange, and not always politically motivated. One was commercially motivated, and it's a great example: the American cartoon South Park, which was broadcast on one of the Russian cable channels.
The prosecutor's office said that one of the episodes of South Park, with Christmas songs, was religious hate speech. It was not a funny case, because the prosecutor's office started to interrogate everybody who could have been involved in the decision to broadcast this particular episode on cable television, threatening all those people with two years of imprisonment in case the episode was proven to be extremist material.
The picture is dark. The extremist law is used as a main weapon against anti-government political speech. At the same time, there are good developments, and a very interesting decision was taken by the Supreme Court in June of this year about the role of context in finding a violation of the law in speech. The Supreme Court said, in particular, that all judges in the Russian Federation, when considering whether there is a violation of the law in the content, should take into account the aim, genre and style of the material; whether the material is part of a political debate, attracting the attention of the public to a particular problem; whether it was an interview; and what the position of the person who took the interview was towards whatever was said by the interviewed person.
The courts and the judges should also consider the social and political situation in the country, and in the particular part of the country. So in many ways, the Supreme Court instructed the judges to follow the case law of the European Court of Human Rights, because most of the formulas spoken by the Supreme Court were taken from the case law of the European Court of Human Rights, particularly the Leroy v. France case.
>> LEE HIBBARD: We have ten minutes left. We have to wrap up with Yaman -- a five-minute wrap-up. We have a remote question from the Central European University. Stefan, do you want to speak at this point?
>> STEFAN GLASER: Just two remarks on the two questions that are lying on the table right now. The first one regarding the quantification of hate, which you raised. And the second is how to tackle hate content. All from a very practical point of view.
As I mentioned, we have been monitoring hate speech on the Internet for the last ten years -- right-wing extremists, their Web sites, and what they are doing there. And what we experienced was a big growth of Web sites as well as content on social networking sites, media sharing platforms, et cetera. But in our view, the problem is not so much the quantity, but the quality of what you can find on the Internet.
Just one example: four weeks ago, our team discovered a CD spread via video platforms and blogs. The CD was made for kids from four to ten years, and it contains cover versions of well-known lullabies. The neo-Nazi who produced the CD wrote new lyrics, which spread propaganda, called for violence and incited killing. It was illegal in Germany and not widely spread in hard copy, but it was spread through Facebook and YouTube. It's not the amount of hate that you find on the net, but the way it's prepared. Web 2.0 has changed the potential range of hate content on the Internet, especially through those services that are used by millions of people, as you all know.
And just as a secondary remark to that: this is not a trend specific to Germany; it is also reported by our partners in the International Network Against Cyber Hate from 17 different countries. How can hate speech be tackled? I just want to give a short impression of how we do it, because we do it with a twofold strategy, based on a culture of shared responsibility -- what you called the multi-stakeholder approach.
On the one hand, we try to get content removed, which is easy in Germany, because we have strict laws concerning hate material, and providers are responsible for illegal material spread through their services as soon as they get knowledge of it. But -- and this is important -- it also works in countries such as the US, which have a broader understanding of freedom of speech and where such material is not illegal. There, providers act on the basis of their own terms of service, which do not allow the dissemination of hate speech. So from our point of view, this is an important statement by the provider, saying that freedom of speech has to go hand in hand with respect for the rights of others; otherwise it has to be restricted. Removing hate content, in that view, is an act of solidarity and respect.
If we have a look at the content that we assess during the year, only 20 percent is illegal and can be tackled by law or other measures. The rest is protected by freedom of speech, so we have to live with it.
So the second part of the strategy is to foster media literacy and critical thinking among youngsters. We're doing workshops where they can learn how to assess the information they obtain. We analyze the structure of hate, and they learn to use the Internet as a tool to tackle hate speech by practicing counter-speech.
>> LEE HIBBARD: Thank you very much. That's very interesting.
We, again, are fighting the clock, and we have five minutes. We started with black and white, and all I can see is gray. On the context of hate speech, you wanted to make a comment in the context of a question you had? Can you wrap that up and give us a reflection on what your question was, without actually asking the question?
>> MERYEM MARZOUKI: I want to get back to three comments that were made: one from Michael, one from the gentleman from Trento, and Christof's comment. Michael told us that it takes at most 26 hours to have hate speech or illegal content removed when hot lines talk to hot lines, and it can take six months when police or law enforcement authorities talk to law enforcement authorities.
I agree, this is a price to pay for a little bit of bureaucracy. But this is also the price to pay for democracy, and for the guarantees of respect for human rights and freedoms. Because the real question is: what is hate speech?
This is not a qualification that we can all agree upon, because we are not talking about the same issues. Andrei just explained to us that so-called extremist speech is qualified as hate speech. But what is extremist to you is probably not extremist to me. So we have different approaches, different political approaches, because it's a political issue.
Is Holocaust denial hate speech? And a follow-up question: should we deal with Holocaust denial through legal means, or should we deal with it through democratic debate, with historians and everyone in the conversation? And is defamation of religion hate speech? I would say no. Because hate speech should be -- and this is a point that can be agreed upon -- hate speech is really an attack on a human being or a group of human beings for what they are. Not for what they think, not for what they do, but for what they are. This is the core of how I would define hate speech. And if we agree, not on the definition, but on the need to distinguish between these different forms of speech, most probably the additional protocol to the Cybercrime Convention could get more signatories, more signatures, more ratifications, because we could find a good group of countries agreeing on limited definitions of hate speech.
And then they could sign on to this, and we would have made some progress.
Then we could agree that defamation of religion is not hate speech -- no country, at least in Europe I hope, would sign on to criminalizing that -- so we would have solved this issue. And then we have the definitions in between. And here comes the very important issue of jurisdiction. If from France I can access and read hate speech Web sites hosted in the US, does this necessarily mean that the French (Off microphone.) is competent for this?
And back to the Yahoo! case: what have we learned in more than 15 years? I'll stop here. It's important to decide whether we have made progress or not.
>> LEE HIBBARD: So many questions. I think we need a whole day at the next IGF just to talk about these questions.
But I think the questions give real insight into what we need to address, and I learned a lot. But Yaman has the unenviable task of responding.
>> YAMAN AKDENIZ: Last night we talked about 20 minutes; now it's one minute. Looking to the future, to sum up today's discussions, I personally believe that rather than harmonization of rules at the international level, we will witness more fragmentation, based on the additional protocol experience at the Council of Europe. It was opened for signature in February 2003, and now, in September 2010, nearly eight years later, only 18 out of 47 Member States of the Council of Europe have ratified the additional protocol. In some cases, like Portugal and the Netherlands, it took them seven years to ratify. So the whole process is very slow, and I don't expect many more ratifications, because of the definitional problems highlighted today. It's very difficult to get consensus at an international level, or even at a regional level, because variations continue to exist at the Council of Europe level: some countries criminalize denial of the Holocaust, and others don't. So that creates jurisdictional problems, as we tried to highlight today. Ultimately, until wider agreement and consensus at the international level is reached, which might never happen, we might have to rely on self-regulatory measures. But the success of those measures will depend on substantial improvement of the existing systems, including the development of ISP codes of conduct. We are aware of the Council of Europe human rights guidelines for ISPs, and, as highlighted by Susan, there are complaint and other mechanisms provided by social networks and Web sites, and there are hot lines working in this field. But there will be a need for more transparency and accountability from the industry, I think.
And blocking and take down decisions need to be subject to due process principles, preferably by courts rather than individual industry decisions. And we will need more coordination and cooperation at the national level between all stakeholders.
So we tried to represent all stakeholders in this discussion today. But that needs to be the case at national levels, going downwards, but also internationally, going upwards: the Internet Governance Forum and organisations like the Council of Europe, the OSCE and the European Union need to gather all of the relevant stakeholders together. Civil society needs to be represented for human rights principles. The industry needs to always be there -- and "industry" used to refer to the Internet Service Providers, but we now have search engines and social networks, and they all need to be represented.
And law enforcement agencies and the judiciary as well, as they are part of the solution, too.
And so I think we need to talk more to get to the bottom of this problem.
And thank you for that.
>> ZENET MUJIC: Thank you very much. I think you captured well in your three minutes what you wanted to say in 20.
And I apologize that we had only two hours for this discussion. We realise we need more, and certainly there will be more opportunities to engage in discussing this topic.
I'd like to add one issue: I think what is also needed is more civil courage to engage in discussion -- that really can't be overstressed -- and education. I'd like to thank all of the panelists and the remote participants. Thank you very much.