The following are the outputs of the real-time captioning taken during the Tenth Annual Meeting of the Internet Governance Forum (IGF) in João Pessoa, Brazil, from 10 to 13 November 2015. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record.
>> COURTNEY RADSCH: Welcome. My name is Courtney Radsch. I'm at the Committee to Protect Journalists. Thank you for joining us at the Dynamic Coalition on freedom of expression and freedom of the media on the Internet. Shall we all introduce ourselves?
>> KARMEN TURK: My name is Karmen Turk from Estonia. I've been coordinating the Dynamic Coalition for the past five years, alongside Ben Wagner and Angela Daly. You might know them. So this time we were thinking of focusing on emerging new forms of censorship that are very characteristic of the online world, and so we really, really, really await your participation, and we are really not intending to talk too much ourselves. We really want to discuss it with you. I think it's 4:00 in the afternoon, so we have all heard enough talking; let's discuss instead. Thank you.
>> FRANCISCO CRUZ: My name is Francisco Brito Cruz. I am Brazilian and I am director of InternetLab, which is a think tank on Internet policy based in São Paulo.
>> DOMINIC BELLONE: Hi, I'm Dominic Bellone -- is this working? Dominic Bellone from Freedom House. I am here escorting a delegation of 12 Internet freedom activists and scholars from around the world. In addition, we have also distributed our Freedom on the Net report this year and are showcasing that.
>> COURTNEY RADSCH: So I think what we'd like to do is just start off a little bit with a review of what's happened with the focus over the past couple of years in the Dynamic Coalition and then talk about what we want to get out of this meeting. We'll get a brief overview of some of the key trends that are emerging on censorship online from Freedom House based on their research, and then a discussion of what's happening in our host country, and then what we would love to do is throw it over to you to hear about what you think are the key issues related to censorship online, and go from there. So why don't you start us off, Karmen.
>> KARMEN TURK: Hi again. I will really, really try to be brief, and I hope Courtney will raise a hand when I go over four minutes. However, during the last year in this Dynamic Coalition we have focused a lot on intermediary liability issues. The reason for that has been case law that was coming up in the pipeline of the European Court of Human Rights -- so it covers only 47 Member States, not the whole world. However, human rights courts, regardless of their location, do have an impact. So this case is Delfi versus Estonia. In order to be transparent and for the purpose of full disclosure: I'm not really objective, because I am a defense lawyer of Delfi, so just keep that in mind if you think that I might be a bit biased towards one or the other side. Please do raise your hands if you know the case that I just mentioned. So one Estonian knows it, Google, and some more.
I'll give a brief overview. Delfi was the biggest news portal in Estonia, with 3,000 articles a day, and it had a commenting platform. On the commenting platform they had different safeguards in place: basically an automatic filtering system and a notice-and-takedown system, a button next to each comment, so everybody who read a comment could push the button and the comment would disappear. In 2006 an article was published about a shipping company breaking an ice road between the mainland and an island. What is an ice road? It's just a road made of ice, which is very characteristic of Nordic countries, so don't give it too much thought. It really is a road on the sea. Some of the islanders were, how to say, not happy about it being broken, because they had to buy a ferry ticket. So there were a few comments on the article, about 200 in total.
Six weeks later we got a claim from the owner of the shipping company saying that 20 of those 200 comments were defamatory to his reputation and honor, and asking us to remove them. So Delfi removed the comments that same day.
However, the shipping company owner also claimed damages, and this was refused, so the court battle continued. In June 2015 the court battle finally ended with a judgment from the Court of Human Rights, and in essence the court said that, regardless of the fact that we are talking about a commenting platform, Delfi would still be a publisher -- there might be different nuances (?) regarding that, but still it is a publisher, a traditional publisher responsible for the content as if it were its own.
Secondly, the court tried to limit the application of the judgment, because they did get the idea that it's a huge Internet issue, and there were a lot of interventions in support of Delfi in the court proceedings. So they said that this is only applicable where we are talking about enterprises with economic purposes. So if you exercise your freedom of expression as a hobby, it wouldn't apply. So this was a very interesting distinction.
And so in the end, what the court said is that regardless of Delfi claiming to be an intermediary, regardless of intermediary protections like those in the U.S. and what we have in Europe, regardless of different Council of Europe declarations saying that intermediaries should not be liable for content they are not authoring, and regardless of Frank LaRue having a report that same year saying the same, they still found that intermediaries are obliged, on their own initiative, to seek out content that is blatantly unlawful. So not all the content should be monitored on their own initiative, just the content that is really, really unlawful -- basically hate speech, or content that incites hatred or violence, that kind of extremely illegal content. Of course, when you have been on the Internet, as all of you here obviously have, then you know: how do you make sure that out of 10,000 comments none is extremely illegal? Of course you have to moderate 24/7. So even though in legal theory the case was better than the decisions that came before it, like the Section decision of the Court of Human Rights or the Supreme Court decision of Estonia, which said that you are supposed to monitor all the content you are hosting, in practice it didn't really make a difference.
And by now it has had some effect. Many news portals are now closing their commenting platforms when articles are on a very controversial subject, just to keep the hate speech out. And of course, when you are an intermediary and civil or criminal liability for publishing hate speech hangs over your neck, then more often than not you do delete the comments, or the user-generated content, on your platform. So there was a dissenting opinion, which I strongly agree with, saying that even though the court said this decision should not be seen as privatizing censorship, how else should we name it when we see that intermediaries must, on their own initiative, seek out and delete comments?
So this is what I wanted to start with, because this is a follow-up from the last three Dynamic Coalition sessions we have had on intermediary liability. So what happened in the Court of Human Rights basically reopened the intermediary liability subject, which seemed to have reached some kind of standardization throughout Europe, and in the UN as well. So this is what we want to go forward with in this Dynamic Coalition.
>> COURTNEY RADSCH: Thank you. So now that that case has been wrapped up, we thought the focus of this Dynamic Coalition could now shift more to censorship, and to thinking about the dynamics of censorship, because there are so many different ways that censorship can manifest itself online. There's obviously public censorship, in which the state or governmental authorities exert censorship, either directly through legal means or through the privatization of censorship, which is definitely a trend we see -- essentially outsourcing the state's role to private companies, typically with very little due process and very little transparency.
There is of course private censorship, which has to do with how terms of service are implemented, with how private companies and platforms decide to remove, censor or change content. There's self-censorship, which can be the result of, for example, the violence against women and minorities online that we've talked a lot about, or which can emerge when you have state policies that encourage it. And there's algorithmic censorship. And there are key major issues we could explore, which include antiterrorism and countering violent extremism online and the impact that is having on online speech, on online protest, and on the right to assemble and associate online, as well as online blocking, surveillance, and hate speech. I mean, there really is a plethora of issues, and what we'd like to propose for the Dynamic Coalition is to get a better grasp of how those specific dynamics are playing out across the world in different contexts -- in contexts where you have strong legal systems with due process, and in contexts where you don't.
So, you know, we could think about it this way: somebody could look at how the censorship resulting from antiterrorism policies is playing out through private actors. Someone else could look at how it's playing out in the public sphere -- we see, for example, in the Middle East a trend towards the adoption of cybercrime legislation and national security laws that have a censorial impact. There are ways to get around this. I want to turn it over to Dominic from Freedom House to talk about some of the trends they saw in their annual survey of freedom online, so we can help understand some of those dynamics a bit more.
>> DOMINIC BELLONE: Thank you. In a nutshell, the trend is not good, as we all know. We've seen an increase in censorship since last year's report. According to our research, 61% of global Internet users live in countries where criticism of the government, military or ruling family has been subject to censorship. Some of the top topics for censorship, not surprisingly, are criticism of authorities, coverage of conflict, accusations of corruption, political opposition, blasphemy, social commentary, mobilization for public causes, and ethnic and religious minorities. A couple of other issues we don't always think of but that are emerging: one is satire. Nearly half, 45%, of global Internet users live in parts of the world where posting satirical writings, videos or cartoons can result in censorship or jail time.
Images, for example. We saw in Iran a cartoonist get 12 years for posting a drawing of members of Parliament depicted in a very unfavorable light. Also, in China we had some very funny Winnie the Pooh imagery poking fun at the President that got censored after the image was shared more than 65,000 times on a popular social media Web site. Also, LGBT issues: 34% of global Internet users live in a country where LGBT voices have been silenced or where access to resources has been limited, so that community is increasingly becoming a target of censorship. And then, following up on what Karmen was saying on intermediaries, we have this disturbing problem of private companies being pressured to take down offensive content, in effect privatizing censorship.
We found that authorities in 42 of the 65 countries assessed required private companies or Internet users to restrict or delete Web content dealing with political, religious or social issues. That's up from 37 the previous year. This unfortunately provides a perverse incentive structure for private companies to police offensive content and to become censors in their own right. It places them in a situation where they err on the side of caution when laws are vague or there is a weak rule of law. They are obviously subject to local regulations, which can often be vague, and obviously they don't want the company to be shut down, so that puts them in the position of policing content.
The other issue that's on the increase, obviously, is surveillance. In a previous panel -- I think you were on it, Courtney -- someone used the phrase "the rise of the surveillance industrial complex." Governments in 14 of 65 countries have passed new laws to increase surveillance since June 2014, and many governments have upgraded their surveillance equipment. This is the flip side of traditional censorship, because mass surveillance leads to self-censorship by activists, journalists and others who depend on expression, association and assembly.
I would also add that self-censorship is one of the most pernicious forms of censorship, because content which governments deem undesirable simply never gets expressed, and therefore there is no need for arrests or overt censorship, which might create messy international PR. Self-censorship, of course, is also very difficult to document. This has happened across the world, including America: PEN America did a study last year in which they noted that the members they surveyed were self-censoring amid revelations about surveillance. Closely tied to that is privacy, and encryption, of course. We found that although more and more users and companies are making encryption available, governments are trying to undermine those forms of encryption. We also see stigmatization of encryption as a tool for terrorists -- we saw, of course, the Vice News journalist case in Turkey.
So finally, we have surveillance laws and technology on the rise, governments targeting encryption and censorship is on the rise as well.
>> COURTNEY RADSCH: Thank you, Dominic. So I want to turn now to the situation in the host country, Brazil, and I also want to remind everyone that you can tweet us with the hashtag #IGF2015, or specifically for this panel #DCFOE -- DC for Dynamic Coalition, FOE for freedom of expression. And of course there's remote participation, so if you're participating remotely, please get ready to participate.
I want to turn it over to Francisco -- or to Karmen for one moment.
>> KARMEN TURK: Thanks. This is actually because I see in the audience Mr. Marco Pancini from Google. I am sure he would like to share Google's perspective on the new and innovative forms of censorship they are seeing from day to day. So if he could get a microphone -- Marco Pancini, in the second row from the back.
>> MARCO PANCINI: Thank you, Karmen. Can I introduce myself? I'm Marco from the Google team in Brussels, and actually this is a great discussion already. As you know, I have followed the work of this coalition, and I also followed the Delfi case. It's great to be here also because it gives us the opportunity to meet with a lot of people from NGOs all over the world, and we hear a lot of interesting stories, and we really understand and feel their pain as they struggle to try to avoid surveillance. Something we are focusing on a lot at Google at this moment is really making sure that all our services are encrypted and therefore, as much as technically possible, not easy to surveil.
Encryption is very important. There has been a lot of discussion at the IGF -- that's where I heard about the (?) encryption -- about a possible conflict between encryption and child safety, or possible problems with deriving information because the devices are encrypted. I have stated that for us encryption is the most important tool to make sure that if law enforcement needs to talk with us, they come and talk to us through the front door, because, as we said, there is no back door at Google. We don't want to have any risk of activities that are outside the rule of law, but if we really want to make this work, we need to put strong tools in place to make sure that this is happening. Encryption is the first and foremost of these tools. So actually I would like to hear from the audience what they think about encryption and how we can make sure that it becomes a standard for each and every device and for each and every service online.
>> KARMEN TURK: Thank you, Marco. I totally agree that this self-defense is the first frontier for every user. But as Courtney said, now we turn to the host country. Francisco, the lawyer, I'm sure would like to share their share of censorship-related case law. So we are glad to hear from you.
>> FRANCISCO CRUZ: Thank you, Karmen. Well, in my brief talk here, I would like to begin by saying that many people see the Brazilian Marco Civil as a role model, as something that should be seen as an inspiration, especially because it has -- and it actually has -- an important intermediary liability model and a summary of rights and principles for the Internet. So many people say that freedom of expression is Marco Civil's flagship. But yesterday, when I was discussing the application of Frank LaRue's and APC's framework to evaluate freedom of expression, one thing that we actually perceived when we applied the framework is that freedom of expression is not only an intermediary liability model. It's a lot of stuff. And so Marco Civil solved some problems, but just started on other ones. And I would like to tell you about some case law in Brazil, and there are three topics I would like to put into the discussion. The first one is the most traditional censorship, exercised exclusively now by our judiciary, because Marco Civil says that if you want to remove content that is not a nonconsensual sexual image or copyrighted content, you should do it through the judiciary.
The second thing is private censorship, and the discussions about terms of service. We have had some problems here in this kind of discussion: problems of content removal from the platforms because the content violates the terms of service.
And the last one, which is very important, is the use of attack strategies to target minority groups and to try to get these groups to stop talking. These strategies here are actually not used mainly by the government or by companies, but by hate speech groups, and I think it's an important kind of censorship. It does something very harmful to these minority groups that actually need to speak and to protest.
So the first thing I would like to address in our discussion here is the implementation of the right to be forgotten in Brazil. It is funny, because in our Congress we are trying to frame the discussion in a European way, but the politicians doing this are mainly politicians involved in corruption scandals. And it is funny that they are the ones defending the right to be forgotten. What are they trying to forget? Their own scandals? (chuckle) And this is even funnier because we are a country with a very recent totalitarian regime, and we are having discussions about the right to memory and the right to truth, so it is, I don't know, (?) that we are having these two discussions at the same time. So what is the framing of this discussion in Brazil? It's a very conservative framing. They are not trying to save the honor of the common Brazilian citizen; they are trying to protect their reputation as politicians.
So the second thing is the discussions about terms of service. We have a recent case on Facebook. Our Ministry of Culture has a very huge collection of pictures, actually historical pictures. In one recent case the Ministry of Culture posted on Facebook a picture of a Brazilian indigenous person, a woman, a topless image. And Facebook actually blocked that content for a while. Following this case the Ministry protested, and we started to talk about what kind of policy Facebook is developing on nudity, because this nudity has nothing pornographic about it -- I know this term is difficult to use because it carries a lot of moral determinations -- but it's not the kind of nudity that Facebook is trying to keep off its platform, I think, and yet it was this content that actually ran into problems.
And what I would like to say on this case is that when we talk about terms of service, and about policies for the community and for content removal, we need to press for transparency. And what is transparency? Transparency means not leaving questions without answers.
Another case, actually very interesting, that happened last week, also on Facebook, deals with this kind of attack strategy to target minority groups, and in this other case some questions were left without answers. A very hateful page called "heterosexual pride" got blocked, because the page was publishing content against minorities, hate speech content. And these guys, actually a big community, started to mass report some feminist pages, and suddenly the feminist pages were blocked too.
And why did Facebook do that? If Facebook has human review of content, as they say, something like that should never happen. They need to answer why the pages were blocked, because the last time I saw a Facebook person talking about this, she said that mass reporting doesn't represent an actual (?) to content: if you report once, a person will see it and it will be resolved. But this is a case where mass reporting actually resulted in the blocking of pages -- a temporary blocking of pages, but this is a harm to freedom of expression as well, even though it's temporary.
The next thing -- passing from the terms of service discussion to another discussion, the jurisdiction discussion -- is that we had two cases recently in Brazil of apps being blocked from the App Store and from Google Play. This happened because these two apps, Secret, through which you can send anonymous messages, and WhatsApp, don't have representation in Brazil, and so when they are asked to respond in court, they don't respond. So Brazilian judges started building a strategy of pulling these apps from the App Store or from Google Play. And what are the repercussions of that? There are many consequences to blocking. The WhatsApp case is actually different: WhatsApp received a request for user data in a child pornography case and did not respond the way the Brazilian authorities wanted, and so the authorities of one Brazilian state ordered the blocking of WhatsApp in Brazil -- an app with millions and millions of users who use it to communicate with millions and millions of people.
So this is very disproportionate, but it sends a message, a message we need to talk about: what is the importance of jurisdiction in our discussion? These apps that don't have representation in the countries have policies based on one conception of freedom of expression, and this conception of freedom of expression is not necessarily extendable to other countries. And on the other side, in many cases they are suffering disproportionate measures, like the blocking of the entire app in Brazil through the infrastructure -- in the WhatsApp case, they requested the ISPs to block WhatsApp at the infrastructure level.
So this is another very interesting case, because we are not talking about content, we are talking about apps. But we use apps to express ourselves, so what happened in this case was actually a threat to our use of a tool through which we express ourselves.
And the last thing I would like to say is one thing that happens here, but also in many, many countries: the lack of Internet literacy in the judiciary. Many Brazilian judges don't know how the Internet works, and this is a problem -- not only judges, but authorities in general. One case that is, I don't know, funny but tragic is a recent case in a Brazilian state in the Amazon, near Bolivia, named Acre. One authority there -- it's like an office; what is the name of a cartório? Do you know this? Yeah, it's like an administrative authority that -- what? A notary. Yeah, a notary. They asked the state court to make it mandatory for bloggers to register themselves, for bloggers to go there and fill out a form saying, I am a blogger, I live here, and I now have permission to operate in this state. And how do you do that? How do you know which bloggers are from this state and need to be registered with this notary? And not only do they have to fill out a form, they have to pay a tax to be a blogger. (laughter)
So this demonstrates a lack of awareness, a lack of Internet literacy, a lack of capacity to understand the Internet and the online conflicts that we need to deal with in 2015, and I think not only in Brazil. I think this is a good example of where things stand: we are having very high-level and complex discussions, like the one about terms of service, but people in many places still think that you actually need to pay a tax to be a blogger. Thank you.
>> COURTNEY RADSCH: Thank you, Francisco, for this very saddening speech coming from the country of Marco Civil and the --
>> Sorry about that.
>> KARMEN TURK: (speakers overlapping) Internet principles. Let's get even sadder, and I will give the floor to Courtney.
>> COURTNEY RADSCH: What you highlighted, obviously, is that there is no perfect country. I don't know everyone in the audience, but I see some representatives here -- I saw someone from UNESCO, the Council of Europe, Index on Censorship. I'm sure there are people from many different contexts here.
So I would like to throw it over to the audience. We have a mic over here. Can you raise your hand if you could share with us what you see as the key censorship issues? We heard some issues raised here: extraterritorial jurisdiction, encryption, which the Google representative mentioned, disproportionate responses, and the need for the judiciary and law enforcement to have a good understanding of how the Internet and technology work so they can respond appropriately. Can you please introduce yourself?
>> KARMEN TURK: Just to remind you that the hashtag is DCFOE.
>> MATJAZ GRUDEN: Does it work now? Yeah. Matjaz Gruden from the Council of Europe. Just as an introduction: I notice that every time I speak at this conference, what I say is met with a little bit of suspicion, as if somebody coming from the Council of Europe, or from Europe, is speaking from a position of superiority -- we got it all right and we'll tell you how you should do things. I can assure you that when it comes to freedom of expression, especially freedom of expression online, nothing could be further from the truth.
Incidentally, I just read recently that in one member state of the Council of Europe -- I will not name it, because I'm a cautious bureaucrat, of course -- there was a case where an iced tea company was fined severely for an ad insulting a yogurt drink, which is considered a national drink. That may be a great victory for yogurt rights, but it reflects tragically on how freedom of expression is understood in Europe. After Karmen's introduction on the Delfi case, of course, that doesn't do much to boost the credibility and authority of the Council speaking on this issue. I will not go into that. I'll say that the role of the judiciary is extremely important on this issue, and we have a court, I think, that is independent and impartial. It may not be infallible -- I'm not going to comment on the content of the judgment -- but one should nevertheless keep in mind the work that the European Court of Human Rights has done in defending and promoting freedom of expression. So one should keep the whole picture in view when assessing not just one judgment but the Court's entire role and the importance it plays in Europe. But that's not what I wanted to say.
On the question of freedom of expression online, and censorship in particular: we have done a survey, similar to the Freedom House one, of the 47 Member States of the Council, now presented in a report on the state of democracy, human rights and the rule of law in Europe, and one important part of course was on freedom of expression. What it turned up -- and this is why I say that Europe has very little to teach the rest of the world in many ways -- is that the situation is far more serious than we even thought before. When it comes to the safety of journalists, it is adequately guaranteed in only half the Member States of the Council of Europe. When it comes to protection from the arbitrary application of law -- defamation laws, criminal laws -- which is very important in this context, the percentage falls even lower: only 40% of Member States have adequate protection against the arbitrary application of law.
When we come to other aspects -- freedom of the media, freedom of expression online -- we have a different problem: the biggest problem is that we simply don't have data. And I think if we want to effectively counter the problems that we are facing when it comes to freedom of expression online and the question of censorship, and especially the question of self-censorship, one of the things we need to do is develop methodologies and indicators to be able to properly detect and respond to the problems that are there.
I'm also responsible for the online platform for the protection of journalists, which the Council of Europe has put up together with seven journalists' organizations for the protection of freedom of expression, and one of the things that we are not getting through the platform, but we know is there, is the problem of self-censorship, particularly when it comes to online journalists, because of their precarious situation: in online journalism, up to 70% or more of these people are freelancers. Their livelihoods depend on not offending anyone. And we have no accurate, credible way to assess the extent of the problem. We know it's there. We know it's serious. We know it has an effect, but we don't know how much. I'd be glad to hear from anyone, or to start a conversation or cooperation with anyone around here, on how we could tackle this problem, because I think it is one of the essential ones if we want to be effective. Thank you.
>> COURTNEY RADSCH: Thank you so much. I'd note that there is somebody here who has worked at UNESCO, and UNESCO has done a lot of work on freedom of expression online, and I was wondering if you want to talk about some of the key issues that UNESCO is looking at. There's a whole series going on. Can you mention what some of those are? The microphone over here.
>> Hi, yeah, I can just mention very quickly that there is a series of publications, started in 2011, called the Series on Internet Freedom. There have now been something like seven editions published, looking at issues around freedom of expression and its relationship with privacy, the protection of journalists' sources in the digital age, the role of Internet intermediaries in fostering freedom of expression, online hate speech, balancing privacy and transparency -- the list goes on and on. This falls under the umbrella of the Series on Internet Freedom, and there are also reports on world trends in freedom of expression and media development, looking at the global level and also at different regions, with a special focus on gender. And UNESCO recently finished a comprehensive study on Internet-related issues -- access to information and knowledge, freedom of expression, privacy, and ethics in the information society -- which will be launched tomorrow at the UNESCO event, so I encourage everyone to attend. Thank you.
>> COURTNEY RADSCH: Thank you. I think it's great to hear from the Council of Europe being very honest about the situation in Europe. I know the Committee to Protect Journalists just put out a report on the EU and press freedom, which basically found it's a balancing act between so-called hard interests, such as security and economic interests, and soft interests around Human Rights and universal values. What this report really shows is that Member States are not necessarily being held to the same standards as accession states, and that undermines the ability of the EU to be a normative framework and a leader on these issues. The same can be said for the U.S. And I think the UNESCO studies show how important it is to understand the dimensions of the problem; we've heard in several sessions about the importance of research.
But we don't want to just talk about problems. We also want to talk about solutions. We're talking here at a UN forum with representatives of these high-level processes, but I think on the ground what we see is that many journalists, many activists, many individual Internet users can't directly access these high-level processes. So I'm wondering if we can hear from other people in the audience about what they're experiencing, either in their countries or in whatever their specific contexts are: what you feel are the main issues we should be grappling with in terms of online censorship, and how we might go about thinking about solutions. So I see a hand back here. Can you raise your hand so -- great. And please introduce yourself.
>> Sure. Hi, (?) from radio (?) Syria. Actually I want to refer to a certain kind of censorship that has not been mentioned, which is censorship of information -- access to information and services -- usually imposed by sanctions and embargoes, as in Syria, Iran and so on. This kind of censorship actually hurts ordinary citizens, because it doesn't really affect the government or the radical forces in the region; mostly it affects ordinary citizens and denies them the right to access information.
>> COURTNEY RADSCH: Go to Twitter. We have some questions via Twitter and then I'm going to come over here. Great.
>> KARMEN TURK: Okay, I'll take a second to review the questions on Twitter. Thank you for that. I think the further we go with this panel, the sadder we all get, because it seems that there are so many forms of censorship today that it's impossible to keep up. However, let's have the next one. Who has the microphone now? Who has the microphone?
>> COURTNEY RADSCH: She does.
>> ELSA SAADE: Is it working? Okay. For the record, my name is Elsa Saade. I'm an ambassador, but I work at the Gulf Centre for Human Rights. There's a point that is really important: the fact that most people in the region -- the Middle East, North Africa and the Gulf in general -- are sometimes not even aware that there is censorship. Because when you actually know that there is censorship of a specific Web site or of specific information or whatever, you can always try to find ways to overcome it. For example, in Iran there's Psiphon; there are several platforms or applications through which people can access the Web sites that are being censored -- but only because they are aware that there is censorship. When ordinary people around the region do not actually know that there is censorship, there is the challenge, I think, and I wanted to put that on the table.
>> COURTNEY RADSCH: How do you think we could help raise awareness among people, among the population, that there is censorship?
>> ELSA SAADE: Okay. So there is a case in Lebanon, MARCH -- have you heard of MARCH? They actually highlight Web sites or movies or whatever that are being censored in Lebanon. So, for example, maybe create some kind of forum that would be available for anyone to visit in order to know what kinds of Web sites are being censored. It could make it easier for you to know what they don't want you to know, you know? (laughter)
>> COURTNEY RADSCH: Let's get into -- I mean -- okay, that is going to reach the people who actively want to bypass censorship, so you get to kind of the same problem: you have to know that you're being censored. So what other solutions can we think of that are maybe less proactive on the side of the user and maybe more assistive --
>> ELSA SAADE: Well, that is exactly what I'm putting on the table. I mean, I don't have an answer. I'm putting it on the table so that maybe we could give a chance to anyone who has an idea of how to actually combat that, to put it out there. Thanks.
>> KARMEN TURK: That's a very interesting problem. Do we have anybody from the U.K.? Please, let's raise some hands. No one from the U.K.? Really? Come on! Don't hide it (laughter). Thank you. Just a question from Twitter. It's from Ben Wagner, one of the coordinators of this Dynamic Coalition. Of course this is already the second day of IGF, so we have all heard about the U.K.'s communications data bill. His question is: have you already discussed the new U.K. communications data bill and its impact on freedom of expression? Do you have an answer to that from the U.K.?
>> Let me get Melody a mic.
>> Yes, here's one right here. And can you introduce yourself?
>> Yes, I'm Melody Patry. I work for Index on Censorship. So I'm not British, but I am based in the U.K., so I guess yes, for that question. I don't know if it refers exactly to the new communications bill or the new bill on extremism -- some of them are actually related.
>> KARMEN TURK: He's referring to the first one, which has some paragraphs on -- I just read it yesterday. There was a whole workshop on it yesterday. You don't know what I mean? Okay.
>> MELODY PATRY: They passed a series of communications bills in the past months, but I know that some of the new legislation -- one thing they did was criminalize online harassment, for example, by allowing courts, upon decision, to remove the anonymity of people being prosecuted for online harassment. That is one thing. I'm not quite sure I can answer the rest of the question because it's broad.
>> COURTNEY RADSCH: I saw another hand back here and up here. Okay. Where is the mic? Okay. Let's go -- perfect. Take the mic here.
>> Hello, my name is Marianna (?). I'm a Brazilian attorney. I work with freedom of expression here, and in Brazil -- it's been said before here at IGF -- anonymity is prohibited, and I see that this brings two issues for freedom of expression. The first is that it's really, really simple to obtain the identification data of a user. You don't have to show any unlawful behavior. Simply the fact that a user is anonymous -- using a nickname or something -- is reason enough to disclose their data.
And the second thing is that there is no due process of law in this procedure. So, just as the ambassador from (?) over there said, you don't know that you are about to have your identity disclosed until it is already on the record.
>> FRANCISCO CRUZ: Just to share some thoughts that passed through my mind, on the question of how to combat censorship and how to make people aware that it is actually happening. This is what censorship is about (laughter): getting people not to know something, obliterating the other's opinion so that it no longer exists.
So how do we do it? This is a very good question (chuckle), and I think a key to that question is context. I really don't know what the political environment is and what the limits are on doing certain things in different contexts. We need to advance in terms of identifying different types of censorship, but in terms of developing strategies, what we can try to do, I think, is identify cases not only of similar censorship practices but cases where the political environment is actually similar and the strategy to combat it can, in some way, be replicated. And when we talk about context, and about what is happening at the base level, at the grassroots, we're also talking about jurisdiction. Sorry to be the boring jurisdiction guy, but I think it's very important, especially when we're talking about the enforcement of Human Rights.
When we talk about the possibility to say something, maybe the jurisdiction problem is not so important, because we have platforms: we create platforms, we create technology to do that. But when we are talking about the use of this technology to make some people not speak -- especially these kinds of cases of harassment, of mass-reporting something, or the cases I was talking about before -- then maybe the best remedy, if I can use that word, is advocacy in the jurisdiction: trying to convince courts that this is wrong. Because only they can enforce something. Only they can take the people who actually mass-report some page and make them answer for why they did it and what their reasons were --
>> COURTNEY RADSCH: But legal remedies can only deal with part of the problem, right? We don't always want to go to legal remedies. So I'd be really interested in hearing about other solutions outside of law, because many of us -- our colleague from the Middle East, for example -- live in countries where there is very little rule of law. There is no due process. And so that's not very helpful. I see the mic in the back. Thank you.
>> Yeah, I'm Paul Casanuel (?) from Mexico City. And yeah, back in the country we have almost 99% impunity, so law is mostly irrelevant. I would like to share -- now that you talk about context -- that in many of the most violent contexts in Mexico and Central America we're seeing a lot of self-censorship, because even though many activists, citizen informants and journalists have tried to keep their anonymity, in many of those contexts there are highly violent activities against them. First of all, it's both online and off-line -- mostly off-line -- direct harassment, and not only against the person but against their families. There have been some strategies; maybe anonymity is starting to take off in many of the citizen informant circles on social networks, but it has been extremely hard to tackle. Because if there's actually no rule of law, if there's nobody to help -- and sometimes not even the media themselves are protecting their own reporters or journalists -- it leaves many people with no alternatives, and the only alternative is to maintain their silence.
There have been some experiments -- to focus on some of the journalistic approaches, like what El Faro, based in El Salvador, has done in Honduras: parachuting media in, doing the coverage, bringing people out and then bringing the stories out. And there has been a big debate about very dangerous areas in which we cannot involve local informants or local sources or local journalists, because they will eventually be at high risk afterwards, after the activity.
>> KARMEN TURK: There was a person on the second row who has been wanting to share his ideas for quite some time. To the second -- to the second row. Here.
>> COURTNEY RADSCH: Can you raise your hand?
>> It's Carlos, a Freedom House delegate.
>> CARLOS: Yes, I'm from Mexico as well. I wanted to share another kind of problem we're having in terms of freedom of expression. We have so many cases, of course -- from copyright, from trade agreements, from violence, gender violence-related. But there is a new kind of threat -- at least new for us -- that we are seeing. For Mexico, Twitter is particularly relevant. Politicians, lawmakers, senators, even the presidency regard Twitter as very politically relevant. And what has happened is not the approach that Turkey took, of going out publicly and censoring Twitter. What the government has done is to intoxicate the Twitter environment, hiring many people to create new trending topics.
So whenever there is another story of Human Rights violations -- and we have had so many; last year we had 43 students disappeared, and they are still disappeared, with the military and the police involved -- the problem is that whenever it comes to a protest that may criticize the government, you have this tremendous amount of information flooding Twitter and making the conversation disappear. We're starting to document that. But the problem is that, even bringing the argument, it may be legal to hire those people -- of course not them directly, because they did it on a third-party basis, so it can be untraceable.
But we are having this problem, and probably Twitter says, okay, this is new for us -- you can ask Carlos Cortez and he'll say this is new; we don't have exactly that in mind. But when we have so many minority groups and at-risk groups in some places of Mexico going through violence, Twitter is the only way to sort of bypass the extreme concentration of broadcasting, the extreme concentration of newspapers, the extreme concentration of radio that we have in Mexico.
So we have this problem. Even though the conversation is being drowned out with no law being violated, it is there. How can you handle that problem? What we have done -- and maybe this answers your question -- is actually organizing to counter it, but they have more money, they have more people. We can make analyses of the clusters, of the networks. You can say, okay, it's 40 computers spamming from this place, but that's it. But this is really important. I know it may seem irrelevant to (?) discussion at some point, but that's a strategy that may get to you at some point. So I'm just making a warning here.
>> DOMINIC BELLONE: Carlos, I just wanted to clarify that point about Twitter. Are you suggesting that Twitter needs to have a policy about how these kinds of counter-messaging spamming campaigns happen?
>> CARLOS: No, what I'm suggesting -- I'm not sure that would be the way, because the first response would be: yeah, let's push Twitter to do that, or let's make a law. I'm not saying that. That's a really problematic issue. What we have done is just arm ourselves and try to combat them by also tweeting against it. That's it. But to be honest, we have gone to Twitter and said: can you be more transparent in your transparency reports? When you do them, can you give us more data, please, on how trends work? And of course, on all kinds of censorship -- requests to remove content and tweets have increased so much. But in this case in particular we have to ask: can you make a transparency report on this as well, on how your trending topics are working? Because it's important for media, it's important for politics, it's important for people.
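The cluster analysis Carlos describes -- spotting that a hashtag's volume comes from a small set of accounts repeating identical text -- can begin with simple frequency counts. A minimal sketch, assuming tweets arrive as records with hypothetical `text` and `account` fields (an illustration only, not Twitter's actual API nor the group's actual tooling):

```python
from collections import Counter, defaultdict

def spam_clusters(tweets, min_repeats=20):
    """Group tweets by identical text and report which accounts
    repeat the same message at least `min_repeats` times."""
    by_text = defaultdict(Counter)
    for t in tweets:
        by_text[t["text"]][t["account"]] += 1
    clusters = {}
    for text, accounts in by_text.items():
        repeaters = {a: n for a, n in accounts.items() if n >= min_repeats}
        if repeaters:
            clusters[text] = repeaters
    return clusters

# Illustrative data: one account posts the same message 25 times.
tweets = ([{"text": "great news", "account": "bot1"}] * 25 +
          [{"text": "real protest update", "account": "user7"}])
print(spam_clusters(tweets))  # {'great news': {'bot1': 25}}
```

Real campaigns rotate accounts and vary wording, so a serious analysis would also cluster near-duplicate text and posting times, but even this crude count surfaces the "40 computers spamming from this place" pattern.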
>> COURTNEY RADSCH: May I hand the mic back to the woman who handed it to you?
>> Hi, I am Ana. I am from Georgia. I keep hearing people, and Internet users especially, saying that governmental officials -- judges, lawmakers -- don't have an understanding of the Internet. But the point always missing here, or in most cases, is that if someone doesn't know something, he or she will never know it until it's brought to them. So the point I'm making is that a judge cannot understand the Internet if someone doesn't help him understand it. So how do you do it? Who will bring the knowledge to judges and lawmakers, those who are in Parliament?
So one way of doing it is, if you are a local NGO, you have to bring the issue up by organizing events and raising Internet issues so that they get popular, right? And invite governmental officials, et cetera. Of course, for a small NGO without a big reputation, that's far too big a task. Another way of doing it is through international organizations like the UN or ISOC, which have a lot of events in different countries. I did actually work with two organizations of two different governments in my country, helping my governmental officials get trained, et cetera.
So the issue I came across is that the Internet has never been a priority. No one tackled that issue. As far as I know, there is no project that would say, okay, let's train the judges -- let's raise their understanding of the Internet, so that on the cases coming up right now about censorship, about the Internet, they rule with more understanding.
So what actually has to be done is that the UN, UNDP, whatever representative it is in your country, they have to prioritize this issue, which they are not actually doing. So I don't know -- it's not a question, as you see. It's more of a point. You can elaborate more on it if you want.
>> KARMEN TURK: I would just like to make one comment: I think this is exactly the way to go, and I think there are some very good practices going on. For example, the European Union and the Council of Europe have a joint project in Ukraine, where they have a judges' academy, and they're actually preparing a handbook for the Academy of Judges on Human Rights on the Internet, so the judges can actually pass a whole module. So this is, I think, a very good way to go.
However, a second, very short comment: even if we educate the judges, how we regulate the Internet and how we apply Human Rights is still very much a question of how we view the world. If you view the world such that freedom of expression should be hierarchically below other rights and interests, then it's very hard, even if you do everything right regarding education and all of that.
>> COURTNEY RADSCH: I would just add that I think there are lots of judicial strengthening programs. The UN does it through a variety of agencies, the U.S., the EU, lots of local capacity building. So it seems to me that instead of having a whole separate program over here, we should integrate this throughout, right? Because rights online can't be separated from rights off-line. And we see, again, in many of the other discussions happening here at the IGF -- on violence against women, on counter-extremism -- that the lack of sophistication in understanding how the Internet works, how social media platforms work and what the variety of options are is one of the things limiting us from finding alternative solutions, rather than the very rough bludgeon of the law or outright blocking and that sort of thing.
I guess I still want to hear some solutions that aren't at the legal level. We heard one from Lebanon, you know. I see a hand. Excellent. Can we get her a mic? And then we'll go back there. Thank you.
>> Hi, my name is Sonia. I'm from Istanbul, Turkey, and I want to say something about the recent discussion about training judges and how they lack an understanding of the Internet. Speaking about Turkey at least, I'm actually personally glad that they don't understand it, because I think if they did, things would get worse. Maybe you've all heard of it -- for the last two years it's been increasing very much: there is blocking of YouTube, Twitter, Facebook, and the fact that they don't understand things like Tor and VPNs is maybe helping the cause. If they were trained in it, they could block more effectively. So that's what I wanted to say about that.
And also, about what Carlos said about Twitter: there are paid or unpaid users who will deliberately spam a trending topic in order to make valuable information and valuable tweets harder to find. It's happening. Personally I'm not sure if there's anything Twitter can do about it, but in the same way, we're fighting back as activists, tweeting even more and trying to drown their voices out. So it's an ongoing thing.
And finally, as a tiny, nonlegal solution: there are small initiatives going on, like SecureDrop. There's a localized version of that for Turkish citizen journalists, or actual journalists, because we have the same problem with journalists being locked up. So there are these Web sites using Tor where sources can drop information or documents that they have, and more independent or impartial news outlets can pick it up there and use it without the risk of the provider getting arrested or whatever. Yeah, thanks.
>> COURTNEY RADSCH: This woman back here with the microphone?
>> Hi, I'm Riva with Article 19. I just wanted to share a tool that Article 19 has developed with the Open Rights Group. It's a tool to detect censorship and filtering at the ISP level, and it's downloadable; you can find it at WP.censorship.exposed.
>> FRANCISCO CRUZ: I remember an example that is actually not at the legal level and is very good. In Brazil we had this public debate about our copyright reform, and the government launched a platform for people to come and say what they think about it. What happened is that, surprisingly, there were many comments pro copyright, more than people expected. So a group of hackers started to conduct an investigation to try to discover why that happened -- it was a surprise for them. They tried to get the data and analyze it. And the result was that they discovered that 80% of those comments were made from the same machine, by the same user -- which was not a user, it was a robot. And the user was actually part of the Brazilian office to defend copyright. It came from their computer.
And this is a kind of strategy to start to unravel these unfair practices that some actors are conducting. We need to create strategies to unravel this kind of practice and to denounce it, like the Twitter example.
I don't know if it's an example that actually solves the whole problem, but I think it's a hint.
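The analysis Francisco recounts -- checking whether a flood of comments really comes from many people or from one machine -- amounts to counting submissions per source. A minimal sketch, assuming each comment record carries a hypothetical `source_ip` field (illustrative only, not the hackers' actual tooling):

```python
from collections import Counter

def dominant_source(comments, threshold=0.8):
    """Flag a consultation as suspect if one source posted
    more than `threshold` of all comments."""
    sources = Counter(c["source_ip"] for c in comments)
    ip, count = sources.most_common(1)[0]
    share = count / len(comments)
    return (ip, share) if share > threshold else None

# Illustrative data: 9 of 10 comments come from one address.
comments = [{"source_ip": "10.0.0.1"}] * 9 + [{"source_ip": "10.0.0.2"}]
print(dominant_source(comments))  # ('10.0.0.1', 0.9)
```

In practice bots spread across addresses, so one would also look at timing, user-agent strings and text similarity; but as the Brazilian case shows, even a single frequency count can expose a naive astroturfing operation.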
>> COURTNEY RADSCH: Okay. One -- can I see everyone who still wants to make an intervention? Okay. And then --
>> VALERIA BETANCOURT: Thank you. This is Valeria Betancourt from the Association for Progressive Communications. I wanted to build on what Carlos exemplified. I actually think we are facing a new phenomenon in some of the countries in Latin America. We actually have a government that is funding, with public funds, 12 centers to dilute the public debate -- the debate that impacts Human Rights and the public interest. It's not only diluting and hiding the real and relevant debate; it's also driving a tendency of criminalizing and prosecuting journalists and bloggers and regular people who have an opinion on what is going on in the country. And they are being publicly embarrassed and shamed for doing so. In my country, journalists have even been declared public enemies. So I think we are actually seeing a more sophisticated version of institutionalized strategies to censor critical voices. And we need to get more sophisticated in our responses as well.
And looking forward, I do think we need to find strategies to reclaim the space, particularly social networks. I don't know how -- I don't have the answers -- but for me we really need to come up with some tools and some effective strategies to reclaim the space in a way that will allow us to keep using the Internet, and particularly social networks, particularly Twitter, as a way to keep making critical voices and dissent visible.
>> COURTNEY RADSCH: Thank you. And I think that could be one thing this Dynamic Coalition could help do, is actually help identify some actual practices, best practices and worst practices. Can we get -- you have the mic?
>> RACHEL POLLACK: I didn't introduce myself before. I'm Rachel Pollack. I work at UNESCO, and I'm here with the European Summer School on Internet Governance. I want to respond to the point about judges and the UN not doing anything, just to clarify that UNESCO has actually done quite a bit of work in training judges on issues related to freedom of expression and the safety of journalists. We've also developed a MOOC, a massive open online course, for lawyers and judges in Latin America on these issues. So there is work being done.
And then on nonlegal responses, our study on online hate speech showed that there are efforts towards positive speech -- for example in Myanmar, which has been very successful, led by civil society, and also in Kenya, tracking violence during election periods and how threats online are translated, or not, into the off-line world. So those are some things that could be looked at, as well as the role of Internet intermediaries and self-regulation, and developing best practices to foster freedom of expression online. Thanks.
>> KARMEN TURK: Thank you for that. I have a question. Does anybody have any issues regarding algorithmic censorship? Can we do something about it, since the algorithms are a business secret for all of the platforms? Does anybody have best practices, any bright ideas on how to tackle the issues of algorithmic censorship?
>> COURTNEY RADSCH: Something we talked about at the Online News Association was this idea: newspapers and media organizations have ombudsmen -- some, not all -- so what about the idea of an algorithmic ombudsman? I think this poses a problem because these are trade secrets, and you don't know what's being excluded or censored and how they're weighing choices. And there have been lots of reports about the types of ads that certain groups of people -- minorities or whatever those groups are -- are getting on, say, Facebook or Google ads, and the racial profiling that happens with that, et cetera.
So it does seem like algorithmic censorship is a problem, but algorithms could also be a way to address issues like online hate speech without removing content. I think there was a study in Europe -- it was around the right to be forgotten -- showing that 90% of Europeans don't go beyond the first page of Google results. So it seems like there is some middle way between removing content and just deciding that, okay, this is the only possibility for the algorithm. So maybe that's something to explore.
And I guess what would be great -- we have about ten minutes left -- is to get, first of all, any last comments. Please do raise your hand --
>> Can I have a word on this --
>> COURTNEY RADSCH: I can't see who's speaking.
>> No, because -- that's a very good point on transparency, and I think all of us who work for Internet companies need to invest more in it. We do it with transparency (?); the algorithm is not so mysterious -- there is a lot of literature about that. I would see this the other way around: how can we help Civil Society to have better-quality Web sites, in terms of content, so they can actually be more visible through all the different tools? Because again, it's not only (?); there are different social media, different tools available online to make content available. So I would actually like to see, maybe through this coalition, a brainstorm on how to help Civil Society communicate their activities online in a better way and be more visible.
>> DOMINIC BELLONE: I've talked to activists from both the Middle East and Latin America about programming concerning data collection on censorship, particularly around elections, whereby certain Web sites -- online journalism sites or different Civil Society sites -- would be monitored by computers to track censorship during election periods, so that you could actually map attempts by the government to shut down critical voices and reporting around elections. So that's one interesting idea that I've heard.
Also, getting back to raising awareness about what is actually being censored: I think it would be very interesting to have some kind of programming where, in certain countries, you evaluate what is actually being censored -- the direct URLs and such -- and then try to raise awareness around that. Getting back to the point from Lebanon: sometimes people don't know what they don't know about censorship, so we should work to build capacity around what is actually being censored, the actual URLs.
>> COURTNEY RADSCH: That's a great point, and I think the broader point it points to is for the technical community to assist Civil Society. So instead of relying on typically less-funded Civil Society groups to figure out what's being censored, proactively provide that information. Internet platforms and social media platforms should be very proactive and transparent. Obviously the adoption of transparency reports is great -- there's more to be done, but those are some great first steps. Working together on these issues, I think, is core: adopting a multi-stakeholder approach.
In the last few remaining minutes, we'd like to get a sense from the people in this room of how interested you are in being involved in this Dynamic Coalition and contributing to a better understanding of best practices, using a variety of case studies and specific examples. We have had some really great examples thrown out in this room, and one of the things we'll be doing is reporting out on this session, but I think there's a lot more that can be done, and we were thinking that for next year the paper we'd like to produce is essentially a set of solutions for combating these types of censorship.
Are people in this room interested in contributing to that sort of work, or are there other priorities that you think this Dynamic Coalition should have? We thought censorship -- we've been discussing this and working with people in the room -- is an issue that really cuts across themes and is also relevant to other Dynamic Coalitions, and might be able to contribute to some of those debates as well.
>> KARMEN TURK: And when we say censorship, we really do mean it in its widest sense possible, because the forms of censorship online might not be the ones we have traditionally been used to seeing as censorship. There are new forms emerging, I guess, every single day. So we were thinking of providing a template, or just collecting your ideas, your issues, your best practices, worst practices, solutions from your countries and regions, and really trying to put it together in a report on best practices for the next IGF. So if more than half of you raise your hands, then we will do it. We will find a good platform for doing that without you needing to give away your data or be surveilled -- just state your issues. So if at least half of you raise your hands -- please do -- then we will take this up, and we'll ask you to join the mailing list so you can get info on that. So hands up, who thinks this could be done and should be done?
>> COURTNEY RADSCH: Is there anyone who thinks we should adopt a different agenda? Okay. Great. We love consensus at the UN, great. (laughter) So we've got the agenda. You've got the email address list to sign up here. Remote participants, do we have any interventions from remote participants?
>> KARMEN TURK: No.
>> COURTNEY RADSCH: I think most people were using Twitter. So I think with that we can end a couple of minutes early -- why not, you can get a jump on the snacks. I want to thank you all so much for coming here and sharing your experiences, and also for listening to everyone else's experiences. I know not everyone wants to share publicly in this forum, but I hope that you will join the mailing list. You're welcome to just observe and be part of it -- I've certainly done that in a couple of Dynamic Coalitions -- and you're welcome to be actively involved. Thank you to everyone who raised their hands, thank you to everyone who participated up here on the panel and to those in the audience, and if you could please give yourselves a round of applause right after --
>> KARMEN TURK: Just a really, really small comment to end this. On this mailing list, feel free, if you have a really nice case in your country or a law initiative or whatever, to share it with others. I know we all have a lot of email correspondence -- we all hate it, I'm sure we all agree -- but still, it's a list. So do share. Do ask questions, and let's try to restart this Dynamic Coalition, which is one of the oldest at IGF and has been kind of dormant for the last few years. With your help we hope to do that. Thank you for being here, and thank you to the panelists who came to share their ideas with us.