IGF 2021 – Day 1 – WS #276 Reinterpreting Free Speech Guarantees for the Digital Era

The following are the outputs of the captioning taken during an IGF virtual intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> Hello.  Can you all hear me, please?  Great.  Great.  All right.  Thank you once again for joining the call.  This is Reinterpreting Free Speech Guarantees for the Digital Era.  As soon as you can hear me speak, we will very soon be connected from the venue.  Thank you very much for your patience. 

    (Captioner standing by).  (No audio).

   >> MODERATOR:  Hello.  Welcome all to a session that we will try to take over together now.  And let's try to make good use of our free speech here.  I work with the DW Akademie and local community networks, and I think from this perspective free speech is very dear to us.  We are looking a lot into what free speech means from a community-level perspective.  But first of all, I think the idea would be to hear the people who are supposed to talk on today's panel ‑‑ unless there is somebody else in the room who would like to say something about the dynamic or wants to give an improvised presentation ‑‑ or should we start with the panelists? 

Okay.  So I would like you to present yourselves.  Though we cannot do this favor.  Who would like to start, among those already attending online?  Do we have any feedback from people online?  No one? 

     Okay.  Then I would like to welcome our friend from Free University Moscow.  Maybe you can share with us a little bit about what free speech means to you these days. 

   >> Yeah.  Actually, I wanted to listen, but I also want to raise some concerns.  Because the Internet makes speech much faster.  And usually when we were talking about free speech, we were talking about freedom of the press ‑‑ the possibility to print something and bring it to the store, or the possibility to walk out and express yourself. 

     But the Internet changes this situation really dramatically.  Because nowadays, everyone is in your backyard.  So you have a nice comfy backyard with a barbecue and friends.  Republicans and Democrats came from the United States.  Nazis and Antifa came to your backyard, and everyone wants to express themselves in a place which you expected to be yours, and actually we need to understand what to do.  Because Governments in most cases try to regulate this space ‑‑ you know, like the NetzDG.  In Russia we have blocking, marking, whatever you can imagine. 

     Some social networks are policing and moderating content automatically.  The funny part is that they try to protect some minority groups from harassment, but it also affects freedom of speech.  An example from Russia: people with certain family names, which sound like an offensive term for Ukrainians, got banned automatically just because of their family name.  One of them was a famous fiddle player.  But it was something affecting them. 
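
     A minimal sketch (in Python, with an invented blocklisted term and invented surnames; no real platform's filter is known here) of why purely automatic name matching produces exactly this kind of false positive, and how a whole-word check avoids this particular case:

```python
# Hypothetical example: "slurword" stands in for an offensive term and the
# surnames are invented.  Naive substring matching flags any name that merely
# contains the term; a whole-word check does not.

BLOCKLIST = {"slurword"}


def naive_flag(name: str) -> bool:
    """Flag a name if a blocklisted term appears anywhere inside it."""
    lowered = name.lower()
    return any(term in lowered for term in BLOCKLIST)


def whole_word_flag(name: str) -> bool:
    """Flag a name only if a blocklisted term appears as a separate word."""
    words = name.lower().split()
    return any(term in words for term in BLOCKLIST)


if __name__ == "__main__":
    for name in ["Slurwordov", "Ivanov"]:
        print(name, "naive:", naive_flag(name), "whole-word:", whole_word_flag(name))
```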

     So one more issue is that everyone is speaking in your backyard.  And the other issue of freedom of speech is that tech companies and Governments try to protect you from this.  We cannot find the balance easily. 

     And I really want to hear from you, from your community, your views on this ‑‑ how you see it.  We can find solutions, but at least we need to speak and exchange opinions.   

   >> MODERATOR:  Yeah, good point.  For those who came late: the Moderator of the session did not show up, and the panelists online were also nonresponsive at the beginning, so we somehow took over the session.  I don't know if any of the original panelists are available yet.  Maybe we give the question to those here: what are your experiences with free speech in the digital era?  Would you like to share an experience or an example?  Please, come. 

   >> Yes.  So I'm Xavier.  I work for a network, a collective of citizens and of NGOs, based mostly in Europe but also in other countries in the world.  What we do is counterspeech.  And what we see online in digital spaces is that speech is not that free, even in democracies, because a lot of people are being targeted, are being attacked.  If we want real free speech, every category of the population should have access to public debate and to digital and public spaces under the same terms.  And what we see is that this is not the case.  It is a minority of people ‑‑ people who are more, let's say, assertive, more aggressive, more violent ‑‑ who have the floor and have the space to discuss.  So it's a bit hard to say at the moment that digital spaces are free.  In France a lot of people are self‑censoring, and there is a majority of people who do not participate, who never write anything, put up content, write comments or discuss online.  So I think we need to reconsider our definition of free speech, assess the situation, and make digital spaces safer in order to achieve that goal.   

   >> MODERATOR:  Do you have any findings or suggestions on how to improve this?  Because this is similar to what he mentioned: the haters are taking the space and don't leave room for others to talk.  How can you incentivize people to express themselves in digital spaces? 

   >> It is a matter of feeling safe and being safe ‑‑ and of who enforces that safety.  There is regulation from the Government, and there is the participation of the platforms.  At the moment the platforms are not regulating themselves; they don't have incentives to regulate themselves, at least economic ones, because it makes money to let hate and misinformation proliferate.  So that's a big concern. 

     So I guess all the actors should be involved.  When you are in a bar, for example, if you have a minority of guys who are attacking other people, assaulting them verbally, you would have the manager or the owner of the bar say something, or you would have the other clients, the staff.  So there is peer pressure, there is social pressure, there is control ‑‑ there is a frame, and this frame we don't always find in digital spaces.  So I think we need to empower people.  Maybe see what's happening on Wikipedia, what's happening on certain social media where you have people who have the capacity to organize, to moderate, to allow access ‑‑ or to not allow access ‑‑ to people who are verbally violent and so on. 

    So I guess it's ‑‑ it requires the participation of every level, you know. 

   >> MODERATOR:  Thank you. 

   >> But who will be those managers who will moderate or organize?  Because a lot of people come not from Western European countries where the managers are good.  A lot of people come from countries where the managers are bad, where they are definitely abusing these possibilities against minorities. 

     And the Internet in this case allows different forms of expression for everyone.  But your opinion can be lost, whether through thematic moderation or in the flood of information.  Or it could be removed, not by corporate, automatic moderation but by content removal by the Government. 

     And when the Universal Declaration of Human Rights was written, there was no medium like the Internet ‑‑ so universal, so fast, but also allowing such moderation capacities. 

     I don't know.  Should we change the Universal Declaration of Human Rights?  Should we add a part to it?  Or should we have recommendations for illegal or illicit content, or something like this? 

   >> Yes.  So it is just a suggestion; if there are other people with other ideas I would be really glad to hear them.  For me, for example ‑‑ I'm one of the cofounders of the French‑speaking group, which is part of this network, these IMEA groups.  And because we are dealing with hate speech, with discussion, with public debate every day in the comment sections, we see what's happening on Facebook.  We know less about the other social media or other Internet spaces where there is discussion. 

     But what we see is that there are people who, if you give them the possibility ‑‑ it all comes down to democracy.  I believe in democracy, and that is also what we do with IMEA; the other people, the activists who are part of this network, believe in democracy.  So it boils down to that basically: how do you vote, how do you elect, in fair elections, the people who could be the Moderators of the group and could moderate the social network, for example.  You try to promote that, to foster this kind of civic tech.  We are talking about democracy, but how do we bring democracy into digital spaces?  There are examples to take from Wikipedia, for example ‑‑ it is really a community‑led initiative.  So all these things ‑‑ I think we need more bottom‑up initiatives. 

     The role of the Government is super important, of course.  But as you said, every Government should respect the international standards, the Universal Declaration of Human Rights.  So that's the common denominator.  So yeah, I think there is this discussion to have: not everything should be in the hands of the Government, but there is a need for regulation.  What is illegal offline should also be illegal online.  You cannot commit a crime in the streets; why would you commit a crime in a digital space? 

    So it's a complex question.  I hope that's a part of the answer. 

   >> MODERATOR:  Okay.  Thank you.  And opening up again, I think, to all of us who are here ‑‑ I don't know who would like to share an example or experience.  Until now we were talking about spaces that were taken by ‑‑ yeah.  Okay.  Please.  Please sit here also. 

   >> Well, I think, yeah, those questions are really complex.  I'm Vladimir from Article 19.  This is kind of a déjà vu, in the same place with the same people.  But I'm coming from Mexico.  And in the Latin American region, just recently, the Inter‑American Human Rights Commission pointed out that we are at an inflection point on freedom of expression on the Internet.  They are making this big call and big consultation with Civil Society and different actors.  Yes, we have standards.  We have Human Rights standards, we have Inter‑American Human Rights standards, we have stronger protection of freedom of expression if we compare to Europe, because of the history that we have and how we built our democracies, and we have set principles: you cannot have prior censorship, and there are other standards and thresholds that really protect freedom of expression. 

     But yes, there are new challenges.  Disinformation is one of the challenges.  Hate speech is another of the challenges.  But we have to see this in light of the cultural backgrounds, the linguistic backgrounds.  Social media companies are not taking this language diversity into account when they are moderating.  Some 70 or 80 percent of moderation is done in English.  What we are seeing in Mexico is that one indigenous community was publishing something in their own language, and the platform took down what they were saying just because it is in another language that it does not understand. 

So I think this must be, and now it is, part of the discussion: we need these new standards, or rather we have these higher thresholds to protect freedom of expression as a Democratic element and a Democratic piece for societies ‑‑ but how do we see them in light of this new phenomenon, these changes that we are seeing?

And I think, from my perspective, there are at least two ways.  One: I believe in oversight, and in how multi‑stakeholder oversight of companies and Governments can somehow strike this balance ‑‑ to guarantee free speech, to guarantee due diligence, to be able to appeal when you really think that a social media company is taking down your content illegitimately or taking certain decisions; to reinforce transparency and accountability in the way they are acting; and the explainability of how their algorithms are working ‑‑ perhaps not from a technical perspective, but so that we really understand how they are working. 

And the second point: I think there might also be a chance to move on to a decentralization of content moderation ‑‑ unbundling these big powers of the social media companies, and starting to create different companies or different ways in which someone will host the content, while others will provide certain specific services, for example for content moderation. 

     And then you start reducing somehow the power of these big companies, and really having this approach of protecting freedom of expression while at the same time tackling hate speech, misinformation and some other issues that may come up. 
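
     A minimal sketch (in Python, with invented class and term names; this is not any existing platform's or standard's design) of the unbundling idea just described: one party hosts the posts, while interchangeable third‑party services decide what gets filtered, so communities or users can swap the moderation layer without changing the host:

```python
# Hypothetical illustration of hosting separated from moderation.
from typing import List, Protocol, Set


class ModerationService(Protocol):
    """Any third-party service that decides whether a post may be shown."""

    def review(self, text: str) -> bool:
        ...


class PermissiveModeration:
    """A service that shows everything."""

    def review(self, text: str) -> bool:
        return True


class KeywordModeration:
    """A service that hides posts containing community-chosen banned words."""

    def __init__(self, banned: Set[str]) -> None:
        self.banned = banned

    def review(self, text: str) -> bool:
        lowered = text.lower()
        return not any(word in lowered for word in self.banned)


class Host:
    """Stores posts; the moderation service plugged in is interchangeable."""

    def __init__(self, moderator: ModerationService) -> None:
        self.moderator = moderator
        self.posts: List[str] = []

    def publish(self, text: str) -> None:
        self.posts.append(text)

    def feed(self) -> List[str]:
        return [post for post in self.posts if self.moderator.review(post)]


if __name__ == "__main__":
    host = Host(KeywordModeration({"hateterm"}))  # "hateterm" is a placeholder
    host.publish("hello world")
    host.publish("a post containing hateterm")
    print(host.feed())  # only 'hello world' survives this moderation layer
```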

     So, just throwing some ideas around, and recognizing that yes, there is a big challenge.  I would say freedom of expression is not an absolute right, but we have to be very careful when we are talking about certain restrictions, because sometimes Governments use this to persecute journalists or to silence dissent and critical voices.  So if anyone wants to add to this complexity, it would be really great. 

   >> I think we have been hanging out too much, because I wanted to say something really similar.  I had a point about the languages, because I think that's something where we need to be more inclusive.  And I need to be taller.  That's okay.  Thank you. 

     That's fine.  I had two examples.  One from a colleague in Lebanon: because, as you said, I think Facebook moderates about 80 percent in English, and Arabic content has been censored 50 percent more than other languages because they don't have the bots, for example, to read Arabic, so they just censor easily at the first step.  And I have another example from Ethiopia, where the same happened with local languages.  These are really good examples to show that we need better monitoring from social platforms, we need more engagement from social platforms.  And I thought your decentralization idea was a good one.  But we also need more regulation for private companies ‑‑ I know that's nothing new.  I have more, but I think I can just say thank you, Vladimir.  Thank you for your comments.     

   >> MODERATOR:  Would anyone else like to add to this, or does anyone have further questions or remarks on what was said?  I don't know if any of the panelists are online now?  Are we online?  Is this going out? 

   >> Yeah, we are online. 

   >> MODERATOR:  Okay.  Yeah.  Cool.  While you are thinking, I also have a question, or kind of a comment, that I brought to the session: how far is freedom of speech also related to access?  To express yourself freely in a digital space means in the first place that there should be the possibility to have access and to move around.  So language is one thing, but there are preconditions that are maybe more related to freedom of speech than before.  Although if we think back, it is a bit the same: you can stand up and speak at the corner of the street to people who come by, but this won't be the same thing as talking on the radio.  So how is freedom of speech related to this access?  If you want, share some examples or experiences of how this relation was affected in the time of the pandemic.  This would be of interest to me.  Please also put other questions or ‑‑

   >> I have a comment on what the previous people here said, because there was this motif that I think emerged from everyone: that we should find some Democratic way to, in a sense, control what people say ‑‑ or rather, to find a way to express ourselves democratically.  My name is Marsi.  I'm a University student at the University of Dusque, in my last year of journalism.  I had a class with a guy who was a director of one of the most popular Polish news websites.  And he said that every time they published an Article about Israel, for example, they had to disable comments, because there was so much hate that it was impossible to moderate at all. 

     Because people on the Internet don't behave like in real life.  Here, for example, no one would express those opinions, but online there is anonymity.  There is this site ‑‑ it is probably known already ‑‑ and it has this mechanism of controlling the comments: it has likes and dislikes, and they are scored as points.  So if you have five likes and three dislikes, you have a balance of two points. 

     And the comments with a lot of dislikes are, in a way, disabled from the conversation ‑‑ they stop popping up.  This is in a way a Democratic way of doing this.  It does limit free speech, because not everyone can say what they like, but you don't have hate comments, because ‑‑ let's assume that most people on these sites are normal, so they don't condone hate speech. 

     And this mechanism doesn't exist on Facebook, for example.  I don't think there is any way to express dissatisfaction there.  You have reactions, but they don't work as negative votes, because you can get angry about news that angers you while the news itself isn't hate speech.  So, as an experiment, I would like to propose this way of limiting comments by democratically evaluating them: if they reach a threshold of negative points they are taken out, or just hidden, and you have to agree to see the hidden comments.  Okay. 
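
     A minimal sketch (in Python; the threshold value and field names are invented, since the site and its exact rules are not named) of the scoring mechanism just described: each comment's score is likes minus dislikes, and comments below a negative threshold are hidden unless the reader explicitly asks to see them:

```python
# Hypothetical illustration of like/dislike scoring with a hide threshold.
from dataclasses import dataclass
from typing import List

HIDE_THRESHOLD = -3  # assumed value; the real site's threshold is not known


@dataclass
class Comment:
    text: str
    likes: int = 0
    dislikes: int = 0

    @property
    def score(self) -> int:
        # e.g. five likes and three dislikes give a balance of two points
        return self.likes - self.dislikes


def visible_comments(comments: List[Comment], show_hidden: bool = False) -> List[Comment]:
    """Return comments above the threshold; hidden ones only if the reader opts in."""
    return [c for c in comments if show_hidden or c.score > HIDE_THRESHOLD]


if __name__ == "__main__":
    thread = [
        Comment("useful reply", likes=5, dislikes=3),   # score +2, shown
        Comment("abusive reply", likes=1, dislikes=9),  # score -8, hidden
    ]
    print([c.text for c in visible_comments(thread)])
    print([c.text for c in visible_comments(thread, show_hidden=True)])
```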

   >> Let me respond to this.  First of all, such seemingly Democratic ways of resolving conflicts or hate speech don't work, because new technological ways of realizing democracy ‑‑ of deciding which opinion should be stated ‑‑ don't work well.  It's very well known, since the classical Forums: if one of the sides becomes more technologically advanced, they will bring what can be called bots, automated voices, and so they will find votes for their opinions. 

     So the general concern ‑‑ and a thing that we should write down as an outcome of our session ‑‑ is that any automated moderation, any automated democracy, is a very doubtful way of reaching decisions.  But you mentioned Facebook, and that brings another concern, because on the Facebook news feed you nearly never decide what you see. 

     Mark Zuckerberg decides what will be shown to you, among advertisements, among recommendations.  Because for Facebook you are the goods it is trying to sell to the advertisers.  And that's another issue, because many sessions here ‑‑ nearly not all, but many ‑‑ are discussing how to bring accountability, including to these algorithms. 

     And this is also a concern of free speech, because on such big platforms you are not deciding what you see.  There was a real, huge advancement when websites started using RSS technology to provide their news in machine‑readable form, so you could configure your own RSS feed to get the content that you want. 
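
     A minimal sketch (in Python) of the kind of self‑configured RSS feed the speaker describes: the reader, not a platform's algorithm, chooses the sources and topics.  The feed URLs and keywords are placeholders, and it assumes the third‑party feedparser library is installed (pip install feedparser):

```python
# Hypothetical feeds and keywords; only illustrates reader-configured filtering.
import feedparser

FEEDS = [
    "https://example.org/news/rss.xml",    # placeholder URL
    "https://example.net/world/feed.xml",  # placeholder URL
]
KEYWORDS = {"free speech", "moderation"}   # topics the reader chose


def my_feed():
    """Collect entries from the chosen feeds that match the reader's keywords."""
    picked = []
    for url in FEEDS:
        parsed = feedparser.parse(url)
        for entry in parsed.entries:
            text = (entry.get("title", "") + " " + entry.get("summary", "")).lower()
            if any(keyword in text for keyword in KEYWORDS):
                picked.append((entry.get("title", ""), entry.get("link", "")))
    return picked


if __name__ == "__main__":
    for title, link in my_feed():
        print(title, "->", link)
```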

     But as far as I can tell, a few years ago websites stopped doing this, because it was stealing their advertisement money, and they started working like Facebook: they show you what they want to show you.  And this is another thing we should mention about this digital era, because it has become very difficult to access the information you actually want.  And again, that's another issue.  One of the speakers talked about distributed moderation ‑‑ no, it doesn't work.  There is a beautiful federated technology, realized I think in Mattermost or something like that, where a federation of chats was realized.  But this technology, like the bots I mentioned, and like any other technology, is usually used by evil actors first.  So this federated chat system started being used by Nazis: they found that if they are banned on Twitter and Facebook, they can use these federated systems.  As far as I remember, Google then started removing the applications that were clients of such systems. 

     So it is another concern, because any technological thing ‑‑ back to Phil's point ‑‑ is usually used by evil actors first.  Like all these financial systems: credit cards and whatever else are being used by cheaters.  Previously they attacked you only if you had money, if you were a wealthy target; now the Internet allows them to spread illegal activity across the whole Internet.  And big corporations have to protect themselves, or protect you. 

     That's also something we have to reinterpret somehow.  It is still not clear.  We do not have representatives of the huge corporations here to hear them, but maybe we could raise such concerns to them during another session or in our communications with them. 

   >> MODERATOR:  Thank you for the answer.  Any other responses, or new questions that you would like to throw into the ring concerning free speech in digital times?  For those who came late: we don't have the foreseen Moderator and panelists.  This session is up to us, and we are just trying to make good use of the space provided here. 

    Okay. 

     Okay.  The question was, if it was not audible: could people who are present in the room and who come from countries that have restrictions on free speech, or who have experience with such violations, raise their hands? 

   >> I can say Mexico is one of the most dangerous places for exercising journalism.  Every 12 hours there is an aggression towards a journalist exercising the right to freedom of expression.  We have a Government that is stigmatizing and eroding public debate, also by pointing at journalists: when someone makes a critique of the Government, they are pointed out, there is blaming and shaming, and then we have this huge and massive response of viral attacks towards journalists ‑‑ a way of looking to silence them. 

     And, worth mentioning when talking about regulating: I was thinking of a recent attempt in Mexico to regulate social media platforms.  They were doing it in such a bad way that they were saying, okay, now the telecommunications regulator is going to be in charge of taking down Fake News.  And it is like, okay, what is Fake News?  We don't know.  They used this very vague and broad concept. 

     And they would have to take down all the hate messages.  And again it was like, okay, what do you mean by hate messages?  Because "Fake News" can be something that a journalist is investigating and revealing in the public interest.  It might be a true story, but sometimes Governments are just hiding things, or looking for a way to take down this information, by calling it Fake News from their opponents. 

The same goes for hate messages.  We were saying, from Article 19, that we have to be very careful, because when you are thinking about freedom of expression there are certain types of expression that are protected.  Some offensive language is protected, because you have satire, you have political cartoons and some other things that might not be according to your beliefs but are protected.  So when you say "hate messages", sometimes it can be tricky. 

     Or it can be a really worrying approach.  And we have that in other countries in Latin America: in Brazil they have a discussion, a bill on Fake News, in the same way ‑‑ remembering also that it now has one of these Presidents who constantly attacks the critical media.  In Chile they also have this discussion.  That's why the Inter‑American Human Rights Commission said, okay, we first have to think about what the standards should be for companies and for states, and how they should approach this in order not to further restrict societies that historically have been fighting for their freedoms and for free speech, shaping their democracy. 

So I think, just to point out, there is this complexity when we are approaching certain themes, be it disinformation or hate speech, because sometimes this can be used by Governments in the opposite way, and not in the ‑‑

   >> Exactly the same situation in Russia, by the way.  And I think the technological means and legal means are a bit different in Russia than in Mexico and Latin America.  But as I said, evil actors, even Governmental actors, start abusing the Internet first.  Maybe Democratic countries like the United States or France are self‑healing in this case, but non‑Democratic countries try to protect democracy with evil things.  In Russia, regulations related to hate speech and comments make life very difficult for licensed media if they allow comments on their Articles.  So media are just turning off the possibility to comment on any Articles on their websites. 

     Another of these evil actions of a Government "protecting democracy": one of the Russian proposals is to authorize commenters through technological things like the state services website, with official log‑ins, something like that ‑‑ all in protection of democracy, though not against hate speech.  Hate speech is not a very popular thing for the Russian Government to fight; the stated aim is to protect people from terrorist or extremist content, and not just hate speech but hate speech based on nationality or origin.  So in different countries, with strange regimes, the legal and technical means may differ, but the purposes, and especially the consequences for free speech, are the same. 

     And by the way, even though we took over the session, I see there are ten participants online.  Dear online participants, feel free to ask questions or raise your hands.  I am monitoring the Zoom session, so you will have the possibility to speak as well.   

   >> MODERATOR:  Yeah, and maybe to add: we don't have to go to Russia or Mexico, I think.  Freedom of speech is under threat in most countries; there are minor restrictions in many places.  In France, if I'm well informed, you cannot access certain pages because certain technology or content is banned ‑‑ it just proceeds without any discussion, and you can question these kinds of measures.  Or in Germany there was a large debate about upload filters: since the providers or platforms are made responsible for the content, sometimes they will take down information that is actually protected as free speech. 

So what happens then?  How can they be empowered to do something like that?  You as a user are not in a position to react.  So there is also a shrinking space of freedom of speech that you can see in different countries, in different contexts.  Maybe some of you would like to share another example of this, or have another question.     

   >> You have heard examples from France and from Poland.  So don't be shy.  Even if you are really pleased by what your Government does, or by what's happening in your country with freedom of speech, just give us a positive example. 

     Don't be shy.  Anyone who wants to be heard ‑‑ the floor is given. 

   >> Hi.  I don't know if I can be heard. 

   >> Yes, we hear you. 

   >> Okay.  So I'm a student from DeVry University in Brussels, a digital communication student, and I'm attending an Internet censorship course.  We are writing a paper about self‑censorship, and we were wondering about your thoughts on how digital platforms are shaping the self‑censorship climate in these two countries.  Any thoughts on that?  We are doing preliminary research, and I have done research on the U.S. side, where I know that platforms are not legally liable for what people are actually saying online.  Hate speech thrives there, and so do echo chambers, because you can actually choose what to listen to on these platforms.  But on the other side of the world, China ‑‑ we are wondering how that country is managing this.  So any feedback on that would be appreciated. 

   >> Anyone from the U.S.?  Maybe online?  Maybe somebody who knows the real situation in China or the United States. 

   >> I already talked about that, but I can give examples about self‑censorship in European countries.  That's what we witness from the members of our network, #imenetwork ‑‑ normal men and women of all political inclinations, mostly women actually.  And it really fell into line recently with the Facebook Papers: we were not surprised that among those revelations there was an internal Facebook study showing that most of their users were self‑censoring.  And often it is women, and the categories that are targeted in online spaces ‑‑ the LGBTQ community, for example, People of Color ‑‑ who are attacked.  So yeah, self‑censorship is real and it is everywhere.  We always talk about this right of freedom of speech, but there are several other Human Rights that exist, and one of them is the right to safety.  The right to safety of a woman of color is not less important than the right to freedom of speech of a white male, for example. 

     So yeah, this is really something to take into account.  And it is great ‑‑ I'm super happy to learn that you are doing this research about self‑censorship, because it is so important and it is happening everywhere.  We need to better assess this and bring citizens back to online spaces.  We are in about 18 countries, and in all of them the testimonies we get are like: I didn't have the courage to go online because I'm going to be attacked, I'm going to be threatened, I'm going to be humiliated, I'm going to face digital violence.  And we know that digital violence targets women first. 

    So this is really a huge issue to tackle. 

   >> Maybe you already have some results of your research and you can share with us. 

   >> Rodessa:  Yeah, I will look it up.  But just to give feedback on what was said: I agree, and that is part of what we are looking at.  Despite the fact that we claim these are Democratic countries, people are conforming to the majority ‑‑ in the case of the U.S., for example, there are instances where people are silenced or pushed into a certain conformity because they don't want to ‑‑ so we're looking at that. 

Let me look at my file.  I don't know if I can ‑‑ so, essentially ‑‑ I don't know, because this might be too theoretical ‑‑ we are using the pathetic dot theory as a framework.  We are looking at its four aspects: the law, that is, free speech legislation and legal restrictions; the norms ‑‑ in this case how norms in the U.S. and China can influence behavior on digital platforms; the platforms' terms and conditions; and the market.  And we chose the U.S. and China because they are home to some of the biggest platforms ‑‑ GAFA, Google, Amazon, Alibaba.  We are here to actually, hopefully, learn more from the panelists. 

     So we're also looking at Panopticon theory: on platforms you kind of learn to self‑regulate your expression, because you feel you are being watched all the time, and whatever you say can be recorded or traced back to you.  So we're looking at those theories and frameworks, comparing the cultural aspects and how they impact how healthily a citizen can actually enter debates on platforms. 

    Yeah. 

   >> Yeah.  Thank you, Rodessa.  I think it is really great.  Something that we have seen in Latin America and in Mexico ‑‑ and I'm going to mention three cases.  The first is the creation of what we call silenced zones, in which the extreme violence against journalists produces a chilling effect, a self‑censorship, so that they basically stop covering anything related to organized crime, drug dealing, corruption, or whatever is going on in a certain state.  Journalists just stop doing their work because they are facing threats, because they are facing security risks. 

    So we have like big states just like being silenced. 

     The other thing I was thinking of, perhaps related to one of the last comments, is what happens to women journalists when they publish investigative journalistic pieces: they are stigmatized, they are pointed out ‑‑ pointed out by the President in our country ‑‑ and they face this viral hate, these viral aggressions.  When I was interviewing some of these women journalists, they told me: I had to just remove Twitter, remove any social media, just step back, and now I think twice when I am conducting an investigation or publishing.  So that also creates this great self‑censorship, of not expressing themselves. 

After a time, and in a certain moment, I think counter‑narratives are very important, solidarity is very important, and so is how other women journalists create a network of support ‑‑ I think that's super important.  And the last thing is how certain legislation can create self‑censorship and limit freedom of expression: if you say something, you are going to be criminalized, or you are going to get a fine, or you are going to be put in jail. 

Just to mention: recently there was a decree approved in Cuba which has this chilling effect, this self‑censorship effect.  If you are going to use social media to publish something against the Government, you are going to be persecuted and you are going to be criminalized. 

     So it's trying to silence you in an indirect way ‑‑ perhaps, in the case of Cuba, by not letting you out of your house, also restricting your right to movement.  But it is also about thinking of how certain legislation can push for self‑censorship.   

   >> Let me continue with the issues of the digital era and self‑censorship.  In the pre‑digital era, if you said something on the street or in the kitchen, it might be forgotten very soon.  But in the digital era, anything you say is stored forever.  We definitely know the cases of a lot of people cancelled for racist or male chauvinist statements in the United States ‑‑ they wrote them on Twitter years ago, then they became politicians and now they have issues. 

     But those are light cases.  In Russia now, we have criminal charges arising against people for something they said or wrote 10 or 15 years ago, and it is done in a very easy way.  One of the best‑known cases is a stupid song about a terrorist act by a then standup comedian ‑‑ and we know standup comedians are not very smart in some cases.  He wrote this song and put a clip on YouTube in 2012. 

     But nowadays, even though the time limit for criminal cases had already passed for this case, a kind of victim controlled by Russian law enforcement found this video clip this year or a year ago, and the criminal charge arose.  This guy had forgotten he wrote this song, and now he is in jail.  And this is an example of why everyone will really think many, many times before they express themselves.  So that's another concern for freedom of speech in the digital era: anything you say will be recorded, and you never know how it will be interpreted in 10 years, in 20 years. 

Okay.  I will just remind you that because the organizers of this session haven't shown up, we just took over this session.  If you have something to express about changing freedom of speech in the digital era, feel free to join us.  I see a colleague from Finland.  Even if you have no concerns, feel free to express yourself ‑‑ we have had examples from Russia, Mexico, the U.S. and China, Poland and France. 

   >> I'm Leana from Armenia.  I will share a concern about freedom of speech.  It is not a situation like in Cuba, as you described, but lately we have a kind of ‑‑ not criminalized, but a bad situation with saying something bad against the Government.  It is mostly framed as hate speech, in that sense of the word.  But mostly, yes, those who say some bad words, some hate speech, regarding the Government ‑‑ these cases are being charged, with penalties, some monetary penalties. 

     And for the media as well.  The media need to think a lot more before putting something on their websites or their pages. 

     And that also concerns the comments: as we know, on social media people like to put comments under the Articles describing the situation, and the editors of that medium are being charged for those comments.  So they need to somehow, well, censor, shape or moderate: when they see some hate speech in the comments, they take it down.  I don't know whether this is good or bad, but this is the situation.  I agree with you that nowadays, in the digital era, we need to think very carefully about what we put out ‑‑ also for the years to come. 

     I am sorry about that case coming out of the very distant past, because you never know what will happen ten years afterwards, and what laws will be there. 

     So I would say, anyway, that we as the new generation of this digital era need to be really careful about what we say and do.  Thank you.  

   >> MODERATOR:  Okay.  Thank you for your contribution.  And yeah. 

   >> (Off microphone).  So in the digital space, for example, we don't really have a way to get together, to strike or to protest.  And during the pandemic I think it is even more important.  Like a year ago the Polish Government made a decision that makes abortion more restrictive in Poland, and people gathered and protested in the streets in many cities.  The Government tried to break up the gatherings, and in some ways it wasn't safe because of the pandemic ‑‑ it was during the fall, at the peak of, I think, the second wave of Coronavirus. 

And my question was: do any of you, from different countries, have a good example of a gathering in the digital space that was seen by someone other than the people already interested?  Because if you go on the street, then everyone sees you.  Yeah.  Okay. 

   >> First of all, let me give another negative example from Russia, related to freedom of association, Coronavirus, freedom of speech and the digital era.  A lot of people ‑‑ opposition leaders at different levels, middle levels ‑‑ got fined or arrested for violating Coronavirus regulations because of retweeting or posting something about opposition rallies in the streets.  So somebody posts a photo: we now have a protest meeting against an arrest in the streets of Moscow.  And he is being charged under the Coronavirus restrictions as the organizer of an illegal gathering. 

     You see?  He is exercising his freedom of speech by telling what is happening, but he is fined for violating Coronavirus restrictions.  It is a negative example of how a regime can abuse this.  During the previous pandemic year there were virtual protests organized on Yandex Maps.  You use your navigation system on the Smartphone ‑‑ I think Google also allows this, with Waze ‑‑ and you can not only mark an accident, you can also start a chat with a geolocation, just put your message on the map. 

     And I think last year there was a kind of protest organized that way: a lot of people were putting their messages around the Kremlin in Moscow stating that they didn't like the Coronavirus restrictions and the pressure.  You have to be in the exact location where you put your message, but these were examples of safe gatherings. 

   >> (Off microphone). 

   >> Yeah, and recently Colombia has a good example of a combination of protests in the streets with a very significant digital protest: using hashtags, using social media apps, also sharing and documenting police abuse ‑‑ so basically complementing the protests in the streets with a digital exercise. 

Cuba is another great example: on the 11th of July, part of the protests ‑‑ historic protests in the streets ‑‑ also took place as #SOSCuba in the digital field.

     So that's another good example.  And the feminist collectives and feminist groups in Mexico and all over Latin America are such a great example of how digital protest really works when they are fighting against femicides and fighting for their rights ‑‑ it creates a massive digital wave all over Latin America.  We can expect to see these sometimes creative types of digital protest, as in Russia and other countries, used to really draw attention and put demands to the states.   

   >> My name is Raul, working for Electronic Frontier Finland.  I can give you two quite recent examples from Finland.  One was a case where a member of Parliament from the Christian religious party ‑‑ she was sort of the Chair of the party ‑‑ put out some hate speech against gay people.  In doing that, she just used a quote from the Bible, and she got prosecuted for that.  That's fair enough: we have a law in Finland that you can't attack minorities with speech or advocate violence against people from minorities. 

     Another one involved a journalist.  There is a person who is basically openly, very racist, who became a council member in one of the northern cities in Finland.  This council member was called out by a journalist, who called him a Nazi.  And he sued the journalist for saying that.  That has already been tried at two levels of courts, and only the highest one is left.  But already, basically, she has been judged twice for calling that Nazi a Nazi.  So ‑‑

   >> You are talking about possible abuses done by people in power.

   >> Well, this last example, for instance ‑‑ I think it is a scary example, in the sense that if you can't call a Nazi a Nazi, that's kind of stifling for freedom of speech.  And, you know, even if you quote a religious text, it doesn't allow you to attack minorities. 

   >> Yeah, that's really ‑‑ but this, I think, is a cultural issue, so it will change.  A lot of things have changed even since the previous century, not to mention the 19th Century, which we have to take into account. 

    Okay.  This screen shows that we have six minutes left of this session.  So maybe some wrap‑ups? 

   >> It was really great.  I think we covered a lot of different ground, from digital protests to hate speech to the challenges of freedom of expression, to Human Rights perspectives, in different contexts.  So yeah, I think it was really great.  Thank you for opening and moderating this space, and for the other perspectives and other ideas, which I think are needed to continue the discussion. 

   >> Yeah, thanks a lot for the opportunity to discuss.  Sorry for the all‑male makeshift panel ‑‑ next time we will do better and bring some women.  But it was really interesting to be able to discuss all that across all these countries, sharing these experiences.  I think we have a lot of food for thought.  So thanks a lot. 

   >> Yeah, thanks a lot, because we raised a lot of concerns and exchanged experiences, seeing that in some cases our experiences are the same and in some cases they are completely different.  And as part of the IGF we have to continue this discussion, because even if somebody complains that the IGF doesn't make decisions or clear statements, even through these discussions we are moving forward in understanding what we have to do.  And I hope we will figure out how to do it without greater repression and greater wars, just because we are able to discuss. 

   >> Many things have already been said.  One last comment, at least here in the room: it seems we have a full Civil Society panel ‑‑ is there anyone here representing corporate interests or Governments?  If not, then let's take this as a ‑‑ I don't know if it is a lesson, but in this debate about free speech, as we heard, a multi‑stakeholder approach is needed.  There should be a common interest in defending such principles of living together, and I think it is not a good idea to push this only onto Civil Society.  There should be more interest from the other sides in being here as well ‑‑ maybe take this as an invitation for future sessions or panels at the IGF.  Governments and corporations are welcome.  Thank you all.  I don't know if there are other comments from people following online.  If not, thanks also to the technicians and the people who helped make this possible.  That's it. 

   (Applause.)

   >> Thanks.