IGF 2022 Day 1 Town Hall #91 The war in Ukraine and the disinformation war

The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> GIACOMO MAZZONE: So we will start in two minutes.

Okay.  Let's start.  Thank you very much for being with us today for this session.  I see many friends in the room.  This is a Town Hall meeting, and the title is "The War in Ukraine and the Disinformation War".  It has been proposed by EDMO, the European Digital Media Observatory of the European Commission, and Eurovisioni.  We wanted to analyze carefully what has been happening for ten months already in Ukraine, not only on the battlefront, which is not our matter here at the IGF, but on the virtual front.

An interesting exercise to understand how the Internet can be used as a weapon, not only in the war theatre but also in the rest of Europe and the world, to fight the battle for influence over public opinion.  This became a test bed for the recent measures the European Union put in place to fight disinformation and misinformation, such as the code of practice that you know, because it has been presented here in past years but has since been renewed, and the observatories within the member countries, of which EDMO is the central hub.

During these months, EDMO has deployed a task force of experts and fact checkers to monitor and identify the disinformation campaigns made by the forces involved in the conflict and their geographical spread around Europe and the rest of the world.  An experience from which we at the IGF can learn very much, because it is a life-size laboratory of what could happen in the future in a cyber war, beside the real battlefield.

So we have with us today a number of panelists, all from Europe, because we are talking about Europe and the European experience, but we asked to be reinforced by Roberto Zambrana, who is kindly here to support the session.

And among the speakers we have Krisztina Stump of the European Commission, Paula Gori, the Secretary General of EDMO, Claire Wardle, Tommaso Canetta, and then Francesco Sciacchitano.  I will ask them to introduce themselves one by one later, when they take the floor.  Why this composition?  It will be explained later.

And without wasting more time, I will give the floor to Krisztina Stump, because she can give you an overview of what the European Union is doing to counter disinformation within the broader process of regulation of the Internet.

The floor is yours, and you can try to start with the slides.

>> KRISZTINA STUMP: Thank you very much.  I will start my presentation with an overview, and it may be useful to explain that I am the head of the unit of the European Commission that is in charge of the Commission's policy to fight disinformation, and it will be my pleasure to give you, indeed, this overview of what the Commission is doing to fight disinformation.

The approach that the European Commission has devised is seen as unique by many, and it is seen as unique because of several elements, which I will cover in this overview.

So first of all, from my perspective, the EU has a strong toolbox, all rooted in the freedom of speech, and this can be seen in the code.

The EU approach is also an approach that combines self‑regulation with regulation, or is backed up by regulation.  That means that if the code of practice on disinformation is not followed, then we also have a regulatory instrument behind it, the Digital Services Act.

Last but not least, the EU's approach is also rooted in a true multi‑stakeholder approach.  This is very well demonstrated by the approach taken by EDMO, the European Digital Media Observatory, its hubs and the stakeholder community that it is assembling.

So first, going into the code, what is this instrument?  Did we find the magic bullet to fight disinformation and now we put it in practice?  That is not the case.

That is not what I see.  This complex problem actually requires complex solutions.  So instead of one or two measures that can fight disinformation, the code of practice is basically, how I would put it, a toolbox with a variety of instruments that all together can be efficient in fighting disinformation.  And that is where I will attempt to share my screen and hope that it works.  Does it work?

Hopefully yes.

Maybe I have to put it on ‑‑ yeah ‑‑ do you see it ‑‑ yes.

>> GIACOMO MAZZONE: It's not full screen.

>> KRISZTINA STUMP: For some reason it doesn't work.  Instead of trying to fix this, because for some reason, it will ‑‑

>> GIACOMO MAZZONE: No worry.  We can see most of it.

>> KRISZTINA STUMP: The toolbox comprises a variety of instruments.  It actually consists in demonetization, making sure that disinformation is not supported by advertising money, which is achieved by the variety of players involved in the placing of advertisements.  It consists in tools to make political advertising more transparent.  It also consists in tools for reducing manipulative behaviors, so basically commitments by the signatories to counter manipulative behavior in its current and emerging forms, like fake accounts, bots or malicious deep fakes, and we have seen these regarding Ukraine.  Then it contains user empowerment tools, which give more information to the users to identify, flag and act on disinformation, and it provides, in this context, tools to enhance media literacy or to have a safer design of online platforms.

Also, the code contains important measures on fact checking.  This is also important regarding EDMO's activity.  The code contains commitments from the signatories, from the big online platforms, to have fact-checking coverage throughout the EU and also to provide financial contributions to the fact checkers for their work.

Also, last but not least, the code contains commitments from the signatories to give researchers better access to data so that they can carry out their work.  And this is shown by EDMO's activities, which also aim to facilitate access to data.

Then, maybe just a very quick overview: the code also comes with transparency measures, a transparency centre, to make sure that users can consult how the signatories are implementing their commitments.  It comes with a permanent task force that works on the implementation of the code, and the code also has a robust monitoring framework to make sure that its commitments and measures are properly implemented.

Then what I still wanted to show you is actually the multi‑stakeholder approach that is behind the code, notably through its signatories.  This is not just a code for the big online platforms.  While it is, of course, key to have on board Google, TikTok, Twitter, Meta and their associations, smaller and specialized platforms are also on board, because disinformation is spreading there too.

We have the advertising industry on board.  That is important for the demonetization of disinformation.  We have the fact checkers on board, so that their needs are also met and they can help in the implementation.  And civil society and research organizations are on board, as well as players that offer technological solutions to fight disinformation.

I will stop my PowerPoint presentation here, because in any event it didn't work out as I wanted, but I think this gives you a good overview of the code and the multi‑stakeholder approach behind it.  What happens if the code is not followed?  Then, again, it's important to stress that we have the Digital Services Act behind it, which imposes mitigation obligations on the very large online platforms.

However, before I close, I also wanted to quickly go into what the code brings for the Ukraine‑related situation and overall what is the approach of the EU on Ukraine.

Here, preempting some questions that often arise regarding the sanctions imposed on certain broadcasters, it is important to say that this is a very special situation when it comes to these propaganda channels.  Here we are facing war propaganda.  This is not the normal context of disinformation, and that's why the EU has taken a targeted approach to limit, in a targeted manner, the distribution of these channels, which are part of the Kremlin's propaganda machine and part of actual hybrid warfare.  So this is one element that comes on top of the usual approach to disinformation, but it is only one element.  It is coupled with the implementation of the code of practice, where we are working with the signatories to make sure that they live up to their commitments under the code, notably to demonetize Ukraine‑related disinformation, to apply fact checking properly to this type of content and to apply all the other measures, notably giving users reliable information about the situation, labelling, for example, state‑related accounts and, also very importantly, taking measures against coordinated manipulative behavior, which also applies here.

So this is the contribution of the code and then last but not least, also ‑‑

>> GIACOMO MAZZONE: Please, if you can come to a close.

>> KRISZTINA STUMP: The contribution of EDMO.  This will lead to the next speakers, who will go into depth, but EDMO is contributing.  Just let me say that, from the Commission's perspective, EDMO's work on this in particular, also through the Ukraine task force, is very important, and that's why I'm looking forward to the contributions of the other speakers.  Thank you very much.

>> GIACOMO MAZZONE: Thank you.  Because you already made the presentation of Paula, I give the floor immediately to Paula.

>> PAULA GORI: Thank you very much.  Thank you very much, Giacomo.  Krisztina was talking about having a multi‑stakeholder approach.  EDMO is the European Digital Media Observatory, and the idea is to act as an independent platform that enhances a multidisciplinary approach and acts, if you want, as a body that provides evidence and tools.

How do we do that?  We offer, for example, trainings on specific and various topics related to disinformation.  We have on our website repositories with fact-checking articles and scientific publications.  We have a map of the media literacy initiatives that are implemented in the EU.  We collect all of these materials from the various Member States thanks to our hubs, and I will get back to the hubs later on.

And we also, of course, organize workshops and events.  So overall the idea is that if you want to tackle disinformation, you have to first understand it, and given that, as Krisztina mentioned, it actually touches on different disciplines and different stakeholders, we are there to bring them all together, and to make sure that we also have access to those, in order then to have evidence‑based policy approaches and to be in a position to study them.

I will skip that.  But I wanted to start with just a few examples.  One big achievement of EDMO, as you might know: when studying disinformation, it is important to access the data of the online platforms, because this gives you a complete picture of how a disinformation campaign started, which actors are behind it and so on.  And this is why EDMO established a working group on how to access this data in full compliance with data protection rules, because we know, of course, that these have to be respected.

We published the first report of its kind in the world, a report on a code of conduct to make sure that this data can be accessed in full compliance with the GDPR, and we invite you to read this report on our website.  We have a fact-checking network, and Tommaso will tell you more about this later.  They collaborate together on a secure platform.  They do joint investigations.  They are in close contact, which also helps, for example, in the case of Ukraine, to share early warnings: some disinformation campaigns that were seen in some eastern countries were then arriving a few days later in other countries, and thanks to this network it was easier to detect them and to be alert.

I was mentioning before our hubs.  So we have hubs, as of this week, in all Member States.  What they do is detect and analyze disinformation campaigns at the local level.  You all know how important it is to have the language skills of a given country to do so, because, of course, disinformation spreads in different languages.  They organize media literacy activities at national or multinational level, they are in constant dialogue with and provide policy analysis to their national regulatory authorities, and they do research.

This is very important, and we couldn't be what we are without our hubs, because, as I was mentioning before, they populate our repositories and they make it possible to do cross‑country analysis, to compare data, to try to find common trends, both in research and media literacy, and in general to understand the trends of disinformation campaigns.

For the war in Ukraine, what we did, and my colleagues afterwards will give you additional details: on one side, we decided to have a database, updated daily in the beginning and, in more recent times, less frequently.  This database has more than 2,000 entries right now, and it looks at disinformation related to the war on both sides.

And as you can see here in the image, there is the date of publication of the article, the country in which it was published, a translation of the title into English, and then the link.  We were able to provide weekly insights early in the war on what was coming up.

In parallel to that, we also started the task force, with some very important experts, and Claire will give some more details on that.  The aim was to understand whether we were ready to address disinformation in case of emergency; if yes, what is working; if not, what is not working, and what activities should be implemented to make this work in case of another emergency?

And on top of that, some task force members published some very important posts.  As you can see here, we just have some examples, but they are, again, covering this multidisciplinary approach to disinformation.  They range from a post showing that there were some fake fact-checking websites that were actually spreading disinformation, to a post on the importance of the mental well‑being of investigators on the digital front line, because as journalists and fact checkers have to watch constantly, and often repeatedly, some very shocking and strong images, it is important also to help them and give them psychological support.

I will now give the floor to Claire.  As you see, we are bridging very well from one speaker to the other, because she was actually the chair of the task force that I mentioned, which produced a report with ten recommendations that Claire will explain in more detail.  Thank you very much.

>> CLAIRE WARDLE: Thank you so much, Paula.  It's a pleasure to be here.  My name is Claire Wardle and I'm a professor at Brown University.  Basically, at the end of February, early March, EDMO decided to create a task force on the war in Ukraine, and I chaired it.  There were 80 members of the task force, representing academia, journalism and civil society from ten countries across the EU, and everybody was acting in their personal capacity.

As Paula said, our job was to look at what was happening and respond.  So, as you just heard, we did regular roundups of fact checks and mini research projects, and in our weekly conversations we were trying to think about what was working and what wasn't working.  In June we published a report with ten recommendations, and I do think we have to be honest about what worked and what didn't.

So I think, in many ways, given the scale of the problem, even though for the last six years there has been lots of money pumped into thinking about disinformation and lots of initiatives, the things we were frustrated by at the task force level were things like the lack of coordination: we didn't have a shared database, and we still didn't necessarily have enough language capacity to understand what was happening in many countries across the EU.

In this report we tried to say, we are not pointing fingers, but this is not going to be the only information emergency.  We have just come through COVID.  You know, we are always going to have elections.  We have to understand what we can do better in a coordinated response.  We heard about multi‑stakeholder approaches.  We're very good at talking the talk, but what does it actually mean when the rubber hits the road?

And so some of the points that we made were that we were unprepared and that there was inefficient coordination.  So moving forward, what does that look like?  And EDMO is playing a really strong role, particularly with the hubs, in thinking about what that coordination looks like.

Another piece of our work was around understanding literacy, information literacy interventions, and one of our members was Sonia Livingstone, who is an expert in this area.  She did a piece of work, not comprehensive, to understand the kind of literacy interventions that popped up just after the invasion, and what she found was that there were a number, but mostly they were about how to talk to young people about the war, as opposed to helping people understand the sorts of things they were seeing online.  That led to a very obvious point: how can we be more coordinated and, in moments like this, stand up a kind of shared curriculum, and also evaluate that kind of curriculum in real time to understand, you know, how we're building resilience across the EU?

We also had issues around some examples of a lack of transparency, for example around the decisions to take down RT and Sputnik.  There was lots of discussion on the task force about that, lots of pushback from some of our colleagues from countries like Bulgaria and Hungary, saying this is not just RT and Sputnik.  There were interesting conversations about how content moderation decisions are made, who is making them and where the transparency is, not just from the platforms but also from government entities.

And overwhelmingly, we saw there was a western focus in so many of these responses, which meant large proportions of the EU population were unsupported, and there's a really obvious point around developing more interventions and working with platforms on their language capacity.  Certainly, we saw our task force members from, obviously, the UK, Germany, France, Spain and Italy sometimes having slightly different experiences of what they were seeing online compared to our colleagues who were coming from countries that border Russia.

We also found that whilst there was amazing fact checking going on, we wanted to think about how to elevate some of that to understand the narratives that were being shared, and also the narratives that were being shared globally.  Whilst the EU response was great, and having so many Member States working together was really strong, we were also asking: how can we connect this to the kind of disinformation that's flowing in Ethiopia, the kind of disinformation that's flowing in Brazil about the war?  While the European response is important, we have to do a better job of connecting globally around an event like this, which is a global event.

And so the other element we all talked about was the need for the full information ecosystem to be considered.  As Paula just explained, there is a need for more data from the platforms, but we also had lots of discussions about the role of broadcast media and print media in pushing certain false and misleading narratives, and the challenge that fact checkers and others had in trying to make sense of what people were hearing from traditional media.  It's often very easy to think about what is circulating on the platforms without thinking also about the role of elites, whether they are politicians or religious leaders, or about what people are seeing on television or in the newspaper at the breakfast table.  So there was lots of discussion about the need to do a much better job of understanding the full information ecosystem.

And we wanted to think about this longitudinally.  We finished this work in June and the war continues.  What we don't have are good measures of how the interventions we have started to put in place are working or not over time.  A lot of what we saw was that, yes, there were outright falsehoods, but a lot of problematic content is what we call gray speech.  It's not illegal, but many of us in this meeting would probably say it's leading to harm, maybe over time, through a drip, drip, drip of low‑level gray speech.  And what does that mean in terms of shaping narratives and the way that people perceive what is currently happening in Ukraine?

There were lots of discussions about the need for more measures that can look at harm longitudinally.  I'm not here to say that it was all problems, but I think we have a responsibility to say that the platforms were not ready for the war in Ukraine.  They were deplatforming, and it is easy to say, why weren't the platforms ready?  But I think one of the major things we took away from the task force is that we, civil society, governments, weren't ready either, and it's on us to say, how can we do a better job of being more prepared?  And I think that's why these kinds of conversations continue to be necessary.  So thank you very much for your time.

>> GIACOMO MAZZONE: Thank you, Claire.  Before we go to the next speakers, the fact checkers, who probably have the most interesting examples to work on, I think we need to make a point now, because the first two interventions were about policymakers and how the European Union tried to do the policymaking, and Claire introduced doubts.

So it seems that the way to build the ministry of truth is still complicated, and the problem will not be solved very easily.

Having said that, I think the next speaker can bring us concretely into the field of what happened, what this laboratory created, and what EDMO has shown, and I think it is useful to listen to what he says.

Tommaso of Pagella Politica.

>> TOMMASO CANETTA: Absolutely.  I'm with the fact checkers in EDMO.  I will try to share my screen.

Can you see it?

Okay.  Great.

I am also the deputy director of Pagella Politica, one of the main Italian fact-checking outlets, and a member of the board of the European Fact-Checking Standards Network.  So before I dive specifically into the Ukraine issue, a few words about the EDMO fact-checking network.  It was established in the spring of 2021 and it was operative by the summer of 2021.  Right now we have 36 fact-checking organizations spread all over the European Union, plus Norway.

Some of them are very big organizations, like AFP, which operate in different languages, and others are very small, maybe just three- or four-person organizations, operating from small countries.

So this is the network, and it was very important that it was established before the beginning of the war.  So we were actually able to see the evolution of the disinformation in the making, basically.

So one of the instruments, one of the tools that we use to detect disinformation, is our monthly briefs, in which we gather information through a questionnaire that has both qualitative and quantitative questions.  We gather all the answers to this questionnaire centrally and we are able to extract some data.

We know how many fact-checking articles have been published during a given month and how many of those articles were about a specific topic.  The first topic we focused on was the COVID‑19 pandemic, the white line that you see here.  Later came the Ukraine‑related disinformation, and during the summer we also started analyzing specifically the climate change‑related disinformation.  As you can see, for Ukraine the disinformation basically exploded in March.  It exploded, honestly, from the 24th of February, but in March it was about 60% of the total detected disinformation.

And then it dropped down until October, when we detected a new rise of disinformation specifically about Ukraine.

Analyzing the qualitative questions, the questions that we asked the organizations of the network, we were able from the beginning to isolate some of the narratives that circulated specifically inside the European Union about the war in Ukraine.  You can see, for example, the narrative that the Russian invasion of Ukraine was actually justified: there were false numbers about the deaths of civilians in Donbas and false information about President Zelenskyy.  And this is very important to stress: we do not take orders from central EDMO, and we do not take orders from the EU Commission or the states.  Fact checkers operate in a truly independent way, and we detect disinformation as it is.  We are absolutely neutral.  We are not there, you know, looking for disinformation of this or that kind.  So we detected, of course, a lot of pro-Russia disinformation but also some pro-Ukraine disinformation.  The two are not comparable, but I think it's important to underline that we see what we see.  We report it as it is.  We do not invent.

And then another narrative was about western traditional media spreading false news.  What I want to stress here is that we talk about disinformation narratives when these messages are conveyed through false news.  So, for example, we know for sure that some Ukrainian soldiers had Nazi symbols on them, but creating a narrative about the Ukrainian forces being largely composed of Nazis passed through many, many pieces of false content, created with Photoshop, adding swastikas to Zelenskyy's shirt.  Or, on the other side, during the mobilization of Russian troops many, many young Russians were fleeing from Russia into other countries, but we also detected false news exaggerating this, for example about very long lines at the Finnish border that were not real.

And these briefs are a powerful tool for tracking disinformation.  The fact checkers detect and counter disinformation at the national level, so having a network that allows us to see disinformation at the continental level, at the European level, was incredibly useful, before with the pandemic and now, of course, with the war in Ukraine.

The analysis of the narratives was so relevant that we decided to analyze them systematically, thanks to the database that Paula already mentioned, which has more than 1,900 articles inside.  Analyzing those articles, we were able to extract the main narratives conveyed by the disinformation and also to give some early warnings about the likely developments of the disinformation.  I think this is very, very interesting.

Disinformation is an ancillary phenomenon to information in most cases.  So where information goes, disinformation follows.  If the news is talking about, I don't know, for example, young Russians fleeing from Russia to avoid the mobilization, disinformation will likely talk about that.  If information is talking about COVID‑19 vaccines, disinformation will talk about that.

If information is talking about Ukrainian refugees, disinformation will talk about that, et cetera, et cetera.

So the third pillar, let's say, of the content we publish is the cooperative investigations.  These are produced by two or more members of the fact-checking network and look at a specific issue related to disinformation.  It was very, very useful to have those during the war, because since the beginning we detected some very interesting phenomena.

For example, the channels, the groups, the pages that until the 24th of February spread COVID‑19 conspiracy theories immediately switched to pro-Russia content.  We detected this in Denmark, Spain, Italy and many, many other countries.  And giving this information, I think, was very relevant at the political level and for the readers, so that they were aware of this possibility.

Another interesting case, another interesting example of a cooperative investigation, is the one on Ukrainian refugees and how they were targeted by disinformation.  Of course, this article was first produced by our colleagues from eastern Europe, from Poland, Slovakia and Romania; those were the countries hit first by the big wave of Ukrainian refugees, and, of course, the disinformation about refugees started there.  But it was very important to create awareness in western Europe too, because with the secondary movement of refugees from eastern Europe to western Europe, the disinformation moved as well, even though the flow went in the other direction.

In Italy, Spain and Greece we had a lot of disinformation during 2014-2019 about refugees and migrants arriving from northern Africa and the Middle East: narratives saying that these refugees are violent or thieves, or, on the other hand, that national states treat them better than their own citizens, basically exploiting the cracks within European society.

And now let me leave the floor to my colleague from Demagog.  They did a very good job and I'm curious to hear more details about that.  Thank you so much.

>> ADAM MATERNIK: Can I share my screen right now?

>> GIACOMO MAZZONE: Yes, please.  Tommaso, you have to ‑‑

>> TOMMASO CANETTA: Yes.  I'm not finding the control.  Done.  Thanks.

>> GIACOMO MAZZONE: Thank you.

>> ADAM MATERNIK: Okay.  Is my screen visible to all of you?

>> GIACOMO MAZZONE: Yes, we can see it.  Adam, can you introduce yourself?

>> ADAM MATERNIK: Yes, of course.  My name is Adam.  I have been a professional fact checker since 2018, and today, on behalf of the Demagog association, a fact-checking organization in Poland, I am privileged to describe civil society's approach to battling disinformation.  The mission of my organization, since 2014, has focused on combatting fake news broadly spread on the Internet.  We believe that public debate should be grounded on facts.

What is more, citizens should have access to unbiased information.  Hence the question arises: how can this be accomplished?  In line with European and international standards, we take a comprehensive approach based on two pillars: on the one hand fact checking, on the other hand media literacy.  Generally speaking, fact checking is verifying that all the facts in a piece of writing are correct and true.  It might be an individual activity.

So each of us can verify information on our own, but dedicated fact-checking organizations have been set up across the world in recent years, and Demagog is one of them.  Currently there are approximately 400 fact-checking organizations.  So what are we doing?  Since the war in Ukraine broke out, we have verified, as you can see, over 300 misleading claims.  We have been dealing with both pro-Ukrainian disinformation and pro-Russian disinformation.  Of course, the scale is different, but I would like to point that out.

I would like to mention that we analyzed the narratives which have come out in the EU, and we published reports with regard to Poland, Slovakia, Hungary and Romania, as Tommaso just mentioned.  And what we have seen is that there are plenty of common narratives across the EU.

Not only in central Europe.  But fact checking is not a proactive attitude.  It means that the debunking article will probably not reach as broad an audience as the fake news.

How can we fix that?  There's a solution: media literacy education.  We want to promote proactive attitudes to tackle disinformation.  It involves learning how to discern true from false, how to distinguish fact from opinion, how to manage one's media diet, and trying to build resilience.  It can be put into a school curriculum or it can be delivered by fact-checking organizations.

Unfortunately, or fortunately, if we would like to reach a broad audience, we should educate as many people as possible, regardless of their age.

There is one more solution: prebunking.  We would like to create antibodies against future exposure to disinformation.  So prebunking is like a real vaccination: we put in effort to produce these antibodies in order to be prepared for disinformation, especially when it comes to the narratives.

So we want to teach people not so much about narratives, but about techniques and how people can be manipulated, because narratives may differ.  That is the point.

And I would like to show you a case, an example.  A prebunking campaign has been launched in Poland through cooperation between Demagog and a national research institute, and we have warned Polish citizens against two common techniques, scapegoating and intimidation, used to spread anti‑Ukrainian narratives.  I believe this campaign has been quite successful, because it got 30 million views.  So I think we will reap the rewards later.

So, all things considered, we should bear in mind that tackling disinformation is very difficult; however, it can be done successfully if it focuses on raising awareness in order to promote proactive attitudes, because prevention is better than cure.

Thank you very much.

>> GIACOMO MAZZONE: Thank you very much, Adam, for these interesting cases that show us, again, how complicated it is.  And I remind you that we are focusing here mainly on social media and the Internet, while media regulation already has its own established channels, which do not concern the IGF, I would say.

It looks like a very baroque structure.  I can understand this impression, because you have the European Union at the beginning that sets the measures, but these measures are not taken directly by the European Union.  The European Union reserves the right to act through the Digital Services Act, and then the work is entrusted to independent bodies, in this case a combination of academia and fact checking, because EDMO is a consortium of academia and fact checkers.  So third parties that are not within the institutions; they are people of civil society dealing with that.

And then the last part of this puzzle of tackling the issue of disinformation in the European construction is the regulatory authorities, and this is why we have with us Francesco, who is at another meeting today but kindly accepted to extract himself for some minutes to be with us.

Francesco, can you explain the role of the regulators in this scheme?

>> FRANCESCO SCIACCHITANO: First, thank you to Giacomo and Paula and all the organizers of this interesting meeting, because it has been organized in a very timely manner.  This is a moment in which, really, all the rules are changing.  We heard Krisztina explaining the code of practice on disinformation, what type of instruments there are in the code of practice and in the DSA, and we heard also from Paula what the difficulties are in monitoring compliance with the code and in monitoring the problems related to disinformation, and this is also very clear from the presentations of the friends who are doing the fact-checking activity.

Now, you introduced us in the perfect manner, because at the end of the day the institutions which have the power to monitor compliance with the general principles, with freedom of information, but also with the free formation of opinion, et cetera, in the various Member States are the regulators of the audiovisual sector.

Put like that, everything would sound very simple: the regulators are the ones who have to monitor compliance and enforce the various regulations, et cetera.  But in reality, this is not at all that easy.

I would like to start with a report which was published by OFCOM, just to give you an idea of the reach that news and media online, on social media, have today.  It's a report that was published ten days ago, I think.  OFCOM says that 60% of British citizens look for news on the Internet and 14% of British citizens look for it only on the Internet.

Facebook has become the third most popular news provider in the UK, after the BBC and ITV, while for teenagers the only providers of news are Instagram, TikTok and YouTube.

So the role of traditional media has almost vanished for the younger generations, and the role of social media is becoming increasingly important, also compared with traditional media, even for the older generations.

Citizens use digital intermediaries to get their information, and this is something that Krisztina mentioned; it is carved in stone by this report.

Now, faced with a situation of this type, which is very similar to the situation we have in many European countries, we need to rethink the role of regulators, for the very simple reason that most of the regulators in the European Union, until this moment, until we had the code of practice, until we had the DSA, were lacking any type of competence in monitoring the content that is disseminated on social media.  Of course, we could monitor the content of media online, but they have to be media.  If we talk about social media, that's a completely different story.

Indeed, the unprecedented modernization of the rules, this enormous effort by the European institutions to modernize the rules, was carried out with regard to the media sector and to many other sectors, but it gave us very few competencies and powers when it comes to online platforms.

We check the measures that the platforms have adopted when it comes to protecting minors and countering hate speech; this is a power that we have, but it is a very limited power.  It is true that we have to check compliance with Article 17 of the new Copyright Directive, which requires that the platforms adopt measures in order to make sure that content that is in breach is not available, or that it is removed quickly, and this is, again, something that can be checked by the institutions, not necessarily the regulators.  But it is also true that in the other areas, disinformation in particular, which is what we are talking about today, the regulators have very few weapons.

Some weapons arrived with the code of practice on disinformation, thanks to the European Commission, which gave us the role of monitoring compliance with the obligations of the code.  This is even clearer, I must say, and I'm really very thankful to the European Commission for that, in the new strengthened code of practice that Krisztina has presented.  And this is something that the regulators have done jointly, not singularly, because singularly we still don't have those powers at the national level, but jointly within ERGA and with EDMO.  We have monitored the obligations of the code of practice and we published several reports that you can find on our website, the ERGA website.

Those reports say, basically, that the platforms are very active in trying to counter disinformation, but at the same time that the amount of data we receive from the platforms, when we try to monitor compliance with the obligations, is extremely limited.  So there are a lot of areas in which the code of practice could be improved on the basis of our monitoring, and a lot of areas in which it has actually been improved.

Now it remains to be seen, because we are part of the task force for the implementation of the code together with EDMO, and we are working at the service level on the KPIs, the key performance indicators; we are trying to come up with practical measures that can assess whether the commitments are being implemented and complied with or not.  But, of course, this is something that will happen next year, when we will have the first new round of monitoring of compliance, done by ERGA with the new strengthened code.

So at the moment we are in a situation in which we have new powers, and we are going to use them.  In the meantime, there is also other good news, and that is that those initiatives of the European Commission, the code of practice on disinformation and the DSA, have been complemented by additional initiatives: initiatives on political ads and on media literacy, and I'm thinking, of course, of the EMFA, the European Media Freedom Act, which addresses in general media integrity and the correctness of the information that circulates also on the Internet.

So, to conclude, it is very early to say whether we now have a regulatory framework that gives us enough powers to fulfill our tasks and carry out proper monitoring of the platforms' compliance with their obligations under the code of practice and under the regulations.

We are very optimistic.  Disinformation has become one of the issues that has to be checked in order to assess the activities that the platforms are carrying out to mitigate systemic risk.  And so we are convinced that from now onwards, now that the DSA is in place and can be combined with the code of practice and the other initiatives of the European legislators, we will have additional tools to carry out our activity.

We will rely a lot on the cooperation with EDMO and the fact checkers, and we hope that this coalition of actors will give us the opportunity to control and monitor correctly what the platforms are doing and to counter disinformation in a much more effective way than we were capable of until now.

I hope I was clear.  If there's any question on what we did, et cetera, I'm very happy to answer.  Thank you.

>> GIACOMO MAZZONE: Yes, I have one question.  You said that you were at a regular meeting, but listening to the noise behind you, it sounds like a construction site.

>> FRANCESCO SCIACCHITANO: Yes, I had to leave the room in which there was a conference and now I'm in another room in which they are doing some construction.  So I apologize for that.

>> GIACOMO MAZZONE: So for fake news, this is fake news from the beginning.

And the second message that I took from what you said is that you are suggesting to the parties in conflict in Ukraine to wait until the legislation of the European Union is ready and effective against disinformation, but I'm not sure that they will listen to us.

So anyway, we have very little time for discussion.  There are some questions, I think many, so we have to select only a few.

Let's start from who is in front, ladies first.

>> AUDIENCE MEMBER: Thank you very much for this interesting panel.  My name is Katarina, I'm from Ukraine, and I really appreciate the European Union's effort in tackling disinformation.

My question is related to the national hubs of EDMO.  As I understood, there is a national hub also in Hungary, a country that is supporting Russian propaganda.  Have you encountered additional challenges in this regard?  Thank you.

>> GIACOMO MAZZONE: We'll collect a few questions.

>> AUDIENCE MEMBER: My name is Tim and I'm from Russia.  It sounds promising, yes.  Earlier today we discussed some matters of fact checking and fighting facts from the Russian side, and I have one question for the representatives of EDMO.  There was a story, a very sad story, two weeks ago: a rocket hit Polish territory, sadly killing two Polish civilians.

And there was an investigation done, and the result of the investigation was that the rocket was actually a Ukrainian air defense missile, sadly, but it was a Ukrainian missile.

Regarding that, however, Ukrainian officials and Zelenskyy himself still insist that this rocket was Russian.  So my question is: in this case, are the Ukrainian government and Mr. Zelenskyy recognized as a disinformation source, and is that activity recognized as a disinformation narrative?  Thank you so much.

>> AUDIENCE MEMBER: I understand that the policy is to field questions first.  Yeah, actually, my question is not so ‑‑ yes, my name is Piach, and I run an intergovernmental organization dealing with satellite communications, based in Paris.

Actually, my question is not so much about Ukraine and Russia per se, and this conflict, but more of a general nature.  And this is a question which I would like to address first to the gentlemen who actually run fact-checking organizations, Mr. Maternik and Mr. Canetta.

The first question is: do you have internal policies for assessing your own work?  If you go back to, let's say, the rulings or statements you made about other people's facts and find out ex post that you were wrong, what do you do about it?

And the question that follows up on that is, and I really don't know the answer to this question, so I'm not asking a rhetorical question: is there a mechanism, a watchdog, that would conduct, I don't know, audits or assessments of the fact checkers?  Basically, as I see it, also as a former journalist and media manager, we have a group of well‑meaning people all over the world trying to check facts, but we don't know who those people really are, what they do, what their procedures and techniques are.  Actually, that could be disinformation to the nth power if those organizations are infected.  So that's my question about possible audit procedures.

>> GIACOMO MAZZONE: Do you have a question?

>> ROBERTO ZAMBRANA: No.

>> GIACOMO MAZZONE: So I think we give the floor to the speakers to answer the questions.  The first question, about Ukraine and Hungary, I think is for Paula, about the network of monitoring bodies across Europe.

>> PAULA GORI: Yes, so the hubs have the mandate, as we do at EDMO, to work independently.  They are multidisciplinary, with fact checkers and media literacy experts, and the work they do is supposed to be done independently.  There will indeed be a hub starting in Hungary as well and, of course, we are pretty sure that the work they will be doing will be independent, as it is for all the other hubs.

Regarding the second question, I think that Tommaso will also go into that.  What I can say more in general is that I would see this in a broader picture, in the sense that nowadays we also know how much there is a tendency in journalism to be there immediately, which has quite an impact on journalism in general.  We saw it in the last years with other news as well, and Tommaso can confirm this, because we also have it in our database: disinformation, as we said before, also spreads on traditional media, and we saw, for example, at the very beginning of the war, an Italian broadcaster using images from a video game, saying they were images from the war.

Unfortunately, disinformation still spreads on all media, but media and journalists also have rectification rules, and in that case there was a rectification of the information that was given.

I would use that to bridge to Tommaso, who can continue on this and then reply to the last question.

>> TOMMASO CANETTA: Yes, absolutely.  Traditional media, sadly, sometimes play a role in spreading disinformation.  Honestly, this does not put the traditional media at the same level as the platforms, because in general traditional media are a reliable source, while platforms are not editors, and so a lot of disinformation can flow there.  But I want to answer two questions from the public.  The first one is from the Russian colleague.

Yeah, the statement from Zelenskyy is more than likely false.  We have one side claiming this and the other side claiming that.  Something similar is going on about the Nord Stream attack, probably an attack; the situation is not yet entirely clear.  Let's take the case that in the end it is absolutely confirmed that it was a Ukrainian missile that landed in Poland, and that Zelenskyy lied about this, maybe in good faith, maybe in bad faith, we don't know.  This is a piece of false news and, of course, we would publish articles about this specific false news, but it's not a narrative, because a narrative requires a lot of false news conveying a specific message.

So if in the next months we see many false statements, much false news coming from the Ukrainian side saying that Russia is hitting NATO countries, trying to start World War III, then of course that would be a disinformation narrative: disinformation in the interest of Ukraine, trying to scare NATO allies into giving more weapons and entering the war directly, et cetera, et cetera.

This is, of course, a theoretical case.

>> GIACOMO MAZZONE: The other question ‑‑

>> TOMMASO CANETTA: Yes, this is very, very important to us.  Yes, we have a policy of honest corrections.  This is required by the international code for fact checkers, the IFCN code, and it is also required by the recently born European code for fact checkers, the European Fact-Checking Standards Network code.  Both of these codes require that if fact checkers publish something false, if they make mistakes, they publish a correction in the same article, republished on the same channels as the original content, saying: we are sorry, we published something that was wrong, and this is the correct information.  And on the websites of the fact-checking organizations you will find a page with all the honest corrections made in the previous years.

And more in general, about the second question: who fact-checks the fact checkers, who controls the controllers?  This is very important, because anyone can self‑appoint as a fact-checking organization, but we want to have actually independent and professional fact-checking organizations that are not, let's say, propaganda actors in disguise.  So we have these codes: the IFCN code, from 2016 or maybe 2015, and the European code.  They are something that you promise you will respect, but these codes also envision assessment procedures with independent assessors who check, on the basis of your application, that you respect criteria about financial transparency, organizational transparency, methodology, honest correction policies under the methodology section, and ethical standards.

So you need to be a very, very transparent organization, provide information and go through a very strict assessment, and at that point you receive the badge of being an actual fact-checking organization, from the IFCN globally and the EFCSN Europe‑wise.  So I think we have some strong guarantees about the true nature of the fact-checking organizations that are part of these international and European bodies.

>> GIACOMO MAZZONE: Thank you, Tommaso.  I suggest, if you can, that you publish the link to the code of conduct that the fact-checker associations recently produced.  I suggest you look at the other documents too; they will be interesting for you.  Roberto?

>> ROBERTO ZAMBRANA: We have a question from the online audience, and the question is: what can be done in a situation where the global digital platforms don't want to cooperate with law enforcement in other countries regarding illegal content, like violence or organized disinformation campaigns, on their platforms?  What is the responsibility of the states regarding cross‑border platform behavior and non-cooperation?

>> GIACOMO MAZZONE: I think that is a question for the regulator.  Francesco, are you still with us from the construction site?

>> FRANCESCO SCIACCHITANO: I am still with you, and I hope there will be less noise.  That's a very interesting question and the answer is not particularly easy.  The point is this: we have pretty clear provisions that concern and apply to platforms when it comes to certain specific issues.

For example, there is the European Audiovisual Media Services Directive, which in Articles 28a and 28b prescribes, or imposes on the platforms, the obligation to take measures in order to avoid that content that is harmful for minors or incites to hatred is disseminated on the platforms.  There is no general monitoring obligation for content.  But when there is content that is harmful for minors or incites to hatred, and the platforms get to know about it, because a trusted flagger gives them this information, or there have been a number of complaints flagging that content, et cetera, then, since they are obliged to do content moderation, in this case they have to delete it.  So the obligation doesn't arise at the moment the content is uploaded; the content is uploaded and the platforms don't have any obligation to monitor that content.

But the moment they get to know that this content is illegal, or, as I said for the audiovisual directive, that it is harmful for minors or incites to hatred, then they have to remove it or demote it, and if they do not, the Member States which have power over the platforms can react with clear sanctions.

I must say that this has been reinforced by the Digital Services Act.  The reason why I was saying that this is a very important moment is that now, with the publication of the Digital Services Act, we will all have a new set of tools that we can use in order to ensure compliance from the platforms.  If they don't comply with the orders of the administrative institutions, according to Article 8, to remove illegal content, they can be brought in front of the European Commission.  The European Commission can come to a sanction decision with a fine that can go up to 6% of the global turnover of the platform in the previous year.

So if the platforms do not comply with the obligations of the DSA, including the obligation to remove illegal content, the sanction can be this one, and it is a powerful sanction.

>> GIACOMO MAZZONE: We have to close, sorry.

>> FRANCESCO SCIACCHITANO: Okay.  I'm done.  So this was the question, and this was the answer anyway.

>> GIACOMO MAZZONE: Thank you very much; the answer to the question was very much to the point.  We have to close.  I'm sorry for the others who wanted to raise questions online and offline, but we have to give the floor to the next panel.  I think that we have learned something today.  As you see, it's a work in progress.  Let me ask Roberto what he thinks from a non‑European viewpoint.  Is it something viable?

>> ROBERTO ZAMBRANA: No, I only wanted to say that, as you will remember, we had a meeting this morning regarding a similar subject but from a different angle, and I think that in any case, when such interests are at stake, we should expect these kinds of things to be done.  And even if we cannot handle it, if we see that one life, one valuable life, is lost during a war, then we will expect these kinds of things to happen.  So the promise, I think, for the future of humanity is that these things can be dealt with.

>> GIACOMO MAZZONE: And with these words of wisdom, we will close the session.  Thank you for your attention.  And, of course, we will publish the slides on the website.