IGF 2020 – Day 10 – WS260 COVID-19 "Dis-infodemic": Challenges, lessons, opportunities

The following are the outputs of the real-time captioning taken during the virtual Fifteenth Annual Meeting of the Internet Governance Forum (IGF), from 2 to 17 November 2020. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 

***

>> GUY BERGER: Good afternoon.  Thank you very much.  If you're not speaking, please mute your mic so we can have the best audio possible.

You will find that this is a very timely topic.

A quick introduction, and we'll pose three ideas on what this session is all about.

I'm Guy Berger, I work at UNESCO, and I'm very much involved in the communication and internet sector, with the type of work that we're doing to look at the free flow of information, which is part of the constitution there.  We have become super conscious of the dis-infodemic during this COVID period.  We hope that the COVID-19 pandemic will become history, but it is still raging and its effects will be long lasting, and this disinformation is not only in the COVID space.  Here we'll particularly look at what it means to understand and deal with this question of the COVID dis-infodemic.  In particular, I have been involved in a report called Balancing Act, and I'll post the link in the chat.  This Balancing Act was done for an institution, the Broadband Commission for Sustainable Development, which was started in the UN system and has a lot of commissioners linked to various NGOs, companies, regulators, ministries and so on.  We have two of them with us today.  I'll introduce them later.

This Broadband Commission decided to commission a Working Group to look at disinformation.  This was even before this period.  Looking at disinformation in the context of freedom of expression, and how you balance the two.  So after about a year's worth of work led by Julie Posetti and others, they produced a report that was almost 300 pages long.  What's unique about this report is that it looks at the responses to the dis-infodemic.  Not the debate on disinformation and what touches on that, but how people are responding: how they're responding across the supply of the dis-infodemic, the transmission of the dis-infodemic and the receipt of the dis-infodemic.  It is of course very important with this idea that we look at the big internet companies that are involved in this.

Really, what we find is that the dis-infodemic and COVID are condensing a lot of issues around internet governance, and these, of course, impact not only the distribution but have relevance at the supply and at the receipt.  The study, Balancing Act, is a very big, rich study, and while we were doing it, we decided to do policy briefs focused specifically on the question of the dis-infodemic, particularly during the COVID period.  There is a quiz that's come out of this, and we're working on more tools that we will be coming out with, such as toolkits and so on.

In the meantime, we have to look at what issues arose out of the study and what it means for our understanding of the COVID dis-infodemic.  To start with, we have the author of the study, the Director of Research for the International Center for Journalists, also a senior researcher at the Centre for Freedom of the Media in the U.K. and a research associate at the University of Oxford in the U.K.

Please give us a quick overview of this research, and then we'll jump into our discussions.  Over to you.

>> JULIE POSETTI: Bear with me as I get the slide show ready here.

As Guy said, I'm a researcher and a journalist and I come at these issues from those perspectives.  As we all know, COVID-19 has led to a parallel crisis, a pandemic of disinformation that's directly impacted on all of us really, on our health and ability to access reliable information.  One of the reasons we pivoted, as Guy said, during the work on the big study to produce two smaller policy briefs on the dis-infodemic was that we were seeing ways in which COVID-19 related disinformation had elements that were potentially deadly, sowing confusion in the policy choices and the personal safety choices we were making.

Within these three pieces of work we're addressing the full spectrum of the disinformation problem.  What's original is that we're looking at the targets of the interventions rather than the individual actors, if you like, and assessing the responses as well, importantly in terms of freedom of expression implications, and we have seen a range of risks of freedom of expression erosion as a result of the extreme responses to disinformation during the COVID period.

The body of work underpinning the studies is also geographically diverse; very hard work was put into representing voices from the Global South in particular, and particularly with the dis-infodemic briefs our objective was that we would try to flatten the curve of COVID-19 disinformation.  Let me take you through our key points here.  I'll fiddle with my screen a bit to see what I'm doing as well.

There is an important distinction to make: we're calling this a dis-infodemic, not an infodemic per se.  The reason for that is that the infodemic, as expressed originally by the World Health Organization, refers to this tsunami of information, both credible content and false information.  So we have coined this term dis-infodemic to signal the disinformation and the misinformation.  The reason why we're using disinformation quite generally as an umbrella for disinformation and misinformation is that misinformation is often seeded, when it goes viral, by acts of disinformation.  The other point to note, influencing that decision to use disinformation generically, was the reality that much of the false and misleading information we're seeing in connection with COVID-19 is potentially extremely damaging and possibly deadly.

The effects of false, misleading content are likely to be the same regardless of whatever the intent was behind its creation and distribution.

With the disinformation briefs, we identified nine key themes that were associated with the dis-infodemic, and we developed this typology of themes on the basis of an examination of databases that were being kept by the International Fact-Checking Network.  We were also influenced by the work of First Draft; Claire Wardle is on this panel discussion and will provide additional insights based on their own analysis.

You can see on the screen there nine different types: ideas about the origins and spread of coronavirus, manipulated often for political reasons; statistical manipulation and de-contextualization; false and misleading content connected to economic impacts; and one really dangerous situation that has evolved with regard to the targeting of journalists in the context of disinformation campaigns, which have escalated in the context of COVID-19.  There is medical and scientific misinformation going to the very nature of the disease, how it manifests and how it impacts society and the environment; and the one we'll pay more attention to, politicization, which is also connected to the attacks on journalists we mentioned earlier.  As much as there is an acceptance by many states and political actors of COVID-19's factual basis, and responses in accordance with that, there has been a real attempt to politicize using the tools of disinformation; plus content driven by fraudulent financial gain, along with celebrity-focused disinformation.

We also identified four key formats of disinformation, the ways in which the content was distributed: emotional narrative constructs and memes; fabricated websites and authoritative identity, trying to mislead people by creating false authorities and false websites, including websites purporting to be news sites; decontextualized images and videos that sometimes are altered or fabricated; and actors looking to infiltrate networks and seed orchestrated campaigns.  Those were the four main formats.  Beyond that, we developed a typology of COVID-19 disinformation responses.  It covers the identification and monitoring of disinformation; it looks at the producers and distributors of disinformation, along with the production and distribution mechanisms, and the support offered to audiences, the targets of disinformation, in the context of the COVID-19 dis-infodemic.  Then we looked at 10 types of responses, which we extended to 11 different types in the context of the Broadband Commission and UNESCO report Balancing Act.  The reason for that was that we added an element, under the category of responses aimed at producers and distributors, that emphasized electoral responses, and of course we're in a situation now where we have a great need to emphasize electoral responses in the context of the U.S. election.  Interestingly, taking it to 11 subcategories of responses, these are often designed to detect, track and counter disinformation that's spread during elections.  The separate category was designed in recognition of the threats to democracy, electoral processes and citizens' rights associated with disinformation.

Based on that, it is important to think about the distinctions and the intersections or overlaps between COVID-19 disinformation and political disinformation.  As you can see there, we have the situation that COVID is genuinely global, while political disinformation is specific to individual states or regions, although we're seeing trends towards global issues related to polarization based on populist politics.  COVID-19 disinformation is massive in scale and jumps across internet services, sites and apps, resulting in swifter, more decisive responses from internet communication companies, and this has raised issues around challenges to freedom of expression and potential erosion of international rights to freedom of expression and privacy and so on.  It is continuing to be politicized, which undermines to an extent the cross-party agreement that we saw earlier on in the crisis with reference to managing and responding to COVID-19.

A data point from a recent study that I ran with Emily Bell and Pete Brown from Columbia University, which I think adds texture to this: we surveyed 1400 English-speaking journalists from diverse geographic locations and we found substantial evidence of political leaders and elected officials being identified as the main sources of disinformation, which correlates with other research being done with reference to audiences.  We see that, I think, as a feature that should feed into responses and recommendations.

Just to highlight a few of the recommendations that we made, both within the Broadband Commission's Balancing Act ‑‑

>> GUY BERGER: I have to interrupt, Julie.

If you could just stop it there and wrap it up.

>> JULIE POSETTI: Hard to keep time in a Zoom with no clock.

It is probably the point to highlight that there is a 23-step freedom of expression tool to assess responses to disinformation in the Broadband Commission book, which we encourage people to delve into and apply; it is a practical response.  And I think the recommendation to states and to political actors of all sorts is to avoid engaging in political disinformation tactics, which is an ongoing problem; we have to address the source of this problem.  Along with that are a couple of recommendations aimed at the internet communications companies, number one being to apply fact checking to all political content and to ensure transparency and independence with regard to their interventions, particularly with regard to funding.

I'll leave it there.  All of these recommendations, something like 85 of them I think, are available in the document that's been shared by Guy.

Thank you.

>> GUY BERGER: Thank you so much, Julie.

I think that gives a nice perspective of the key insights on top of the recommendations, and it also shows you the overlap between COVID disinformation and political disinformation, and how what we learn from one space could move to the other space, especially when the issues are being combined.

Let's move on to the speakers now.  Our first speaker is Tina Purnat, who works on digital health technologies at the World Health Organization.  Of course, she and her colleagues have been right at the forefront of this issue, carrying the brunt of the infodemic, which drives what people do on the ground, their behavior.  Let's ask Tina about one, two, maybe three of the major efforts made by the World Health Organization to counter disinformation and misinformation around COVID-19.

>> TINA PURNAT: Thank you, Guy, very much for this opportunity.

Yes.  As both Julie and you have pointed out, in the public health space, when we're looking at the infodemic, we're concerned with both too much information and also false or misleading information, in both the digital and physical environments, because what we experience during public health emergencies is that all of this leads to confusion or to risk taking and behaviors that can harm health.  That of course can potentially lead to mistrust in public authorities and the public health response.  What we're seeing increasingly is that misinformation and disinformation in pandemics intensify or lengthen outbreaks if enough people change their behavior based on misinformation or lack of trust.  We recognized early on in the pandemic that we need to really think through and consult across all of society, already in April, on how we can step together in order to respond to the infodemic, and that April consultation actually resulted in a framework for managing infodemics and 50 actions that all parts of society can take to help respond to the infodemic.  What we really feel is that infodemic response requires a whole-of-society response, and this includes health authorities, fact checkers, civil society organizations, academia, the social media and tech sector, media and journalists.  A couple of examples:

So one is definitely partnership with social media and the tech sector to promote placement of high-quality content and health information online, including building chatbots and other tools.  We also partnered with the UN Secretary-General's office to support the Verified campaign, for example, which was designed to build awareness among the public of the circulating health misinformation and to promote citizen skills to take pause before they share information.  We have also worked with several others on media and journalism trainings and the use of science for better monitoring and evaluation of the infodemic response, reaching communities through a variety of channels.

In the spirit of a whole-of-society response, we have really seen what is needed for response; now the next step is operationalizing the response.  For example, on the African continent, the WHO Regional Office for Africa stepped together with a variety of partners, UN organizations, fact checkers and other operational partners to really put together an infrastructure and collaboration allowing a more nimble response to misinformation.  This is something that's not been done before; we're building the ship as we're flying it, and it is a really significant, whole-of-society response.  What we recognize is that we need to build a workforce, to build up skills for infodemic response across all of society but also in the health sector, which is why we're just now running an infodemic manager training that's extremely multidisciplinary in nature, looking at the evidence and at what kinds of interventions, grounded in communities and human-centered, can help us build resilience to misinformation.  Of course, recognizing this, coming up in December we're taking stock at the next whole-of-society conference on infodemic management, where we need to chart out next steps in the response going into 2021: really how to promote healthy behaviors and also build resilience to misinformation.  I'm pleased about all of the talks that are following this one, because all of us have a role to play and all of our perspectives can come together.

Thank you.

>> GUY BERGER: Thank you.

A quick follow-up: I understand that you have the perspective of the infodemic, and what you want to do is to increase the supply of quality content and the visibility of verified content, and combat the disinformation and the misinformation.  So you at the World Health Organization are creating a kind of research field, infodemiology; you have had one conference, with another in December.  What can you tell us about how it actually works on the ground?  People are exposed to content, and some of it is good, some of it is damaging.  How are people actually reconciling that?  What are you finding in the evidence?  That affects how we're going to respond.

>> TINA PURNAT: Yes.  One major consideration when we're thinking about how health misinformation affects people's behavior: there's actually no comprehensive review or evidence informing this.  There is still a lot to learn about the connection between people receiving information, having an intent to act and then acting in a certain way.  This is one of the reasons why we brought together a variety of scientific disciplines over the summer to develop a public health research agenda for infodemic management.

We cannot assume that merely providing information to people will result in healthy behaviors.  There is such a variety of factors that affect people's decision to act or to share misinformation that we really need to understand what affects this, and any interventions building resilience really need to land at the individual and community level.  This is something that's very challenging.  We all experience infodemics locally in our community lives, our personal lives, but this requires a retooling, a rethinking of how we respond, and at the same time feeding back what we learn, both offline and online, through operational research, so that it feeds immediately into responses by health authorities, fact checkers and media partners to promote information, to respond with debunking, for example, and other interventions that respond to the infodemic.  A lot still needs to be learned, but what's really clear is that we need to put people at the center, communities at the center, and ensure participatory cooperation, both for understanding interventions and also for the design of what actually is important to the people in the communities, to inform how we then respond locally to the infodemics.

Thank you.

>> GUY BERGER: Indeed, Tina.

It is something that's been underlined by Claire: the information is the tip of the iceberg; you have the emotion under that, you have the community norms, you have the family relationships.  So tackling this at the informational level is really important but not the only part of the equation.  Researchers are pointing that out for the future.

Let's move on to the next speaker.  Beeban Kidron is a film maker, the Chair of the 5Rights Foundation and a commissioner on the Broadband Commission for Sustainable Development, and was in the Working Group that oversaw this UNESCO study that was introduced.  Thank you for being here.

>> BEEBAN KIDRON: A pleasure.

>> GUY BERGER: I know that you see there is a gap in the study: a children's point of view on disinformation and how this disinformation can impact young people, especially relationally.  What are your comments on this?  How do we understand COVID disinformation not only as adults, but in its impact on children and young people?

>> BEEBAN KIDRON: Thank you, Guy.  Thank you very much.

At 300 pages, this is a rich study and there is much to get your arms around, so my concern, and my concern throughout the pandemic, really has been about the way that children are largely forgotten in the disinformation debate.  You know, they're the demographic most likely to get information online, and they don't have a habit of traditional news media.  In fact, doing my own personal study, I spend an awful lot of time with teenagers and I always ask, do you read the newspaper, and they look at me blankly.  You know, they don't have money to get behind paywalls; some of the information, some of the good information, is not accessible to them.

On the whole, and I don't just think this, I know it, we're finding out that they have quite a poor information diet, and it is something that they're very anxious about.  I think you can see this play out in the pandemic.  There was a moment, in about April I think, where there was one report, from the Centers for Disease Control I believe, at the intersection of two reports, saying those who see the most health conspiracy content are less likely to socially distance and wash hands.  If you map that against children and young people's media habits, here they are at this intersection, and they were also sent a false message of a kind, a grain-of-truth message, that it doesn't affect them.

So we have this other piece of information, and then they are in this sort of toxic place where they believe things they shouldn't believe, think that they are immune, don't do the necessary and take the disease home.  Think about the responsibility of that and the problems of that.

Moreover, this is not our last health crisis, is it?  What are we teaching?  A whole generation not to believe.  Whatever lives it is costing now, there is a long tail to this; it is not sitting in a vacuum.  One of the things that, pre-COVID, I was at pains to point out is: who are the real victims of the anti-vax movement?  Children.  Yeah.  Even that is not played out from their perspective.

You know, an informed society may not translate to a healthy society, but I believe it is the best chance that we have, and I think if we sacrifice the plurality of verified information and all of that, then that sort of confusion and doubt that I'm referring to is the result.  I do want to do a shoutout to Julie and colleagues, because I thought it was a very, very strong report on the role of journalists, and I think we have to look at the decimation of that community, the news community, as one part of the picture.

In some sense, the free speech justification to avoid responsibility for responding swiftly and decisively to disinformation has undermined the freedom of the press and the usefulness of the press to all of us.  I just want to perhaps make one very brief observation: I really hear loud and clear from Tina the whole-of-society approach, and clearly that's true with most problems of this scale, it must be true, we all have a part to play, and that's also sort of reflected in the recommendations of the report.

While we must defend any single person's freedom to speak, I don't think we can quite say the same about the right to package up, sell and spread bad information, particularly when it is packaged as entertainment.  I think there is a separation between the spreading and hosting of information and the creation of information; it is something that we could perhaps spend more of our time worrying about.

>> GUY BERGER: The first question: what to do about it?  A lot of people say everybody just needs more digital literacy, whatever that may mean.

So to educate, particularly, young people, those who are dealing with this complicated communication environment: what are the top competencies that you think young people need?

>> BEEBAN KIDRON: You're right, it is something that people grab off the shelf: it is not working, so let's just educate them to be clever about it.  It is an important principle, but we shouldn't create a system that's toxic and then make the solution that prepubescent kids need to make nuanced decisions beyond their capability; that's a no-no.  Having said that, we have to move to a place that's actually about a data literacy: a systems literacy, a literacy about automated systems, algorithmic and human bias, and paid-for content; visual literacy; checking authorship, checking sources and, very importantly, checking who paid for information.  I would like to see that as a marker on the spread of information: who paid for it?  There are things that can be learned, and in my role I spend a lot of time with young children doing data literacy workshops, and they become very, very sophisticated about understanding all sorts of problems.  I absolutely insist that's not the solution, but it is something we need in the 21st Century alongside maths, language and numeracy skills.

>> GUY BERGER: I have to ask you one last question.

When one is growing up, you are still forming your identity, your emotions are still developing, you don't necessarily know who you are yet, and you are vulnerable to all kinds of pressures and ideas: what is fashionable, what's beautiful.  Disinformation content reinforces racism, it reinforces sexism.  So help us unpack, in the potentially harmful content that young people are exposed to, what is the disinformation component as opposed to this emotional identity component?  Does it have an effect?  For example, you gave the COVID case.  How does that case get the surrounding perspective, like the epidemiology perspective Tina was describing?

>> BEEBAN KIDRON: I'm not sure I can do as good a job as Tina did.

Let me quote the report back.

It makes a really very important point towards the end, where it talks about how you can't absolutely separate health disinformation from political disinformation, from climate change disinformation and so on.  What it starts to do is say, hang on, there are people and planet as well as profit; it doesn't put it like that, that's my language, but it does make that point in a cool way.  You have to look at the industrial levels of self-harm content, pro-ana stuff, incitements to suicide, the violence in the hands of 8-year-olds: why are these not a health crisis?  I have been saying that's a health crisis for a number of years, and in the context of childhood I think some of it is disinformation.  I am a very, very keen advocate for free speech, and I'm not saying we should block and ban stuff like that, but don't put it in the hands of kids who haven't even invited it.  So I make my same point again: the pandemic has been a miserable time, but we must learn lessons, and they're profound.  If we follow the pattern of health misinformation, some of the solutions that we find for that are solutions for these broader issues around what disinformation is in childhood.

>> GUY BERGER: Let's move on to our next speaker, who I'm sure will be equally interesting: Piotr Dmochowski-Lipski, Executive Director of an intergovernmental organization and a member of the Working Group that oversaw this piece of work.

A point that we have heard is a warning that you shouldn't offload the problems onto the victims, as it were, and it was indicated that we need some effective measures that do not violate rights in that respect.  You deal with policy issues, not media, but how do you see the role of policy and regulation in looking at this disinformation question, particularly around COVID and building public trust?

>> PIOTR DMOCHOWSKI-LIPSKI: You just said the word: it is trust.  Actually, trust is, as I understand it, the main theme of this particular module of the IGF.

We're dealing with issues pertaining to public policy, and public policy, just my opinion, cannot focus on just the one crisis which we have right now.  Right?

We have to understand that there will be life after this.  People have said that; Beeban said before that it has shaped our culture, legal systems and so on.

Societies, the plethora of different cultures and different societies, create networks of responses to this particular crisis, and we have to remember that whatever we put in the law, whatever is in the regulations, stays in the law, for good or for bad.  The typology that the researchers put together in this report, which I endorse, is mostly focused on policy responses aimed at the production and distribution of disinformation; that is true, but it is broader than that.

States, the entities that regulate our lives, governments, the regulatory environment, law enforcement and all that the ancient Romans called imperium, are right now, in my opinion, at a crossroads.  We run a risk of creating a system that may not help with this particular disinformation and pandemic in the time before, hopefully, the vaccine is done, while creating a lot of harm for the future.

This is recognized in the report: broadly speaking, everything pertaining to freedom of speech and freedom of information.  All the responses aimed at producers and distributors can basically be divided into two major methods, two methodologies.

One is negative: basically prohibiting somebody from spreading disinformation, affecting the platforms in a certain penal way, using penal codes and so on.  Then there are positive ways, for example using state funds to promote fact checking, to promote general awareness and so on.  Just to finish this first intervention: the principal question for somebody who has been involved in setting governmental policies is to recognize the possible effects for the future as much as we're focused on the actual fight with the crisis at hand.  In the report, we say that it is better to organize and promote certain positive stimuli and positive ways of behavior rather than focusing only on regulating and enforcing new laws, which may be too hasty and enacted too soon.

I will stop there unless there are questions.

>> GUY BERGER: Normally, we think of regulation as limiting damage, but it can also be enabling.

I know that another issue of interest to you is this question of culture in dealing with disinformation.  We have spoken a bit already about how information is part of a bigger human condition, which includes norms, communities, emotion, fear, aspirations, identities and culture.  It is not something that this report was really made for, but tell us why you think we need to understand culture in framing responses to disinformation, especially around COVID.

>> PIOTR DMOCHOWSKI-LIPSKI: Here I'm adding something to the report, and I actually hope and recommend to everybody listening to actually read the report, which is very good and could be useful as well.

Let me start with this, before I jump to the culture thing.  This is actually in the report.

There are certain constraints on the public policy response to this crisis, taking into account possible ramifications in the future.

The first constraint is very technical: the pace of technology is faster than whatever states and lawmakers and regulators can do.  Second, there is the international dimension, and the international dimension brings me to the cultural aspect.

One society can have an approach to regulation that is completely different from another's, and the definitions of disinformation differ.  The approach, for example whether to penalize or criminalize disinformation or not, is different.

Leaving COVID aside, take disinformation about a head of state: in one country it is a right, a constitutional right; in another it is criminalized but only punished with a slap on the wrist or a $100 fine; and in another it carries a death penalty.

How, in the global sense, can we create the right legislative or regulatory approach to spreading false news, even if, following on the previous speakers, we can define what is false and what is not, which is a task in itself?

The answer to the question, as I ask myself, is that we cannot.  This is exactly what I would call the cultural dimension or the cultural environment, and it depends on our set of beliefs.

I grew up in a country that for many, many years was governed by an oppressive communist system, and free speech could be, and on many occasions was, penalized, not as political free speech but as disinformation.  Culture is very important because law itself, and we're talking mostly about law, public policy law, both in terms of creating it and then enforcing it, all stems from culture and vice versa.

Just to finish and to conclude, I find myself in a strange situation because I very much believe in public policy.  I think that those of us that have something to do with public policy should do everything possible to solve this crisis and not to create too much harm.  But eventually it will be the cultural changes that we have actually witnessed in Europe and other countries, most recently in, you know, the United Arab Emirates, for example, other places, my native country; those culture changes will eventually define public policy with regard to disinformation.

In the meantime, we have highly skilled technical people who are working day in and day out to make this job even more difficult for public policy creators.

>> GUY BERGER: I hope you can hear me.  My mic is not working well.

I think we'll certainly hear from Stephen about these issues from the perspective of a technology and communications company dealing with different regimes, trying to have some respect for international standards, and dealing with culture.

Let's move on, before we come to Stephen, to Claire.  So Claire is many things; among other things, she's an expert advisor on the study.  Thank you for all of the time you have put into giving advice to the researchers.

She is, of course, very well known as one of the founders of First Draft News, where she leads strategy and research.  First Draft is, you know, a leading research agency looking at this whole phenomenon and how it is changing, and it has particularly worked to serve the media in terms of how media can really play a better role in covering disinformation.

Tell us a bit about how you do this, how media can cover the story without giving more oxygen to it.

>> CLAIRE WARDLE: Thank you.  Thank you for inviting me on to this panel, and to give more props to Julie and the team: I really couldn't believe it when it dropped and it was so long.  Trying to get a handle on this topic is hard because it shifts all the time.  It is an excellent piece of work and will be one of those documents that remains really relevant, because it deals with concepts as opposed to the tactics and techniques which, as was just explained, keep changing.

First Draft was founded in 2015 as a website to teach journalists how to verify images and videos online.  Then, a year later, in 2016, many people became interested in how to verify what's true online, but our focus has always been media institutions as a key element in the information ecosystem.  I will say that over the last year, we have recognized that the media is just one of many, many critical gatekeepers, and as we're seeing, there are many people, unfortunately in many, many countries, that no longer see the news media as gatekeepers, so we're now increasingly working with other organizations.  The challenge for the media is that they need to be trusted by the audience, and different governments and politicians now have a reason to undermine that trust.  The challenge for the news media is that they have always had a paradigm that sunlight is a disinfectant: that by shining a light on something, you're going to uncover truths.  That's at the core of journalism.

The problem is, that has been weaponized and used against the news media: look at this shiny thing over here, come report on that.  Journalists have unfortunately been used increasingly over the last few years, and as Julie said, it is a tactic, a technique, to deliberately have the media cover rumors and conspiracies; if the media didn't cover them, they may not reach that audience.  Disinformation actors are relatively small in number and they don't really have an audience; they're only effective if they create a process by which the disinformation turns into misinformation.

When people believe it and share it, not realizing that it is false, not realizing it is harmful, the media is a critical element of the ecosystem.  As we have seen over the last four years in the country that I currently live in, the U.S., the media have been caught in a very difficult position because the main driver of disinformation in the U.S. has been the President and the administration.  It is very, very difficult for news media to say we're not going to give oxygen to these rumors, because it is a story in itself.  It has been an incredibly tough time, and I think we're seeing many news media now grappling with what's happened over the last four years and how they move forward.  As we have seen, there has been a determined effort, not just by Trump but by leaders around the world, to undermine an independent media as a deliberate tactic.  Research shows that when you ask people in the street what's news, CNN, BBC, New York Times, they see these previously trusted organizations as being part of a system that's against them.  Where we go from here is critical.  As was said, every country is different, every news ecosystem is different, and I'm not making grand claims globally, but there are many, many worrying patterns in many countries around the world that suggest you have half the population turning to trusted news sites for information and half of the population not.  That's absolutely playing out in the U.S. right now, it is like two parallel realities, and I don't know where we come back from that.  It is a real worry.  The New York Times this week published a story after interviewing 50 secretaries of state to ask whether there were any examples of election fraud in their states, and of course everybody said no, and the New York Times put it on the front page.  I don't think there was any New York Times reader that was like, oh, thank you for letting me know, I wasn't sure.

So how do we reach the 50% of the populace that are not consuming the New York Times and believe that the election was rigged?  Thinking through this, we can't keep having discussions about misinformation and disinformation as a problem of the platforms.  It is, and there's a lot of work to be done there, but if we don't think about how the media ecosystem fits within all of that, we're missing out.  And to Tina, who is, I just have to point out, it is exciting to have Tina on the call, the work being done with the WHO, training up over 280 people in every country in the world, is just wonderful.  There's a lot of things to talk about, but Tina is a doer, how great is that!  What she was saying on how do we root this in communities, how do we understand all of these different elements, platforms, news media, government, we have to understand it all, and what do we do together with this problem.

>> GUY BERGER: The internet companies, they're not completely neutral, because more and more they have to take some moderation and curational position.  Do we see a kind of more possibility for better relations, and I don't mean sweetheart relations, but some other resources to make sure that verified information does well?  They have the deliberate falsehoods, the annotations, all of that; how do we deal with that disinformation?

>> CLAIRE WARDLE: Talking about the platforms, we have seen more action from the platforms in the last three months than in the last 10 years, and on one hand I'm glad we are seeing that; it is a step forward in terms of the recognition that there can't be a hands-off approach.  But for all of the researchers on the call, it is insane: there is no independent oversight of any of this.  Props to Twitter, they put out a blog post last night that listed the number of takedowns and how the labels affected retweeting; it is a huge step forward and I'm excited about it.

I just would really want to see some independent analysis alongside the platforms marking their own homework.  We're getting there; it is good.  But we can't keep saying these takedowns are ultimately a good thing until we understand the unintended consequences.  What does it mean that more people are moving to Parler this week?  We need to understand what these takedowns mean for the wider communication ecosystem.

I do think the relationship between the media and the platforms continues to be tense, and my slight frustration is that the platforms have unfortunately benefited from the news media doing a lot of this research and these investigations, ultimately becoming content moderators for the platforms, while the news media are deeply frustrated with the platforms because of business model issues.  I feel that we're sort of stuck in this tech clash paradigm that's not necessarily helpful for asking what the public needs; instead it is just this angry relationship between the PR people at the platforms and the disinformation journalists.  Ultimately, to Guy's point, we have to recognize there is a mixture of decreasing bad quality information and increasing quality information.  It is a question of who is a quality information provider, and do all users believe that they're a quality information provider?  Back in March, there was a sigh of relief from platforms: it is okay, we can hang on to the WHO!  They can make the decisions.  Then we realized that this is complex, that science takes time to develop a consensus on a new virus, and unfortunately there were a couple of missteps by the health authorities, so there was a sense of, who do we trust, and what does that mean?  This is complex.  There aren't easy solutions.  It is easy to say increase this, decrease that, but as Piotr Dmochowski‑Lipski said, the definitions of both of these really vary in every different country with different media ecosystems, so as ever we'll continue to talk about these issues for years and years to come.  It is okay.  There is no easy solution to this.  It is the challenge of our times.  They are very important questions.

>> GUY BERGER: Thank you.

That brings us to Stephen.  Thank you for being here, you are engaging in those issues.

Now, you're a director of public policy in the European Union for Twitter, and I guess you could also speak from the global perspective, but tell us how you see these connections, the misinformation in these spaces around COVID, what you're doing about it, and in general this phenomenon that impacts other areas as well?

>> STEPHEN TURNER: Thank you.  Thank you everyone for all of the great contributions so far.  (Poor audio quality).  (Poor audio quality).

>> GUY BERGER: There's a bit of an echo, I don't know if you step back ‑‑

>> (Poor audio quality).  (Poor audio quality).

>> GUY BERGER: Try and switch the mic on and off.  Let's try that.

>> (Poor audio quality).

>> GUY BERGER: Next we have Cristina Tardaguila, associate director of the International Fact-Checking Network, and so, having covered a lot of ground with issues such as culture, norms, emotion, identity, we come back to this question of fact checking.

Could you tell us, Cristina, you are in the International Fact-Checking Network, give us a sense of the network, and then tell us how you have been working and what impact you see from fact checking.  Has fact checking had the impact that was talked about, that you could create more friction in the system in terms of the flow of disinformation?

Over to you.

>> CRISTINA TARDAGUILA: Hey, everyone.  Thank you for having me.  I'm talking to you from Tampa, in Florida; it is a bit early for me still.

First I wanted to share the great news that I just got: the Coronavirus Fact Alliance, the group that we put together back in January to fight COVID‑19 misinformation, has been selected by the Paris Peace Forum to be supported for one year, which means that the work we have been doing together as fact checkers to fight all of the hoaxes, and we have detected more than 9,000 hoaxes so far, is being highly recognized.  I was a bit late to the call because I was in the other event just learning that, and I really wanted to share this with you and thank you for the support that you have been putting out.  We really need support here; it has been a tough, tough year, as you may know.

To your question: well, fact checkers definitely believe that they do have a high impact, and I can tell you why.  Based on the coronavirus fact database we put together, with 9,300 hoaxes that have been debunked, we managed to identify different waves of misinformation, waves that went global, and I can detail that if you feel that's important on this call.  What we're really proud of is the fact that we managed to kill or minimize some of those waves.  I'm talking about, you know, we're no longer seeing people blaming bat soup, and that was global; we're not receiving those videos and images of people fainting in supermarkets and subways, that's gone; China is no longer said to be burning people that are infected.  These hoaxes were big in stretches of this year, and I truly believe that if it wasn't for fact checkers and their attitude towards working together, and fast, those hoaxes would still be around and the misinformation environment and the disorder would be much worse.  Of course, we still need a lot of research, and the database is now being shared with, I said 10 but I think it is 12, researchers that sent us their papers to use the database, and they're analyzing more information regarding the impact.  We're looking forward to having more data regarding the impact, and if that would be helpful, I would be happy to share it with you all as soon as possible.

>> GUY BERGER: Thank you.

Congratulations.

I think the work that you do to coordinate these international fact checking initiatives is super important, and the practice needs that support.  There is a chapter in the study on fact checking: what it involves, how independent it is, how it could be weaponized in some cases, and also what the companies can do to support it.  Let me ask you then, what do you think is the biggest challenge faced by fact checking at the moment?  Is it the nature of the activity?  Is it that you can fact check and then people don't care about facts?  Is it that there isn't enough money?  Tell us what you say are your biggest challenges.

>> CRISTINA TARDAGUILA: Right now I believe there are a few huge challenges, to be honest!

I believe the biggest challenge is the amount of misinformation.  If you take a look at the fact checking community, it is not big, to be honest.  I'm Brazilian, as some of you may know, so I can use some numbers from Brazil.  Brazil has 200 million people and about 50 fact checkers.  That doesn't quite go well, right?  There are lots and lots of pieces of misinformation floating around that will never, ever be fact checked, and they can be very, very harmful.  The amount of misinformation is a big issue on the planet.

The second problem I would say is the relationship, or the lack of a relationship, with some important platforms.  We do have a good relationship and a program with Facebook and Instagram, but we still don't have anything in place with Twitter or with others, and there are so many; how many false rumors are we seeing around the planet right now?  You know, I believe this would be an important move for 2021: to make sure that the platforms understand that they should work with fact checkers, because we do have a code of ethics and that keeps us on track.

When we fight against misinformation, you have to be transparent, you have to have a methodology, and, you know, you have to have a correction policy.  I kind of fear that platforms are moving towards solving the issue of misinformation on their own, as we saw with Twitter labeling some tweets in the U.S. election with kind of no transparency on how they choose and how they do it.

Connected to that idea, I would say that we could, and we need to as a fact checking community, explain to the platforms that whatever policy they put in place in the United States should be taken worldwide.  What we're seeing is some content policy enforcement that works in the U.S. but doesn't work anywhere else.

You know, we need to understand that misinformation doesn't respect borders, so this would be a very important thing.  I'm glad to say that I think, for the first time, fact checking is not out of money.  It was a sad but good year for fact checkers.

Of course, we need to, as I said in the first point, to have more people fact checking, our teams need to grow.

>> GUY BERGER: Now we have Stephen back.  Stephen, let's see if the audio works better now.

>> (Poor audio quality).

>> GUY BERGER: Thank you.  I hope my mic is working okay.

There is a vibrant chat going on, and there is the Q&A; please, if I can ask people, join both of those as we continue.

We have, I think, about 15 minutes left.  I'm going to ask each of the speakers to sum up: what do you think is the number one issue related to internet governance that you take away from this discussion and the issue of disinformation around COVID?

What's the number one issue for internet governance?  Let me say, remember that internet governance is a question of norms, principles, programs, practices, involving a lot of different actors and the particular roles that they play.

>> JULIE POSETTI: I would highlight, per a report published earlier this week by the democracy forum, which I consulted on, that the onus needs to be to a large extent on the platforms.  It is a result of the newly structured information ecosystem in which we live that we have this disinformation in the first instance.  I would also caution against news organizations and platforms working together uncritically, and I highlight the need for extreme transparency and governance, independent governance, of the funding of various programs and interventions.  I don't think we can take our eyes off of that.  That's something that a friend and colleague, Maria, would insist on me highlighting as the individual case study in which all of these issues converge in a very cataclysmic way.

That does not preclude collaboration that's done with all of those caveats in place.

I want to give kudos to Stephen Turner for turning up despite the audio issues; we need good faith, transparency, and a seat at the table from the platforms.  Personally, and from a research perspective, I have been impressed with what Twitter attempted to do, particularly in the context of the U.S. election disinformation that we're seeing.  It was also observed, to an extent, within the Broadband Commission study that Facebook is not at this table.  I think it is important, in the interest of transparency and with an eye to governance, to highlight that fact, and there needs to be, you know, trust between the various actors in these processes.

The other important thing I would like to briefly highlight, and it has been said several times: I think that we must try to secure independent critical journalism as much as we possibly can in the context of internet governance.  How do we enable it to be better surfaced?  How do we insist upon funding mechanisms that support journalism as a potential bulwark?  Yes, there are many actors and information providers, but what we're seeing is an erosion of critical journalism and its role as a pillar of democracy, amid the tsunami of misinformation and disinformation.  I want to end with that point.  There are rich, interesting interventions here that we can take away; I did want to underline those two things.  Thank you.

>> GUY BERGER: Thank you so much.

Tina, two minutes, or a minute and a half?

>> Tina Purnat: Thank you.

Yeah.  I would like to bring in the health perspective to this.  Health is a human right.  People have the right to seek and access information about their health.  You know, the way we think about this in public health, people have the right to be scared; they're concerned in the middle of a pandemic, which can also lead to sharing of low-quality information.  It is really the job of health authorities to offer health services to mitigate the health impact of the pandemic, and to provide accurate information and address the needs of their populations.

Now, what does that mean for internet governance?  What would a healthy information environment look like?  I would really think about this the way it was mentioned earlier: we have to foster more access to high quality health information and at the same time slow the spread of disinformation.  Really we should be asking, and I don't have easy answers here, what is the equivalent of two meters of social distance online?  We need to try to find the answers to this question, and I really feel research, transparency, and really measuring the effects of everything that we're trying to do are critical.  I underscore also what Julie was saying earlier: we really need transparency and research and trust in order to work together.

>> GUY BERGER: Thank you.  Beeban Kidron.

>> BEEBAN KIDRON: I want to echo something that Tina said, because information is one of children's rights, and I think I was misunderstood by one of the attendees as suggesting that kids should be pushed offline.  I don't think they should be pushed offline; they should be invited in.  I absolutely agree, we need a few more roadblocks to bad information and a little bit more supersonic good information, though obviously we can talk about what good and bad is.  I actually think that at the governance level we're going to have to have independent oversight.  If the companies are lending their network effect to the forms of disinformation that were set out so well at the beginning by Julie, and many of them are avoidable, you have to ask yourself the question: are they fit for purpose if they lend their network effect to something that's creating a health crisis, a crisis in democracy, a crisis in childhood, and so on?  I think you get asked big questions.  Absolutely, the platforms have a role to play.  I do shout out to Twitter for some of the moves they have made recently.  You know, it is interesting to watch that happen.

I think that there is one thing that I will argue with, and Piotr is used to that: everywhere is different, every country is different, but have you noticed that the problems are very similar?  We're all using the same platforms, right?  I think we just have to concentrate on that.  We have to say that there's a new kid in town and we need that kid to grow up a little.  If we don't get that in our sights, it is too big for us.

>> PIOTR DMOCHOWSKI-LIPSKI: You are absolutely right.  The thing is, we obviously have the same problems because we're human, and we're also very different because we're human.  I will stop at that; I look forward to discussing it over a glass of wine, as this is a major philosophical issue, something that Karl Marx would see differently from people that follow others, Smith, whoever.  So to reply to the question about the effect of the current situation on internet governance: there is a person in the audience, sorry for not pronouncing the name, who asked whether this is a catalyst for disinformation.  To stay within this chemistry metaphor, I would say it is not a catalyst, but just a fuel.  Disinformation has been with us for ages and will be with us.  It is just a fuel.

However, I do hope, and this is replying to the question about internet governance, that it is a catalyst for debate about the internet, about technology, and their role in society.

The internet has been a relatively new phenomenon for all of us.  Leaving aside armed conflicts and so on, this is probably the first major crisis that the internet is such a big part of; it is not so much creating the crisis, but sometimes accelerating it, sometimes helping to fight it.  I hope we'll continue the debate, and that from the public policy standpoint, from the organization of our society standpoint, when the whole thing is over and everybody is healthy, and even before that, we will try to come up with not only recommendations, as with the study, but also with actual complex policies.  Not to solve everything, because this is impossible: to understand the phenomenon that the internet is, is hard, and its governance even more so.

>> GUY BERGER: Claire, your turn, internet governance and what we have been discussing?

>> CLAIRE WARDLE: Quickly.  There is a drumbeat of people saying we need regulation, and my fear is that we still have almost no knowledge of the scale of misinformation and the impact of misinformation.

So without that, we shouldn't be thinking about regulation until we have more understanding.  I would say for the last three years academics have been on the same bandwagon: we need the data, we need the data from the platforms.  But what type of data?  It is complex.  There are consent questions, and I'm not completely in the position of saying the platforms must hand us data infrastructures; as researchers, we have to be innovative in the research questions we're asking, think much more qualitatively, and stop believing that unless we have huge datasets there is nothing we can do.  We have just been doing some research with the Partnership on AI, asking people in the field to make a note in a diary essentially every time they come across a piece of problematic content and the flag that they see on Facebook.  It will not surprise any of you to hear that there are not uniform responses to flags on Facebook: people on the right are angry, and it drives down trust generally; people on the left love it.  There is no easy way here.  What I take from that research is that the process of asking people to reflect on this was, in itself, a tool of information literacy; people were forced to think, is this the kind of content I would like to see or not see, and how do I feel about it being flagged?  There are a million research questions we should be asking.  We should be auditing what it is that people see.  Cristina is right, there is a lot of information out there; how is it different, not different?  We know almost nothing about what people are seeing, which is why we're jumping up and down about the blog post yesterday, which is just the bare minimum!
Internet governance, yes, there is a lot we need to do, but we have to build an empirical foundation, and we can't be just stuck in this "we need the data."  We have to think of other, more qualitative approaches, working with communities to ask themselves about what they're seeing, because that process actually does a really important job of getting the public to be part of this conversation about the types of information they want to see and how they want to see it handled.  It is just a call for more qualitative research.  It won't surprise people on this call.

>> GUY BERGER: I think we're also calling for the public to become stakeholders in internet governance as well.

Cristina.

>> CRISTINA TARDAGUILA: Well, first, I completely agree with Claire; she's awesome, and keeps being awesome day after day.

I would like to raise four topics very quickly if I have the time.

First, the IFCN, until this moment, is completely against any type of regulation to fight misinformation, and I say that based on a database that we have been building since 2018: we have been tracking the efforts that are being made to fight misinformation in 60 countries, and what we see so far is kind of sad.  First, no country has managed to fight misinformation in a way that can serve as a benchmark for the planet; we don't have that.  No one is successful.

Number two, those countries that have moved towards legislation, mainly in Asia, have created other problems without solving misinformation: they have created censorship, internet shutdowns, and government-run fact checking organizations, and, you know, they're arresting people.  That's dangerous and we don't see that as a solution.

Going back to others that talked about Human Rights, education is one of them.  I truly believe that's the path.  I can't believe my daughter who is 11 years old goes to middle school and does not have a fact checking or a media literacy class.

I mean, that is in the United States; just picture how it works in Brazil: zero, zero, zero.  That is something that I would like to push governments to think about.  Everybody hates fake news, everybody hates disinformation, but what are they actually doing to prevent the next generation from having to do what we're doing here, trying to come up with solutions?  They do have a solution: they have to teach fact checking and media literacy as they teach math.  It is just the best way.

The other idea is that I have been thinking a bit about two topics and the role that media is playing in all of this, where clickbait beats good stories.  I believe we could start talking about the disinformation editor.  That role isn't around; I don't know any organization, any media outlet, that has this disinformation editor who says, hey, this headline doesn't look good, this photo needs a label that says it is false, or whatever.  I mean, we really need the media to get this person, someone to review whatever they publish.  The last thing is, ever since I saw The Social Dilemma, that documentary, which for me is very crazy, I'm wondering: if those guys got everyone on the planet addicted to social media, could they get together with us to make people addicted to facts?  They know how to make people addicted.  What if we got them in a room and said, let's brainstorm on how to use the platforms to get people addicted to facts?

I have been running around these two ideas.  Thank you.  Sorry for talking so much.